
Non-Acted Text and Keystrokes Database and Learning Methods to Recognize Emotions

Published: 16 February 2022

Abstract

Modern computing applications increasingly exploit the ready availability of large and diverse data to make their pattern recognition methods smarter. Identifying the dominant emotion solely from human-generated text is essential for modern human–computer interaction. This work presents a multimodal text-keystrokes dataset and associated learning methods for identifying human emotions hidden in short text. Text and keystroke data were collected from 69 participants across multiple scenarios: emotions were induced through video stimuli in a controlled environment, after which participants wrote unguided reviews of the given scenario. Keystroke and in-text features were then extracted from the dataset and used with an assortment of learning methods to identify the emotion hidden in the short text. Fusing text and keystroke features achieves an accuracy of 86.95%, while 100% accuracy is obtained on the pleasure-displeasure classes of emotion by combining the fused keystroke/text features with a tree-based feature selection method and a support vector machine classifier. The present work is also compared with four state-of-the-art techniques on the same task, and the results suggest that the proposed approach achieves higher accuracy.
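The abstract names the ingredients of the best-performing configuration, fusion of keystroke and text features, a tree-based feature selection method, and a support vector machine, but not the concrete features, selector, or hyperparameters. Below is a minimal scikit-learn sketch of such a pipeline on synthetic stand-in data; the feature dimensions, the ExtraTrees-based selector, and the RBF kernel are illustrative assumptions, not the paper's published settings.

```python
# Minimal sketch only: synthetic stand-in data; feature counts, the
# ExtraTrees selector, and the RBF-kernel SVM are assumptions, since the
# abstract does not specify the exact features or hyperparameters.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples = 69  # the dataset covers 69 participants; one sample each is assumed

# Stand-ins for the two modalities: keystroke dynamics (e.g., key hold and
# inter-key times) and in-text features (e.g., word counts, punctuation use).
keystroke_features = rng.normal(size=(n_samples, 20))
text_features = rng.normal(size=(n_samples, 30))
labels = rng.integers(0, 2, size=n_samples)  # e.g., pleasure vs. displeasure

# Feature-level fusion: concatenate the modalities per sample.
fused = np.hstack([keystroke_features, text_features])

# Tree-based feature selection followed by an SVM classifier.
model = make_pipeline(
    StandardScaler(),
    SelectFromModel(ExtraTreesClassifier(n_estimators=100, random_state=0)),
    SVC(kernel="rbf"),
)
print("cross-validated accuracy:",
      cross_val_score(model, fused, labels, cv=5).mean())
```

In this arrangement the tree ensemble is fitted first and features whose importance falls below the default mean-importance threshold are discarded before the SVM is trained, mirroring the selection-then-classification order described in the abstract.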


• Published in

  ACM Transactions on Multimedia Computing, Communications, and Applications, Volume 18, Issue 2 (May 2022), 494 pages
  ISSN: 1551-6857
  EISSN: 1551-6865
  DOI: 10.1145/3505207


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 16 February 2022
      • Accepted: 1 August 2021
      • Revised: 1 July 2021
      • Received: 1 January 2021
• Published in TOMM Volume 18, Issue 2

      Qualifiers

      • research-article
      • Refereed
