Abstract
Modern computing applications are adapting to the ready availability of large, diverse data to make their pattern recognition methods smarter. Identifying the dominant emotion solely from human-generated text is essential for modern human–computer interaction. This work presents a multimodal text–keystrokes dataset and associated learning methods for identifying human emotions hidden in short text. For this purpose, text–keystroke data from 69 participants were collected across multiple scenarios. Stimuli were induced through videos in a controlled environment; after stimulus induction, participants wrote unguided reviews of the given scenario. Keystroke and in-text features were then extracted from the dataset and used with an assortment of learning methods to identify the emotion hidden in the short text. An accuracy of 86.95% is achieved by fusing text and keystroke features, and 100% accuracy is obtained for the pleasure–displeasure classes of emotion using the fusion of keystroke/text features, a tree-based feature selection method, and a support vector machine classifier. The present work is also compared with four state-of-the-art techniques on the same task, and the results suggest that the proposed approach performs better in terms of accuracy.
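The pipeline summarized in the abstract (feature-level fusion of keystroke and in-text features, tree-based feature selection, SVM classification) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature names, random data, and selector/classifier settings are assumptions chosen to mirror the described approach.

```python
# Illustrative sketch: fuse keystroke and in-text features, prune with a
# tree-based selector, classify with an SVM. Data here are synthetic
# stand-ins for the 69-participant dataset described in the abstract.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
keystroke = rng.normal(size=(69, 10))  # e.g., dwell/flight-time statistics
in_text = rng.normal(size=(69, 15))    # e.g., lexical/affective word counts
X = np.hstack([keystroke, in_text])    # feature-level fusion
y = rng.integers(0, 2, size=69)        # e.g., pleasure vs. displeasure labels

pipeline = make_pipeline(
    StandardScaler(),
    # Tree-based feature selection: keep features with high importance scores.
    SelectFromModel(ExtraTreesClassifier(n_estimators=100, random_state=0)),
    SVC(kernel="rbf"),
)
scores = cross_val_score(pipeline, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With real labels, the reported figures (86.95% across emotion classes, 100% for pleasure–displeasure) would correspond to the cross-validated accuracy computed here.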
Non-Acted Text and Keystrokes Database and Learning Methods to Recognize Emotions