Abstract
With the recent surge of smart wearable devices, physiological and behavioral data of human beings can be obtained in a convenient, non-invasive manner. Based on such data, researchers have developed a variety of systems and applications to recognize and understand human behaviors, including both physical activities (e.g., gestures) and mental states (e.g., emotions). In particular, it has been shown that different emotions cause different changes in physiological parameters. However, other factors, such as physical activities, may also affect one's physiological parameters. To recognize emotions accurately, we therefore need to explore not only physiological data but also behavioral data. To this end, we propose an adaptive emotion recognition system built on a sensor-enriched wearable smart watch. First, an activity identification method is developed to distinguish different activity scenes (e.g., sitting, walking, and running) using the accelerometer. Then, based on the identified activity scene, an adaptive emotion recognition method is proposed that leverages multi-mode sensory data, including blood volume pulse, electrodermal activity, and skin temperature; specifically, we extract fine-grained features to characterize different emotions. Finally, the adaptive emotion recognition model is constructed and verified by experiments. An accuracy of 74.3% across 30 participants demonstrates that the proposed system can recognize human emotions effectively.
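The two-stage pipeline described above can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the thresholds, feature names (`eda`, `bvp_hr`), and decision rules are all hypothetical stand-ins for the trained, scene-specific models the paper describes.

```python
# Hypothetical sketch of a two-stage adaptive emotion recognizer.
# Stage 1 labels the activity scene from accelerometer data; stage 2
# dispatches to a scene-specific rule over physiological features
# (blood volume pulse, electrodermal activity, skin temperature).
import statistics

def identify_activity(accel_magnitudes):
    """Crude scene classifier: variance of acceleration magnitude.

    A near-constant magnitude suggests sitting; larger variance
    suggests walking or running. Thresholds here are made up.
    """
    var = statistics.pvariance(accel_magnitudes)
    if var < 0.05:
        return "sitting"
    elif var < 1.0:
        return "walking"
    return "running"

def recognize_emotion(scene, features):
    """Scene-adaptive recognition: each activity scene shifts the
    physiological baseline, so each scene gets its own boundary.
    Stands in for per-scene trained classifiers."""
    eda_baseline = {"sitting": 0.3, "walking": 0.6, "running": 1.2}[scene]
    if features["eda"] > eda_baseline and features["bvp_hr"] > 90:
        return "aroused"  # high-arousal quadrant (e.g., excitement, stress)
    return "calm"

# Usage: a near-constant accelerometer trace is labeled "sitting",
# and the sitting-scene rule is then applied to the physiological features.
scene = identify_activity([1.0, 1.01, 0.99, 1.0])
emotion = recognize_emotion(scene, {"eda": 0.5, "bvp_hr": 95})
```

The key design point mirrored here is the dispatch in `recognize_emotion`: because physical activity itself alters physiological signals, the decision boundary is conditioned on the identified scene rather than being fixed globally.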
EmotionSense: An Adaptive Emotion Recognition System Based on Wearable Smart Devices