EmotionSense: An Adaptive Emotion Recognition System Based on Wearable Smart Devices

Published: 30 September 2020

Abstract

With the recent surge of smart wearable devices, it has become possible to obtain physiological and behavioral data from human beings in a convenient and non-invasive manner. Based on such data, researchers have developed a variety of systems and applications to recognize and understand human behaviors, including both physical activities (e.g., gestures) and mental states (e.g., emotions). In particular, it has been shown that different emotions cause different changes in physiological parameters. However, other factors, such as physical activity, may also affect one's physiological parameters. To recognize emotions accurately, we therefore need to explore not only physiological data but also behavioral data. To this end, we propose an adaptive emotion recognition system built on a sensor-enriched wearable smart watch. First, an activity identification method is developed to distinguish different activity scenes (e.g., sitting, walking, and running) using the accelerometer. Based on the identified activity scene, an adaptive emotion recognition method is proposed that leverages multi-mode sensory data (including blood volume pulse, electrodermal activity, and skin temperature), from which fine-grained features are extracted to characterize different emotions. Finally, the adaptive emotion recognition model is constructed and verified by experiments. An accuracy of 74.3% across 30 participants demonstrates that the proposed system can recognize human emotions effectively.
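The two-stage pipeline described above can be sketched in a few lines of code. Note that this is a minimal illustrative sketch, not the authors' implementation: all feature choices, thresholds, and the per-scene decision rules below are made-up placeholders standing in for the trained models the paper describes.

```python
# Hypothetical sketch of the pipeline: (1) classify the activity scene from
# accelerometer data, then (2) route physiological features to a
# scene-specific emotion model. Thresholds and rules are illustrative only.
import math
import statistics


def accel_magnitude(window):
    """Per-sample magnitude of a 3-axis accelerometer window."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in window]


def classify_scene(window):
    """Rough scene detector: more motion variability -> more vigorous
    activity. The cutoff values are assumptions, not from the paper."""
    sd = statistics.pstdev(accel_magnitude(window))
    if sd < 0.05:
        return "sitting"
    if sd < 0.5:
        return "walking"
    return "running"


def recognize_emotion(scene, eda_samples):
    """Dispatch to a scene-specific decision rule. A real system would use
    a classifier trained per scene on BVP, EDA, and skin-temperature
    features; a single EDA threshold per scene is a stand-in."""
    thresholds = {"sitting": 1.0, "walking": 2.0, "running": 4.0}
    eda_mean = statistics.mean(eda_samples)
    return "calm" if eda_mean < thresholds[scene] else "excited"


# Example: a still wrist (gravity only) and low electrodermal activity.
window = [(0.0, 0.0, 1.0)] * 50
scene = classify_scene(window)                 # -> "sitting"
emotion = recognize_emotion(scene, [0.3, 0.4])  # -> "calm"
```

The key design point mirrored here is the "adaptive" part: the activity scene selects which emotion model is consulted, so physiological changes caused by movement are not mistaken for emotional arousal.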



Published in

ACM Transactions on Computing for Healthcare, Volume 1, Issue 4
Special Issue on Wearable Technologies for Smart Health: Part 1
October 2020, 184 pages
ISSN: 2691-1957
EISSN: 2637-8051
DOI: 10.1145/3427421

Copyright © 2020 ACM

Publisher: Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 July 2019
• Revised: 1 December 2019
• Accepted: 1 February 2020
• Online AM: 7 May 2020
• Published: 30 September 2020

Qualifiers

• research-article
• Refereed
