Physiological Signals-based Emotion Recognition via High-order Correlation Learning

Published: 07 December 2019

Abstract

Emotion recognition from physiological signals is an effective way to discern the inner state of human beings and has therefore been widely adopted in many user-centered applications. Most current state-of-the-art methods focus on exploring the relationship between emotions and physiological signals. Given particular features of the natural process of emotional expression, efficiently modeling the high-order correlations among multimodal physiological signals and subjects remains a challenging and pressing issue. To tackle this problem, a novel multi-hypergraph neural network is proposed, in which one hypergraph is constructed from each type of physiological signal to formulate inter-subject correlations. Each vertex in a hypergraph stands for one subject, together with a description of its related stimuli, and the complex correlations among vertices are formulated through hyperedges. With this multi-hypergraph structure over the subjects, emotion recognition is translated into vertex classification. Experimental results on the DEAP and ASCERTAIN datasets demonstrate that the proposed method outperforms current state-of-the-art methods.
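To make the construction concrete, the following is a minimal NumPy sketch of the idea the abstract describes: one hypergraph per signal modality, vertices standing for subjects, and per-modality embeddings fused before classification. It uses the standard spectral hypergraph convolution X' = ReLU(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta); the toy incidence matrices, feature dimensions, and fusion-by-summation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def hypergraph_conv(X, H, Theta, w=None):
    """One spectral hypergraph convolution step:
    X' = ReLU(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta).
    X: (n_vertices, d_in) subject features; H: (n_vertices, n_edges) incidence."""
    n_v, n_e = H.shape
    w = np.ones(n_e) if w is None else w            # hyperedge weights
    Dv = (H * w).sum(axis=1)                        # weighted vertex degrees
    De = H.sum(axis=0)                              # hyperedge degrees
    Dv_is = np.diag(1.0 / np.sqrt(Dv))              # Dv^{-1/2}
    A = Dv_is @ H @ np.diag(w / De) @ H.T @ Dv_is   # vertex-to-vertex propagation
    return np.maximum(A @ X @ Theta, 0.0)           # ReLU

# Toy setup: 4 subjects with shared 3-dim descriptions, 2 modalities
# (say, EEG-like and peripheral-like signals).
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))

# One incidence matrix per modality: H[v, e] = 1 if subject v joins hyperedge e
# (in practice hyperedges would group subjects with similar signal features).
H_eeg  = np.array([[1, 0], [1, 1], [0, 1], [1, 1]], dtype=float)
H_peri = np.array([[1, 1], [0, 1], [1, 0], [1, 1]], dtype=float)

Theta = rng.standard_normal((3, 2))    # learnable projection (fixed here)

# Multi-hypergraph fusion: sum the per-modality subject embeddings, which
# would then feed a vertex classifier over emotion labels.
Z = hypergraph_conv(X, H_eeg, Theta) + hypergraph_conv(X, H_peri, Theta)
print(Z.shape)                          # one embedding row per subject
```

The sum is the simplest fusion choice; a learned weighting of the per-modality terms would be the natural next step.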


Published in

ACM Transactions on Multimedia Computing, Communications, and Applications, Volume 15, Issue 3s
Special Issue on Face Analysis for Applications and Special Issue on Affective Computing for Large-Scale Heterogeneous Multimedia Data
November 2019, 304 pages
ISSN: 1551-6857
EISSN: 1551-6865
DOI: 10.1145/3368027

Copyright © 2019 ACM

Publisher: Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 January 2019
• Revised: 1 April 2019
• Accepted: 1 May 2019
• Published: 7 December 2019

          Qualifiers

          • research-article
          • Research
          • Refereed