Abstract
Emotion recognition from physiological signals is an effective way to discern a person's inner state and has therefore been widely adopted in many user-centered applications. Most current state-of-the-art methods focus on modeling the relationship between emotion and physiological signals. Given particular features of the natural process of emotional expression, efficiently combining the high-order correlations among multimodal physiological signals and among subjects remains a challenging and urgent problem. To tackle it, a novel multi-hypergraph neural network is proposed, in which one hypergraph is built from one type of physiological signal to formulate inter-subject correlations. Each vertex in a hypergraph represents one subject together with a description of the related stimuli, and the complex correlations among vertices are formulated through hyperedges. With this multi-hypergraph structure over subjects, emotion recognition is translated into vertex classification on the multi-hypergraph. Experimental results on the DEAP and ASCERTAIN datasets demonstrate that the proposed method outperforms current state-of-the-art methods.
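The abstract does not specify how hyperedges are constructed or how modalities are fused. Purely as an illustration, the sketch below builds one hypergraph per physiological modality using a hypothetical k-nearest-neighbour hyperedge scheme, applies the standard normalised hypergraph convolution X' = Dv^{-1/2} H W De^{-1} Hᵀ Dv^{-1/2} X Θ, and fuses modalities by simple averaging. The function names, the k-NN construction, the averaging step, and the assumption that all modalities share one feature dimension are mine, not the authors' pipeline.

```python
import numpy as np

def knn_hyperedges(features, k=3):
    """Build an incidence matrix H (vertices x hyperedges) for one modality.

    Hypothetical construction: each subject (vertex) spawns one hyperedge
    linking it to its k nearest neighbours in this modality's feature space.
    """
    n = features.shape[0]
    dist = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    H = np.zeros((n, n))  # one hyperedge per vertex
    for e in range(n):
        neighbours = np.argsort(dist[e])[: k + 1]  # includes the centroid itself
        H[neighbours, e] = 1.0
    return H

def hypergraph_conv(H, X, Theta):
    """One hypergraph convolution: Dv^-1/2 H W De^-1 H^T Dv^-1/2 X Theta."""
    W = np.eye(H.shape[1])                # uniform hyperedge weights
    Dv = np.diag(H.sum(axis=1) ** -0.5)   # vertex degrees (>= 1 by construction)
    De = np.diag(1.0 / H.sum(axis=0))     # hyperedge degrees
    return Dv @ H @ W @ De @ H.T @ Dv @ X @ Theta

def multi_hypergraph_forward(modalities, Theta):
    """Fuse per-modality hypergraph convolutions by averaging (one simple choice)."""
    outputs = [hypergraph_conv(knn_hyperedges(X), X, Theta) for X in modalities]
    return np.mean(outputs, axis=0)

# Toy usage: 8 subjects, two 4-d modalities, 2 emotion classes.
rng = np.random.default_rng(0)
eeg = rng.normal(size=(8, 4))   # toy EEG features
gsr = rng.normal(size=(8, 4))   # toy skin-conductance features
logits = multi_hypergraph_forward([eeg, gsr], rng.normal(size=(4, 2)))
print(logits.shape)             # (8, 2): one class-score row per subject
```

In a trained model, Θ would be learned and the vertex scores would be read off as emotion predictions per subject; here it is random, so only the shapes are meaningful.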
Physiological Signals-based Emotion Recognition via High-order Correlation Learning