Abstract
Different senses provide information at different levels of precision, and combining them lets us construct a more accurate representation of the world. Rich multisensory simulations are thus beneficial for comprehension, memory reinforcement, and retention of information. Crossmodal correspondences are the systematic associations often made between different sensory modalities (e.g., high pitch is matched with angular shapes), and they govern multisensory processing. A great deal of research effort in cognitive science has been devoted to exploring crossmodal correspondences; however, the possibilities they open up in the digital world remain relatively unexplored. Multiple sensorial media (mulsemedia) provides users with a highly immersive experience and enhances their Quality of Experience (QoE) in the digital world. We therefore consider that studying the plasticity and the effects of crossmodal correspondences in a mulsemedia setup can yield interesting insights into improving the human-computer dialogue and experience. In our experiments, we exposed users to videos with specific visual dimensions (brightness, color, and shape) and investigated whether pairing them with a crossmodally matching sound (high or low pitch) and correspondingly auto-generated vibrotactile effects (delivered by a haptic vest) leads to an enhanced QoE. To this end, we captured users' eye gaze and heart rate while they experienced mulsemedia, and at the end of the experiment we asked them to answer a set of questions targeting their enjoyment and perception. Results showed differences in eye-gaze patterns and heart rate between the experimental and the control group, indicating changes in participants' engagement when videos were accompanied by matching crossmodal sounds (this effect was strongest for the video displaying angular shapes with high-pitch audio) and transitively generated crossmodal vibrotactile effects.
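As a minimal illustrative sketch (not the authors' actual pipeline), vibrotactile effects could be auto-generated from audio by estimating the dominant pitch of each frame and mapping it to a normalized vibration intensity, so that higher pitch yields stronger vibration, consistent with the pitch-based crossmodal pairing described above. The FFT-based pitch estimate and the linear mapping range are assumptions for illustration only.

```python
import numpy as np

def dominant_pitch(frame, sample_rate):
    """Estimate the dominant frequency (Hz) of one audio frame via the FFT peak."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def pitch_to_vibration(pitch_hz, low=100.0, high=2000.0):
    """Map a pitch to a vibrotactile intensity in [0, 1].

    Pitches at or below `low` map to 0, at or above `high` map to 1;
    the linear range is an illustrative assumption.
    """
    return (np.clip(pitch_hz, low, high) - low) / (high - low)

# Example: a 440 Hz test tone maps to a low-to-mid vibration intensity.
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
pitch = dominant_pitch(tone, sr)          # ~440 Hz
intensity = pitch_to_vibration(pitch)     # ~0.18
```

In a real system, each intensity value would then be streamed to the haptic actuators frame by frame, synchronized with audio playback.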
Using Eye Tracking and Heart-Rate Activity to Examine Crossmodal Correspondences QoE in Mulsemedia