Abstract
Eye-tracking has strong potential as an input modality for human-computer interaction (HCI), particularly in mobile situations, but it lacks convenient methods for triggering actions. In our research, we investigate the combination of eye-tracking and fixed-gaze head movement, which allows users to trigger various commands without using their hands or changing gaze direction. We propose a new algorithm for fixed-gaze head movement detection that uses only the scene images captured by the scene camera mounted on the front of the head-mounted eye-tracker, in order to save computation time. To test the performance of this detection algorithm, and the acceptance of triggering commands by these movements when the user's hands are occupied by another task, we designed and developed an experimental application called EyeMusic. EyeMusic is a music-reading system that plays the notes of any measure in a music score that the reader does not understand: by making a voluntary head movement while keeping their gaze fixed on the same point of the score, the user obtains the desired audio feedback. This paper presents the design, development, and usability testing of the first prototype of this application. The experimental results confirm its usability: 85% of participants were able to use all the head movements implemented in the prototype. The application's average success rate is 70%, which is partly limited by the performance of the eye-tracker we used. Our fixed-gaze head movement detection algorithm achieves 85% accuracy, with no significant difference in performance among the individual head movements.
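The abstract only sketches the detection approach, so the following is a minimal illustrative reconstruction, not the authors' published algorithm: it estimates global scene motion between consecutive scene-camera frames with sparse optical flow and reports a head movement only while the gaze stays fixed. The Lucas-Kanade tracker, all thresholds, and the assumption that gaze samples are expressed in stable surface coordinates (e.g., after marker-based mapping onto the score) are ours, not the paper's.

```python
# Hedged sketch of fixed-gaze head movement detection from scene-camera
# frames. Assumptions (not from the paper): OpenCV sparse optical flow,
# gaze samples already mapped to stable score/world coordinates, and the
# pixel thresholds below.
import cv2
import numpy as np

GAZE_DISPERSION_PX = 30.0   # gaze must stay within this radius: "fixed gaze"
MOTION_THRESHOLD_PX = 5.0   # median scene shift needed to count as head motion

def detect_head_movement(prev_gray, gray, gaze_points):
    """Return 'left'/'right'/'up'/'down' or None for one frame pair.

    prev_gray, gray: consecutive grayscale scene-camera frames.
    gaze_points: recent (x, y) gaze samples in stable surface coordinates.
    """
    # 1) The gaze must be stable; otherwise the user is looking around,
    #    not performing a deliberate fixed-gaze head movement.
    gaze = np.asarray(gaze_points, dtype=np.float32)
    if np.max(np.linalg.norm(gaze - gaze.mean(axis=0), axis=1)) > GAZE_DISPERSION_PX:
        return None

    # 2) Estimate global scene motion with sparse optical flow: when the
    #    head rotates, the whole scene image shifts in the opposite direction.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=8)
    if pts is None:
        return None
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    flow = (nxt - pts)[status.ravel() == 1]
    if len(flow) == 0:
        return None
    dx, dy = np.median(flow.reshape(-1, 2), axis=0)

    # 3) Classify the dominant shift direction. A rightward head turn makes
    #    the scene content shift left in the image, hence the flipped signs.
    if max(abs(dx), abs(dy)) < MOTION_THRESHOLD_PX:
        return None
    if abs(dx) >= abs(dy):
        return 'right' if dx < 0 else 'left'
    return 'down' if dy < 0 else 'up'
```

In a real loop one would feed consecutive scene frames together with a short sliding window of gaze samples, and debounce the output so that a single sustained head gesture triggers exactly one command.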