
Assisted Music Score Reading Using Fixed-Gaze Head Movement: Empirical Experiment and Design Implications

Published: 13 June 2019

Abstract

Eye-tracking has strong potential as an input modality in human-computer interaction (HCI), particularly in mobile situations, but it lacks convenient methods for triggering actions. In our research, we investigate combining eye-tracking with fixed-gaze head movements, which allows users to trigger commands without using their hands or shifting their gaze. We propose a new algorithm that detects fixed-gaze head movements using only the scene images captured by the scene camera mounted on the front of a head-mounted eye-tracker, which reduces computation time. To evaluate the performance of this detection algorithm and the acceptability of triggering commands with these movements while the user's hands are occupied by another task, we designed and developed an experimental application called EyeMusic. EyeMusic is a music score reading system that plays the notes of a measure the user does not understand: by making a voluntary head movement while keeping the gaze fixed on the same point of the score, the user obtains the desired audio feedback. This paper presents the design, development, and usability testing of the first prototype. The experimental results confirm the usability of the application: 85% of participants were able to perform all the head movements implemented in the prototype. The average success rate of the application is 70%, which is partly limited by the performance of the eye-tracker we used. The detection algorithm itself achieves 85% accuracy, with no significant differences in performance across the individual head movements.
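The detection idea described above lends itself to a compact illustration. The following is a minimal sketch, assuming a head-mounted tracker that reports gaze in scene-camera coordinates; the phase-correlation motion estimate, the gaze-stability test, the direction mapping, and all thresholds are assumptions for illustration, not the authors' published method:

```python
# A minimal sketch of fixed-gaze head movement detection: while the gaze
# stays locked on the same score point, the global motion between
# consecutive scene-camera frames is attributed to head movement.
# All function names and thresholds here are assumptions.

import cv2
import numpy as np

GAZE_DISPERSION_PX = 30    # residual gaze drift still counted as a fixation (assumed)
MOTION_THRESHOLD_PX = 15   # minimum inter-frame shift counted as head movement (assumed)

def scene_shift(prev_gray, curr_gray):
    """Dominant translation between two scene frames via phase
    correlation, used here as a cheap proxy for head movement."""
    (dx, dy), _response = cv2.phaseCorrelate(
        np.float32(prev_gray), np.float32(curr_gray))
    return dx, dy

def classify(dx, dy):
    """Map the image shift to a head-movement label. The scene moves
    opposite to the head, and the exact sign convention depends on the
    camera and on cv2.phaseCorrelate, so the mapping may need flipping."""
    if max(abs(dx), abs(dy)) < MOTION_THRESHOLD_PX:
        return None
    if abs(dx) >= abs(dy):
        return "head_right" if dx < 0 else "head_left"
    return "head_down" if dy < 0 else "head_up"

def detect_command(prev_frame, curr_frame, gaze_prev, gaze_curr):
    """Return a head-movement command only under a stable fixation.

    With a head-mounted tracker, gaze is reported in scene-camera
    coordinates, so during a world-fixed fixation the gaze point follows
    the scene shift; residual drift beyond that means the gaze itself
    moved (a saccade, not a deliberate command)."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    dx, dy = scene_shift(prev_gray, curr_gray)
    residual = np.hypot(gaze_curr[0] - (gaze_prev[0] + dx),
                        gaze_curr[1] - (gaze_prev[1] + dy))
    if residual > GAZE_DISPERSION_PX:
        return None
    return classify(dx, dy)
```

In a prototype like EyeMusic, the returned label would then be mapped to audio feedback, for instance playing the measure under the current gaze point.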

