Abstract
Computers have become an increasingly pervasive part of everyday life, with many activities either involving their direct use or being supported by them. This has prompted research into developing methods and mechanisms to assist humans in interacting with computers (human-computer interaction, or HCI). A number of HCI techniques have been developed over the years, some long established and still in use, others more recent and still evolving. Many of these interaction techniques, however, are not natural to use and typically require the user to learn a new means of interaction. Inconsistencies within these techniques, and the restrictions they impose on user creativity, can also make them difficult to use, especially for novice users.
This article proposes an alternative interaction method, the conductor interaction method (CIM), which aims to provide a more natural and easier-to-learn interaction technique. This novel method extends existing HCI approaches by drawing on techniques found in human-human interaction. It is argued that a two-phased multimodal interaction mechanism, using gaze for selection and gesture for manipulation, incorporated within a metaphor-based environment, can provide a viable alternative for interacting with a computer, especially for novice users. Both the CIM model and a system implementing it are presented in this article. This system formed the basis of a number of user studies performed to assess the effectiveness of the CIM, the findings of which are also discussed.
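The two-phased mechanism described above can be sketched as a small state machine: gaze dwelling on an object selects it, and subsequent gestures apply only to the selected object. The sketch below is illustrative only; the event types, dwell threshold, and gesture names are assumptions, since the abstract does not specify the CIM system's actual APIs.

```python
from dataclasses import dataclass, field

# Hypothetical event types -- the real CIM gaze tracker and gesture
# recogniser interfaces are not described in the abstract.
@dataclass
class GazeSample:
    target: str       # id of the object the gaze currently falls on
    duration_ms: int  # accumulated dwell time on that object

@dataclass
class Gesture:
    kind: str                       # e.g. "move", "resize" (illustrative)
    payload: dict = field(default_factory=dict)

DWELL_THRESHOLD_MS = 500  # assumed dwell time required for selection


class ConductorInteraction:
    """Two-phase loop: gaze selects an object, gesture manipulates it."""

    def __init__(self):
        self.selected = None

    def on_gaze(self, sample: GazeSample):
        # Phase 1: selection by dwell -- looking at an object long
        # enough makes it the current manipulation target.
        if sample.duration_ms >= DWELL_THRESHOLD_MS:
            self.selected = sample.target

    def on_gesture(self, gesture: Gesture):
        # Phase 2: manipulation -- gestures are ignored until gaze
        # has selected a target, then apply to that target only.
        if self.selected is None:
            return None
        return (gesture.kind, self.selected)
```

Separating the two phases this way avoids the "Midas touch" problem of gaze-only interfaces: looking at an object never triggers an action by itself, it only establishes what a deliberate gesture will act upon.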
Index Terms
The conductor interaction method