Typealike: Near-Keyboard Hand Postures for Expanded Laptop Interaction

Published: 5 November 2021

Abstract

We propose a style of hand postures to trigger commands on a laptop. The key idea is to perform hand postures while keeping the hands on, beside, or below the keyboard, aligning with natural laptop usage. We explore 36 hand-posture variations spanning three resting locations, left or right hand, open or closed hand, and three wrist rotation angles. A 30-participant formative study measures posture preferences and generates a dataset of nearly 350K images under different lighting conditions and backgrounds. A deep learning recognizer achieves over 97% accuracy when classifying all 36 postures plus 2 additional non-posture classes for typing and non-typing. A second experiment with 20 participants validates the recognizer under real-time usage and compares posture invocation time with keyboard shortcuts. Results show low error rates and fast formation times, indicating the postures are close to current typing and pointing postures. Finally, we present practical use case demonstrations and discuss further extensions.
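The 36 posture classes follow directly from the design space described above: three resting locations, two hands, two hand shapes (open or closed), and three wrist rotation angles, i.e. 3 × 2 × 2 × 3 = 36, with two extra non-posture classes for typing and non-typing yielding 38 recognizer classes. A minimal sketch of this enumeration is below; the specific label strings are illustrative assumptions, not the authors' class names.

```python
from itertools import product

# Design-space factors from the abstract. The concrete value names
# (e.g. "beside_keyboard", "rotated_in") are hypothetical labels
# chosen for illustration only.
locations = ["on_keyboard", "beside_keyboard", "below_keyboard"]
hands = ["left", "right"]
shapes = ["open", "closed"]
angles = ["neutral", "rotated_in", "rotated_out"]

# 3 locations x 2 hands x 2 shapes x 3 angles = 36 posture classes.
postures = [
    f"{hand}_{shape}_{angle}_{loc}"
    for loc, hand, shape, angle in product(locations, hands, shapes, angles)
]

# Two additional non-posture classes give the full 38-way label set.
classes = postures + ["typing", "non_typing"]

print(len(postures))  # 36
print(len(classes))   # 38
```

Enumerating the label set this way makes the classifier's output dimensionality (38) explicit before any model is involved.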


Supplemental Material

V5iss486wVF.mp4

Supplemental video



Published in

Proceedings of the ACM on Human-Computer Interaction, Volume 5, Issue ISS
November 2021, 481 pages
EISSN: 2573-0142
DOI: 10.1145/3498314
Copyright © 2021 ACM

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 5 November 2021 in PACMHCI Volume 5, Issue ISS

          Qualifiers

          • research-article
