ABSTRACT
Glanceable interfaces are Augmented Reality (AR) User Interfaces (UIs) for information retrieval "at a glance", relying on eye gaze for implicit input. While they enable rapid information retrieval, they often occlude a large part of the real world, and this occlusion grows as the amount of virtual information increases. Interacting with complex glanceable interfaces also leads to unintentional gaze-based interactions and selections due to the Midas Touch problem. In this work, we present Holo-box, a novel AR UI design that combines compact 2D glanceable interfaces with 3D virtual "Holo-boxes". The 2D glanceable interface provides compact information at a glance, while the Holo-box enables explicit input such as hand tracking, activated only when necessary. This sidesteps the Midas Touch problem and results in Level-of-Detail (LOD) for AR glanceable UIs. We evaluate our proposed system in a real-world machine shop, providing on-demand virtual information while minimizing unintentional occlusion of the real world.
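The abstract describes a two-level interaction scheme: gaze dwell implicitly expands a compact 2D panel into a detailed 3D Holo-box, and selections require explicit hand input, which avoids the Midas Touch problem. A minimal sketch of such a gaze-dwell LOD state machine is shown below; all names and thresholds (e.g. `DWELL_THRESHOLD_S`, the pinch gesture) are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

# Two levels of detail, as described in the abstract: a compact 2D panel for
# implicit gaze input, and a 3D "Holo-box" expanded on demand, in which
# explicit hand input becomes active. Thresholds below are assumed values.
COMPACT_2D = "compact_2d"   # glanceable summary, gaze-only
HOLOBOX_3D = "holobox_3d"   # detailed view, hand tracking enabled

DWELL_THRESHOLD_S = 0.8     # assumed dwell time before expanding the Holo-box


@dataclass
class GlanceableWidget:
    state: str = COMPACT_2D
    gaze_dwell_s: float = 0.0

    def update(self, gazed_at: bool, dt: float, hand_pinch: bool):
        """Advance the LOD state machine; return "select" or None.

        Gaze alone only switches level of detail; selections require an
        explicit hand gesture, sidestepping the Midas Touch problem.
        """
        if not gazed_at:
            # Looking away resets dwell and collapses back to the 2D panel.
            self.gaze_dwell_s = 0.0
            self.state = COMPACT_2D
            return None

        self.gaze_dwell_s += dt
        if self.state == COMPACT_2D and self.gaze_dwell_s >= DWELL_THRESHOLD_S:
            self.state = HOLOBOX_3D  # implicit expansion via gaze dwell

        # Explicit input is only honored inside the expanded Holo-box.
        if self.state == HOLOBOX_3D and hand_pinch:
            return "select"
        return None
```

In this sketch, a pinch while the widget is still in its compact state is deliberately ignored: only the expanded Holo-box accepts explicit input, mirroring the separation of implicit and explicit input described above.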