ABSTRACT
The ability to detect and employ gaze enhances digital displays. Research on gaze-contingent and gaze-aware display devices dates back two decades, but only now can gaze truly be employed for fast, low-latency gaze-based interaction and for optimizing computer graphics rendering, as in foveated rendering. Moreover, Virtual Reality (VR) is becoming ubiquitous. The widespread availability of consumer-grade VR Head-Mounted Displays (HMDs) has transformed VR into a commodity available for everyday use. VR applications are now widely developed for recreation, work, and communication. However, interacting with VR setups requires new User Interface (UI) paradigms, since traditional 2D UIs are designed to be viewed from a single static vantage point only, e.g. the computer screen. In addition, traditional input devices such as the keyboard and mouse are hard to manipulate while the user wears an HMD. Recently, companies such as HTC have announced headsets with embedded eye tracking, so novel, immersive 3D UI paradigms within a VR setup can now be controlled via eye gaze. Gaze-based interaction is intuitive and natural to users: tasks can be performed directly in the 3D spatial context without having to search for an out-of-view keyboard or mouse. Furthermore, people with physical disabilities, who already depend on technology for recreation and basic communication, can benefit even more from VR. This course presents timely, relevant information on how gaze-contingent displays in general, including the recent eye-tracking capabilities of VR headsets, can leverage eye-tracking data to optimize the user experience and to alleviate usability issues surrounding intuitive interaction. Research topics covered include saliency models, gaze prediction, gaze tracking, gaze direction, foveated rendering, stereo grading, and gaze-based 3D User Interfaces (UIs) on any gaze-aware display technology.
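To make the gaze-based interaction idea concrete, the selection technique most commonly paired with eye tracking is dwell-time selection: a target is "clicked" once the gaze has rested on it long enough. A minimal sketch follows; the DwellSelector class, the 0.8 s threshold, and the per-frame update API are illustrative assumptions, not part of any particular headset SDK:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DwellSelector:
    """Minimal dwell-time gaze selection: fires once the gaze has
    rested on a single target for dwell_threshold_s seconds."""
    dwell_threshold_s: float = 0.8  # assumed dwell time; tune per application
    _current_target: Optional[str] = None
    _dwell_s: float = 0.0

    def update(self, gazed_target: Optional[str], dt_s: float) -> Optional[str]:
        """Call once per frame with the id of the object under the gaze
        ray (or None) and the frame duration in seconds. Returns the
        selected target id on the frame the dwell completes, else None."""
        if gazed_target != self._current_target:
            # Gaze moved to a new target (or away): restart the dwell timer.
            self._current_target = gazed_target
            self._dwell_s = 0.0
            return None
        if gazed_target is None:
            return None
        self._dwell_s += dt_s
        if self._dwell_s >= self.dwell_threshold_s:
            self._dwell_s = 0.0  # require a fresh dwell for a repeat selection
            return gazed_target
        return None
```

In a VR frame loop, gazed_target would typically come from a raycast along the tracked gaze direction; visual feedback (e.g. a shrinking ring) during the dwell helps mitigate unintended selections, the so-called Midas-touch problem.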
Index Terms
Gaze-aware displays and interaction
Recommendations
A Design Space for Gaze Interaction on Head-mounted Displays
CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
Augmented and virtual reality (AR/VR) has entered the mass market and, with it, eye tracking will soon follow as a core technology for next-generation head-mounted displays (HMDs). In contrast to existing gaze interfaces, the 3D nature of AR and VR requires ...
Gaze-based interaction with public displays using off-the-shelf components
UbiComp '10 Adjunct: Proceedings of the 12th ACM international conference adjunct papers on Ubiquitous computing - Adjunct
Eye gaze can be used to interact with high-density information presented on large displays. We have built a system employing off-the-shelf hardware components and open-source gaze tracking software that enables users to interact with an interface ...
Touchless gestural interaction with small displays: a case study
CHItaly '13: Proceedings of the Biannual Conference of the Italian Chapter of SIGCHI
Touchless gestural interaction enables users to interact with digital devices using body movements and gestures, without the burden of physical contact with technology (e.g., data gloves, body markers, or remote controllers). Most gesture-based ...