ABSTRACT
In this poster, we present a live 360° panoramic-video-based empathic Mixed Reality (MR) collaboration system that shares various Near-Gaze non-verbal communication cues, including gaze, hand pointing, gesturing, and heart-rate visualisations, in real time. The preliminary results indicate that visualising the partner's communication cues close to the gaze point allows users to stay focused without dividing attention to the collaborator's physical body movements, while still communicating effectively. Shared gaze visualisations coupled with deictic language are primarily used to affirm joint attention and mutual understanding, while hand pointing and gesturing serve as secondary cues. Our approach offers a new way to enable effective remote collaboration through varied empathic communication visualisations and modalities that cover different task properties and spatial setups.