ABSTRACT
Traveling to different places simultaneously is a dream for many people, but our confinement to a single physical location makes this aspiration difficult to realize. Virtual reality technologies can help alleviate such limits. To the best of the authors’ knowledge, no study has attempted to operate multiple telepresence robots in remote places simultaneously while presenting walking-sensation feedback to the operator for an immersive multispace experience. In this study, we used two autonomous mobile robots, a dog-type robot and a wheel-type robot, whose direction of movement can be controlled by an operator (Fig. 1). The operator can alternately choose or re-choose which space (robot) to attend to and can move the viewpoint using a head-mounted display (HMD) controller. A live 4K-resolution video image is transmitted to the HMD over a Web Real-Time Communication (WebRTC) network from a 360° camera mounted on top of each robot. The operator perceives viewpoint-movement feedback as a visual cue, as a vestibular sensation via waist motion, and as proprioception in the legs. Our system also allows viewpoint sharing, in which fifty users can enjoy omnidirectional viewing of the remote environments through an HMD, although without the walk-like sensation feedback.
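The abstract does not specify the client implementation, but the stream-switching architecture it describes can be illustrated with a minimal browser-side sketch. The snippet below assumes a WebSocket signaling channel and a video element feeding the HMD viewer; the robot identifiers, signaling URL, and the attendTo() helper are hypothetical, not taken from the paper. It keeps one WebRTC peer connection per robot and swaps which 360° stream is displayed when the operator re-chooses a space.

```typescript
// Minimal sketch (not the authors' implementation): one receive-only WebRTC
// connection per robot, with the HMD viewer switched between incoming streams.
type RobotId = "dog" | "wheel";

const viewer = document.querySelector<HTMLVideoElement>("#hmd-viewer")!;
const connections = new Map<RobotId, RTCPeerConnection>();
const streams = new Map<RobotId, MediaStream>();

async function connectRobot(id: RobotId, signalingUrl: string): Promise<void> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });
  // Receive the robot's 360° camera feed; the operator side sends no media.
  pc.addTransceiver("video", { direction: "recvonly" });

  pc.ontrack = (event) => {
    streams.set(id, event.streams[0]); // remember this robot's incoming stream
  };

  // Offer/answer exchange over a hypothetical WebSocket signaling channel;
  // the robot side is assumed to reply with an SDP answer. ICE candidate
  // forwarding is omitted for brevity.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const ws = new WebSocket(signalingUrl);
  ws.onopen = () => ws.send(JSON.stringify({ robot: id, sdp: pc.localDescription }));
  ws.onmessage = async (msg) => {
    const answer = JSON.parse(msg.data);
    await pc.setRemoteDescription(answer);
  };

  connections.set(id, pc);
}

// Switch the HMD viewer to another robot's live 360° stream.
function attendTo(id: RobotId): void {
  const stream = streams.get(id);
  if (stream) {
    viewer.srcObject = stream;
    void viewer.play();
  }
}
```

In practice the client would also need to relay ICE candidates over the signaling channel and render the equirectangular 360° frames in the HMD; those details, like the walking-sensation feedback hardware, are outside the scope of this sketch.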