Do Users Behave Similarly in VR? Investigation of the User Influence on the System Design

Abstract
With the overarching goal of developing user-centric Virtual Reality (VR) systems, a new wave of studies has recently emerged that focuses on understanding how users interact in VR environments. Despite these intense efforts, however, the current literature still does not provide the right framework to fully interpret and predict users' trajectories while navigating VR scenes. This work advances the state of the art in both the study of user behaviour in VR and user-centric system design. Specifically, we complement current datasets by presenting a publicly available dataset of navigation trajectories acquired for heterogeneous omnidirectional videos and different viewing platforms, namely head-mounted display, tablet, and laptop. We then present an exhaustive analysis of the collected data to better understand navigation in VR across users, content, and, for the first time, viewing platforms. The novelty lies in the user-affinity metric, proposed in this work to investigate users' similarities when navigating within the content. The analysis reveals useful insights into the effect of device and content on navigation, which are valuable considerations from the system design perspective. As a case study of the importance of studying user behaviour when designing VR systems, we finally propose a user-centric server optimisation. We formulate an integer linear program that seeks the stored set of omnidirectional content that minimises encoding and storage costs while maximising the user's experience, taking into account network dynamics, the type of video content, and the interactivity of the user population. Experimental results show that our solution outperforms common industry recommendations not only in experienced quality but also in encoding and storage costs, achieving savings of up to 70%. More importantly, we highlight a strong correlation between the storage cost and the user-affinity metric, showing the impact of the latter on the system architecture design.
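The server optimisation summarised above can be illustrated with a toy model. In the sketch below, the candidate bitrates, quality scores, user-bandwidth mix, and the exhaustive-search solver are all illustrative assumptions standing in for the paper's actual integer linear program, which additionally models network dynamics, content type, and user-population interactivity.

```python
# Toy sketch of representation-set selection: pick the cheapest set of stored
# encodings of one omnidirectional video that keeps expected user quality
# above a floor. All numbers below are made up for illustration.
from itertools import combinations

# Candidate encoding bitrates (Mbps) and the quality score perceived when
# a user is served that representation (assumed values).
candidates = {2: 0.55, 5: 0.75, 10: 0.85, 20: 0.95}
storage_cost = {r: r for r in candidates}              # cost grows with bitrate
bandwidth_share = {2: 0.2, 5: 0.3, 10: 0.3, 20: 0.2}   # user population mix

def expected_quality(stored):
    """Each user receives the best stored representation that fits their
    bandwidth; quality is 0 if none fits."""
    q = 0.0
    for bw, share in bandwidth_share.items():
        feasible = [r for r in stored if r <= bw]
        q += share * (candidates[max(feasible)] if feasible else 0.0)
    return q

def best_set(min_quality):
    """Exhaustively search for the cheapest stored set meeting the quality
    floor (a brute-force stand-in for an integer linear program)."""
    best = None
    for k in range(1, len(candidates) + 1):
        for stored in combinations(candidates, k):
            if expected_quality(stored) >= min_quality:
                cost = sum(storage_cost[r] for r in stored)
                if best is None or cost < best[0]:
                    best = (cost, stored)
    return best

print(best_set(0.7))
```

At realistic catalogue sizes the exhaustive search becomes intractable, and the selection is instead solved as an integer linear program with a dedicated solver (the authors cite CPLEX); the toy model only conveys the cost-versus-quality trade-off the formulation captures.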