Sound Scope Phone: Focusing Parts by Natural Movement

ABSTRACT
This paper describes Sound Scope Phone, an application that lets listeners emphasize the part they want to hear in a song consisting of multiple parts by turning their head or making hand gestures. The previously proposed interface required special headphones equipped with a digital compass and a distance sensor to detect the head direction and the distance between the head and a hand, respectively. Sound Scope Phone instead detects head direction on a commercially available smartphone by combining face-tracking information from images captured by the front camera with readings from the built-in acceleration/gyro sensors. The resulting application is published on the Apple App Store under the name SoundScopePhone.
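The paper does not give implementation details, but the approach it outlines — fusing a drift-free but low-rate camera estimate of head yaw with a high-rate but drifting gyro integration, then emphasizing the parts nearest the facing direction — can be sketched as follows. This is a minimal illustration, not the app's actual code; the function names, the complementary-filter weight `alpha`, and the Gaussian emphasis curve are all assumptions for the sketch.

```python
import math

def fuse_yaw(prev_yaw, gyro_rate, dt, camera_yaw=None, alpha=0.98):
    """Estimate head yaw (degrees) with a complementary filter.

    Integrate the gyro rate every frame; when a face-tracking yaw
    estimate from the front camera is available, blend it in to
    correct the slow drift of pure gyro integration.
    """
    yaw = prev_yaw + gyro_rate * dt  # dead-reckon from the gyro
    if camera_yaw is not None:
        yaw = alpha * yaw + (1.0 - alpha) * camera_yaw  # drift correction
    return yaw

def part_gains(yaw, part_angles, width=30.0):
    """Map head yaw to per-part gains.

    Each part sits at a fixed angle around the listener; parts close
    to the facing direction get gain near 1, others are attenuated
    by a Gaussian falloff of the given width (degrees).
    """
    return [math.exp(-((yaw - a) / width) ** 2) for a in part_angles]
```

For example, facing a vocal part placed at 0° while a guitar sits at 90° would leave the vocal at full gain and the guitar strongly attenuated; turning the head toward 90° reverses the emphasis.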