ABSTRACT
The real-time synthesis of 3D spatial audio has many applications, from virtual reality to navigation for the visually impaired. Head-related transfer functions (HRTFs) can be used to generate spatial audio based on a model of the user's head. Previous studies have focused on the creation and interpolation of these functions with little regard for real-time performance. In this paper, we present an FPGA-based platform for real-time synthesis of spatial audio using FIR filters created from head-related transfer functions. For performance reasons, we run filtering, crossfading, and audio output on FPGA fabric, while calculating audio source locations and storing audio files on the CPU. We use a head-mounted 9-axis IMU to track the user's head in real time and adjust relative spatial audio locations to create the perception that audio sources are fixed in space. Our system, running on a Xilinx Zynq Z-7020, supports 4X more audio sources than a comparable GPU and 8X more sources than a CPU while maintaining sub-millisecond latency and comparable power consumption. Furthermore, we show how our system can be leveraged to communicate the location of landmarks and obstacles to a visually impaired user during a sailing race or other navigation scenario. We test our system with multiple users and show that, as a result of our reduced latency, a user is able to locate a virtual audio source with a high degree of accuracy and navigate toward it.
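To make the filtering and crossfading steps concrete, the following is a minimal C sketch of HRTF-based binaural rendering: each ear's output is the mono source convolved with a head-related impulse response (HRIR), and when head tracking selects a new HRIR the output is crossfaded from the old filter to the new one over a block to avoid clicks. All names, the 128-tap filter length, block size, and the linear crossfade ramp are illustrative assumptions, not the paper's FPGA implementation.

```c
/* Minimal sketch of HRTF-based FIR filtering with crossfading.
 * Tap count, block size, and the linear ramp are assumptions. */
#include <stddef.h>

#define HRIR_LEN 128   /* assumed FIR taps per ear */

/* Convolve one block of mono input with an HRIR; `history` holds the
 * previous HRIR_LEN-1 input samples so blocks join seamlessly. */
static void fir_block(const float *in, float *out, size_t n,
                      const float *hrir, float *history)
{
    for (size_t i = 0; i < n; ++i) {
        float acc = 0.0f;
        for (size_t k = 0; k < HRIR_LEN; ++k) {
            /* use the current block when possible, otherwise the history */
            float s = (i >= k) ? in[i - k]
                               : history[HRIR_LEN - 1 - (k - i)];
            acc += hrir[k] * s;
        }
        out[i] = acc;
    }
    /* keep the last HRIR_LEN-1 inputs for the next block (assumes n >= HRIR_LEN-1) */
    for (size_t k = 0; k < HRIR_LEN - 1; ++k)
        history[k] = in[n - (HRIR_LEN - 1) + k];
}

/* Render one ear: filter the block with both the previous and the newly
 * selected HRIR, then crossfade linearly across the block so the source
 * direction can change without audible discontinuities. */
void render_ear(const float *mono, float *ear_out, size_t n,
                const float *hrir_old, const float *hrir_new,
                float *hist_old, float *hist_new)
{
    float y_old[256], y_new[256];              /* assumes n <= 256 and n >= 2 */
    fir_block(mono, y_old, n, hrir_old, hist_old);
    fir_block(mono, y_new, n, hrir_new, hist_new);
    for (size_t i = 0; i < n; ++i) {
        float w = (float)i / (float)(n - 1);   /* ramp 0 -> 1 over the block */
        ear_out[i] = (1.0f - w) * y_old[i] + w * y_new[i];
    }
}
```

In the architecture described above, this per-sample multiply-accumulate and crossfade work is what runs in FPGA fabric, while the CPU's role is limited to choosing which HRIR pair applies given the head-tracked source direction and streaming the source audio.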