ABSTRACT
Visual displays remain the primary interface in vehicles, but they may not always provide timely, appropriate feedback to drivers. Multimodal displays have been developed to compensate for the drawbacks of visual displays, yet their application has been limited to a few areas (e.g., collision warning sounds). The present paper introduces advanced vehicle sonification applications: two of our ongoing projects (fuel efficiency sonification and driver emotion sonification) and a plausible future project (nearby traffic sonification). Applicable sonification techniques and solutions are also discussed. Sonification in these areas can be an effective, unobtrusive means of increasing drivers' situation awareness and engagement with driving, which in turn will improve road safety. To implement these applications successfully, iterative and intensive assessment of driver needs, of each application's effectiveness, and of its impact on driver distraction and road safety should be conducted.
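As an illustration of the kind of technique the paper discusses, a fuel efficiency sonification could use simple parameter mapping, in which a continuous data stream (instantaneous fuel efficiency) controls an acoustic parameter (pitch). The sketch below is not from the paper; the function names and the mpg and frequency ranges are hypothetical choices for demonstration only.

```python
import math

def mpg_to_frequency(mpg, mpg_min=10.0, mpg_max=60.0,
                     f_min=220.0, f_max=880.0):
    """Map a fuel-efficiency reading (mpg) to a tone frequency in Hz.

    Readings are clamped to [mpg_min, mpg_max]; higher efficiency
    yields a higher pitch, so drivers can track efficiency by ear
    without glancing at a visual gauge.
    """
    mpg = max(mpg_min, min(mpg_max, mpg))
    fraction = (mpg - mpg_min) / (mpg_max - mpg_min)
    return f_min + fraction * (f_max - f_min)

def sine_tone(freq, duration=0.25, sample_rate=44100):
    """Render a mono sine tone as a list of float samples in [-1, 1]."""
    n = int(duration * sample_rate)
    return [math.sin(2 * math.pi * freq * i / sample_rate)
            for i in range(n)]

# An efficient reading maps to a higher pitch than a wasteful one.
print(mpg_to_frequency(55.0) > mpg_to_frequency(15.0))  # True
```

In a real in-vehicle system the rendered samples would be streamed to an audio device and the mapping tuned psychoacoustically (e.g., using a logarithmic pitch scale), but the linear mapping above captures the basic idea.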
Advanced Vehicle Sonification Applications