DOI: 10.1145/3543174.3546841

Can Eyes on a Car Reduce Traffic Accidents?

Published: 17 September 2022
Abstract

Various car manufacturers and researchers have explored the idea of adding eyes to a car as an additional communication modality. A previous study demonstrated that autonomous vehicles' (AVs) eyes help pedestrians make faster street-crossing decisions. In this study, we examine a more critical question: "Can eyes reduce traffic accidents?" To answer this question, we consider a critical street-crossing situation in which a pedestrian is in a hurry to cross the street. If the car is not looking at the pedestrian, this implies that the car has not recognized the pedestrian; the pedestrian can therefore judge that they should not cross the street, avoiding a potential traffic accident. We conducted an empirical study using 360-degree video recordings of an actual car equipped with robotic eyes. The results showed that the eyes can reduce potential traffic accidents and that gaze direction can increase pedestrians' subjective feelings of safety and danger. In addition, the results showed gender differences in critical and noncritical scenarios in AV-to-pedestrian interaction.

    Published In

    AutomotiveUI '22: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
    September 2022
    371 pages
ISBN: 9781450394154
DOI: 10.1145/3543174

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Autonomous Vehicles
    2. Empirical Study
    3. Gaze Directions
    4. Real Car Prototype
    5. Vehicle-to-Pedestrian Interaction

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • JST CREST

    Conference

    AutomotiveUI '22

    Acceptance Rates

    Overall Acceptance Rate 248 of 566 submissions, 44%
