ABSTRACT
People with low vision face many challenges in their daily lives: traditional visual enhancements are not sufficient for navigating indoor environments or recognizing objects efficiently. In this paper, we explore how Augmented Reality (AR) can be leveraged to design mobile applications that improve the visual experience of people with low vision. Specifically, we propose a novel automated AR-based annotation tool that detects and labels salient objects for assisted indoor navigation applications such as NearbyExplorer. NearbyExplorer, which issues audio descriptions of nearby objects to its users, relies on a database populated by large teams of volunteers and map-a-thons who manually annotate salient objects in the environment, such as desks, chairs, and low overhead ceilings; this reliance has limited widespread and rapid deployment. Our tool builds on advances in automated object detection, AR labeling, and accurate indoor positioning to upload object labels and user positions to the database automatically, requiring just one volunteer. Moreover, it enables people with low vision to detect and notice surrounding objects quickly using a smartphone in a variety of indoor environments.
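The annotation pipeline described above (automated detection, labeling, and upload of object/position records by a single volunteer) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the `ObjectAnnotation` record fields, the `annotate_frame` helper, the `(label, confidence)` detection format, and the confidence threshold are all assumptions for illustration.

```python
from dataclasses import dataclass, asdict

@dataclass
class ObjectAnnotation:
    """One database record: a detected object paired with the
    annotator's indoor position (hypothetical schema)."""
    label: str        # detected object class, e.g. "chair"
    confidence: float # detector confidence in [0, 1]
    x: float          # indoor position, meters in the building frame
    y: float
    floor: int

def annotate_frame(detections, position, min_confidence=0.5):
    """Pair each sufficiently confident detection with the current
    device position, producing records ready for upload."""
    x, y, floor = position
    return [
        ObjectAnnotation(label, conf, x, y, floor)
        for label, conf in detections
        if conf >= min_confidence
    ]

# Detections as (label, confidence) pairs, e.g. from an object
# detector such as a YOLOv3 model; position from indoor localization.
records = annotate_frame(
    detections=[("desk", 0.91), ("chair", 0.84), ("plant", 0.32)],
    position=(12.4, 7.9, 2),
)
payload = [asdict(r) for r in records]  # JSON-serializable upload payload
```

A volunteer walking through a building would run this per camera frame, so the database accumulates labeled objects with positions without manual map-a-thon annotation.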
REFERENCES
- Dragan Ahmetovic, Cole Gleason, Kris M. Kitani, Hironobu Takagi, and Chieko Asakawa. 2016. NavCog: Turn-by-Turn Smartphone Navigation Assistant for People with Visual Impairments or Blindness. In Proceedings of the 13th International Web for All Conference (Montreal, Canada) (W4A ’16). Association for Computing Machinery, New York, NY, USA, Article 9, 2 pages. https://doi.org/10.1145/2899475.2899509
- Dragan Ahmetovic, Daisuke Sato, Uran Oh, Tatsuya Ishihara, Kris Kitani, and Chieko Asakawa. 2020. ReCog: Supporting Blind People in Recognizing Personal Objects. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, 1–12.
- AOA (American Optometric Association). 2015. Common Types of Low Vision.
- Shelly Brisbin. 2016. The Nearby Explorer Blindness-Focused Navigation App from APH Comes to iOS. http://www.afb.org/aw/17/11/15388 [Online; retrieved 17-June-2021].
- Cole Gleason, Anhong Guo, Gierad Laput, Kris Kitani, and Jeffrey P. Bigham. 2016. VizMap: Accessible Visual Information Through Crowdsourced Map Reconstruction. In Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility (Reno, Nevada, USA) (ASSETS ’16). Association for Computing Machinery, New York, NY, USA, 273–274.
- Apple Inc. 2017. ARKit. https://developer.apple.com/documentation/arkit
- Tara Brown-Ogilvie, Jenna Beresheim-Kools, Amy T. Parker, and Martin Swobodzinski. [n.d.]. The Use of Wayfinding Apps by Deafblind Travelers in an Urban Environment: Insights From Focus Groups. Frontiers in Education ([n.d.]).
- Joseph Redmon and Ali Farhadi. 2018. YOLOv3: An Incremental Improvement. arXiv preprint arXiv:1804.02767.
- Manaswi Saha, Alexander J. Fiannaca, Melanie Kneisel, Edward Cutrell, and Meredith Ringel Morris. 2019. Closing the Gap: Designing for the Last-Few-Meters Wayfinding Problem for People with Visual Impairments. In The 21st International ACM SIGACCESS Conference on Computers and Accessibility (Pittsburgh, PA, USA) (ASSETS ’19). Association for Computing Machinery, New York, NY, USA, 222–235. https://doi.org/10.1145/3308561.3353776
- Manaswi Saha, Michael Saugstad, Hanuma Teja Maddali, Aileen Zeng, Ryan Holland, Steven Bower, Aditya Dash, Sage Chen, Anthony Li, Kotaro Hara, and Jon Froehlich. 2019. Project Sidewalk: A Web-Based Crowdsourcing Tool for Collecting Sidewalk Accessibility Data At Scale. Association for Computing Machinery, New York, NY, USA, 1–14. https://doi.org/10.1145/3290605.3300292
- Martin Swobodzinski and Amy T. Parker. 2019. Electronic Wayfinding for Visually Impaired Travelers: Limitations and Opportunities. https://ppms.trec.pdx.edu/media/project_files/1177_ProjectBrief_3FR1hWw.pdf [Online; retrieved 17-June-2021].
- Yuhang Zhao, Michele Hu, Shafeka Hashash, and Shiri Azenkot. 2017. Understanding Low Vision People’s Visual Perception on Commercial Augmented Reality Glasses. ACM CHI.
- Yuhang Zhao, Elizabeth Kupferstein, Brenda V. Castro, Steven Feiner, and Shiri Azenkot. 2019. Designing AR Visualizations to Facilitate Stair Navigation for People with Low Vision. UIST 2019.
- Yuhang Zhao, Elizabeth Kupferstein, Hathaitorn Rojnirun, Leah Findlater, and Shiri Azenkot. 2020. The Effectiveness of Visual and Audio Wayfinding Guidance on Smartglasses for People with Low Vision. CHI.
- Yuhang Zhao, Sarit Szpiro, and Shiri Azenkot. 2015. ForSee: A Customizable Head-Mounted Vision Enhancement System for People with Low Vision. International ACM SIGACCESS Conference on Computers and Accessibility, 239–249.
- Yuhang Zhao, Sarit Szpiro, Jonathan Knighten, and Shiri Azenkot. 2016. CueSee: Exploring Visual Cues for People with Low Vision to Facilitate a Visual Search Task. ACM UbiComp, 73–84.
- Yuhang Zhao, Sarit Szpiro, Lei Shi, and Shiri Azenkot. 2019. Designing and Evaluating a Customizable Head-mounted Vision Enhancement System for People with Low Vision. ACM Transactions on Accessible Computing.