ABSTRACT

Interaction with large touch surfaces is still a relatively young domain, particularly regarding the accessibility solutions offered to blind users. Their smaller mobile counterparts ship with built-in accessibility features that enable non-visual exploration of linearized screen content. However, it is unknown how well these solutions transfer to large interactive surfaces, which use more complex spatial content layouts. We report on a user study with 14 blind participants performing common touchscreen interactions using one- and two-hand exploration. We investigate the exploration strategies blind users apply when interacting with a tabletop, and identify six basic strategies that were commonly adopted and should be considered in future designs. We conclude with implications for the design of accessible large touch interfaces.
Blind People Interacting with Large Touch Surfaces

Tiago Guerreiro
Kyle Montague
Hugo Nicolau

