Abstract
With content rapidly moving to the electronic space, access to graphics for individuals with visual impairments is a growing concern. Recent research has demonstrated the potential for representing basic graphical content on touchscreens using vibrations and sounds, yet few guidelines or processes exist to guide the design of multimodal, touchscreen-based graphics. In this work, we address this gap by synthesizing our collective research efforts over the past eight years and distilling our findings into a compilation of recommendations, which we validate through an iterative design process and a user study. We start by reviewing previous work and then collate the findings into a set of design guidelines for generating the basic elements of touchscreen-based multimodal graphics. We then use these guidelines to generate example graphics in mathematics, specifically bar charts and geometry concepts. We discuss the iterative design process of moving from guidelines to actual graphics and highlight the challenges involved. We then present a formal user study with 22 participants with visual impairments, comparing learning performance with touchscreen-rendered graphics to that with embossed graphics. We conclude with qualitative feedback from participants on the touchscreen-based approach and identify areas for future investigation as these recommendations are expanded to include more complex graphical concepts.
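To make the interaction model concrete, the sketch below illustrates, in Swift using Apple's Core Haptics API, how a touchscreen view might render a single element of a bar chart non-visually: the device vibrates while the user's finger rests on the bar and falls silent elsewhere. This is only a minimal illustration of the general vibro-audio approach described above, not the authors' implementation; the class name, bar geometry, and intensity/sharpness values are hypothetical assumptions.

```swift
import UIKit
import CoreHaptics

// Illustrative sketch only: a view that vibrates while the user's finger is
// inside a single (hypothetical) bar of a bar chart.
class VibroAudioBarView: UIView {
    private var engine: CHHapticEngine?
    private var player: CHHapticAdvancedPatternPlayer?
    private var isVibrating = false
    // Assumed bar geometry, in view coordinates.
    private let barFrame = CGRect(x: 40, y: 100, width: 60, height: 300)

    override func didMoveToWindow() {
        super.didMoveToWindow()
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try? CHHapticEngine()
        try? engine?.start()

        // One long continuous vibration; intensity/sharpness values are assumptions.
        let vibration = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
            ],
            relativeTime: 0,
            duration: 60)
        if let pattern = try? CHHapticPattern(events: [vibration], parameters: []) {
            player = try? engine?.makeAdvancedPlayer(with: pattern)
            player?.loopEnabled = true   // keep vibrating until explicitly stopped
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        let onBar = barFrame.contains(point)
        if onBar && !isVibrating {
            try? player?.start(atTime: CHHapticTimeImmediate)   // finger entered the bar
            isVibrating = true
        } else if !onBar && isVibrating {
            try? player?.stop(atTime: CHHapticTimeImmediate)    // finger left the bar
            isVibrating = false
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        try? player?.stop(atTime: CHHapticTimeImmediate)
        isVibrating = false
    }
}
```

In a full application, each rendered element (bars, lines, axes, labels) would carry its own haptic and auditory encoding; choosing those encodings consistently is the design space the guidelines summarized above address.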