Research Article (Public Access)

Design Guidelines and Recommendations for Multimodal, Touchscreen-based Graphics

Published: 3 August 2020

Abstract

With content rapidly moving to the electronic space, access to graphics for individuals with visual impairments is a growing concern. Recent research has demonstrated the potential for representing basic graphical content on touchscreens using vibrations and sounds, yet few guidelines or processes exist to guide the design of multimodal, touchscreen-based graphics. In this work, we address this gap by synthesizing our collective research efforts over the past eight years into a compilation of recommendations, which we validate through an iterative design process and user study. We start by reviewing previous work and then collate our findings into a set of design guidelines for generating the basic elements of touchscreen-based multimodal graphics. We then use these guidelines to generate exemplary graphics in mathematics, specifically bar charts and geometry concepts. We discuss the iterative design process of moving from guidelines to actual graphics and highlight its challenges. We then present a formal user study with 22 participants with visual impairments, comparing learning performance with touchscreen-rendered graphics to that with embossed graphics. We conclude with qualitative feedback from participants on the touchscreen-based approach and identify areas of future investigation as these recommendations are expanded to cover more complex graphical concepts.
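The core rendering idea behind such multimodal graphics, firing vibration or sound only while the user's finger rests on a graphical element, can be sketched as a hit test over the chart's geometry. The sketch below is an illustrative reconstruction only, not the authors' implementation; all names, coordinates, and the particular feedback pairing are hypothetical.

```python
# Illustrative sketch: hit-testing a touch point against bar-chart regions
# to choose multimodal feedback (vibration inside a bar, speech for its value).
# All identifiers and values here are hypothetical, not from the article.
from dataclasses import dataclass

@dataclass
class Bar:
    label: str
    x: float       # left edge in screen points
    width: float
    height: float  # bar height measured up from the baseline

def feedback_for_touch(bars, baseline_y, x, y):
    """Return a (modality, payload) pair for a touch at (x, y).

    A bar is 'hit' when the touch falls within its horizontal extent and
    between the baseline and the bar's top; otherwise no feedback fires,
    which a non-visual user perceives as empty space.
    """
    for bar in bars:
        if bar.x <= x <= bar.x + bar.width and baseline_y - bar.height <= y <= baseline_y:
            # Vibrate while the finger is on the bar; speak its value on entry.
            return ("vibration+speech", f"{bar.label}: {bar.height:g}")
    return ("none", "")

bars = [Bar("Q1", 0, 50, 120), Bar("Q2", 60, 50, 200)]
print(feedback_for_touch(bars, baseline_y=300, x=25, y=250))  # inside Q1
print(feedback_for_touch(bars, baseline_y=300, x=25, y=100))  # above Q1's top
```

In a deployed app the returned modality would drive the platform's haptic and speech APIs (e.g., Core Haptics and a text-to-speech engine on iOS); here it is returned as data so the mapping logic stands alone.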



• Published in

  ACM Transactions on Accessible Computing, Volume 13, Issue 3, September 2020, 152 pages
  ISSN: 1936-7228
  EISSN: 1936-7236
  DOI: 10.1145/3415159

                  Copyright © 2020 ACM

                  Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 September 2019
• Revised: 1 February 2020
• Accepted: 1 May 2020
• Published: 3 August 2020


                  Qualifiers

                  • research-article
                  • Research
                  • Refereed
