CHI Conference Proceedings · DOI: 10.1145/3334480.3382897

Combining Eye Tracking and Verbal Response to Understand the Impact of a Global Filter

Published: 25 April 2020

ABSTRACT

Visual attention integrates two processing streams: the global stream, which rapidly processes the overall scene, and the local stream, which processes details. For people with autism, the integration of these two streams can be disrupted by a tendency to privilege details (local processing) over the big picture (global processing). Consequently, people with autism may struggle with typical visual attention, as evidenced by their verbal descriptions of local features when asked to describe overall scenes. This paper explores how one adult with autism sees and understands a global filter applied to natural scenes.
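The abstract does not specify how the global filter is implemented. One common way to realize a filter that suppresses local detail while preserving global scene structure is low-pass filtering, sketched below as a separable Gaussian blur in pure Python; all function names here are illustrative assumptions, not the authors' code.

```python
import math

def gaussian_kernel(sigma, radius=None):
    """Build a normalized 1-D Gaussian kernel."""
    if radius is None:
        radius = max(1, int(3 * sigma))  # cover ~3 standard deviations
    vals = [math.exp(-(x * x) / (2.0 * sigma * sigma))
            for x in range(-radius, radius + 1)]
    total = sum(vals)
    return [v / total for v in vals]

def blur_rows(img, kernel):
    """Convolve each row of a 2-D grayscale image with the kernel (edges clamped)."""
    r = len(kernel) // 2
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for k, kv in enumerate(kernel):
                xx = min(max(x + k - r, 0), w - 1)  # clamp at image borders
                acc += kv * img[y][xx]
            out[y][x] = acc
    return out

def global_filter(img, sigma=2.0):
    """Low-pass 'global' filter: a 2-D Gaussian blur done as two separable 1-D passes
    (rows, then columns via transpose), removing local detail but keeping coarse structure."""
    kernel = gaussian_kernel(sigma)
    blurred = blur_rows(img, kernel)                  # horizontal pass
    transposed = [list(col) for col in zip(*blurred)]
    blurred = blur_rows(transposed, kernel)           # vertical pass
    return [list(col) for col in zip(*blurred)]
```

For example, applying `global_filter` to an image containing a single bright pixel spreads that pixel's intensity over its neighborhood while conserving total brightness, which is the sense in which a low-pass filter preserves global information while removing fine detail.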

