Abstract
In the age of ubiquitous visual technologies and systems, our perceptive apparatuses are constantly challenged, adapted, and shaped by instruments and machines, rendering the observing body an active site of knowledge. Your Eye's Motion by Luna is an interactive installation that uses real-time eye-tracking to control a robotic creature named Luna (Figure 1). As Luna materializes eye movements through a wondrous spectacle of light, motion, and color, the observer becomes conscious of her gaze enacted and extended by a robotic counterpart. Drawing on theories of vision from cybernetics, visual studies, the embodied-mind tradition, and related fields, the project explores how our perceptual apparatuses and bodies are reconfigured in relation to machines and the environment to afford new ways of seeing. Once we see how observing bodies accommodate feedback from action to cognition, we can uncover the embodied and affective potential of eye movement as an interface for robotics. Luna's curiosity invests in this potential, articulating a unity between our embodied percepts and machinic environments to create a "vision machine."
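The abstract's core interaction, gaze as a control interface for a robot, can be illustrated with a minimal sketch. This is not the artists' implementation; the function names, screen dimensions, and motor ranges below are all hypothetical, and the smoothing step simply reflects the common practice of filtering saccadic jitter before actuating motors.

```python
def gaze_to_pan_tilt(gx, gy, width=1920, height=1080,
                     pan_range=90.0, tilt_range=60.0):
    """Map a raw gaze point (pixels) to hypothetical pan/tilt angles (degrees).

    The screen center maps to (0, 0); edges map to half the motor range.
    All parameters are illustrative assumptions, not values from the work.
    """
    nx = min(max(gx / width, 0.0), 1.0)   # clamp to the screen, normalize to [0, 1]
    ny = min(max(gy / height, 0.0), 1.0)
    pan = (nx - 0.5) * pan_range          # left/right of center -> negative/positive pan
    tilt = (0.5 - ny) * tilt_range        # screen y grows downward, so invert for tilt
    return pan, tilt

def smooth(prev, new, alpha=0.2):
    """Exponential smoothing so rapid saccades don't jerk the robot's motors."""
    return prev + alpha * (new - prev)
```

A control loop would repeatedly read gaze samples from the eye tracker, pass them through `gaze_to_pan_tilt` and `smooth`, and send the result to the robot's servos; the low-pass filter is one simple way the machine "accommodates" the restless motion of the eye.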
Supplemental Material
Available for Download
Supplemental movie, appendix, image, and software files for Instruments of Vision: Eye-Tracking and Robotics as an Embodied Interface