BAVRD 2024

Presenter: Jiwon Yeon, PhD

Institution: Stanford University

Poster Title: A Dataset of Eye Movements Under Different Cognitive Tasks in the Same Sensory Environments

Abstract: How much of our internal goals is expressed in the pattern of eye movements we make? To answer this question, we created a dataset that we are preparing for public release. Participants (n = 15) wore wearable eye-tracking glasses (Neon, Pupil Labs, Germany) and performed tasks with different goals in the same sensory environments. In one pair of tasks, after walking the same path, participants were either asked about their surroundings (walking-no-memory task) or tested on a list of 20 words they had memorized before the walk (walking-memory task); the order of the two tasks was counterbalanced across participants. In another pair, participants constructed structures (LEGO-building task; 5 min) and then deconstructed the structures and sorted the LEGO blocks by color (LEGO-sorting task; until completed). We validated head-movement (IMU) data in a separate session, calibrated eye positions three times during the experiment, and validated the gaze data using the relationship between saccade peak velocity and amplitude (the main sequence). As a first step of our analysis, we examined saccadic eye-movement profiles, classifying saccades with the REMoDNaV algorithm (Dar, Wagner, & Hanke, 2021). The two walking tasks showed no difference in saccade behavior (all p's > 0.05). Compared to the LEGO-sorting task (M = 12.585°, SE = 0.422°), the LEGO-building task showed smaller saccade amplitudes (M = 11.204°, SE = 0.399°; t = -2.3, p = 0.03) and longer intersaccadic intervals (building: M = 0.713 s, SE = 0.036 s; sorting: M = 0.516 s, SE = 0.014 s; t = 4.08, p < 0.01). Our results suggest that internal goals can be distinguished from eye-movement patterns even in identical sensory environments, and they open up the possibility that computational modeling could be used to infer participants' internal goals from their eye movements.
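
For readers who want to reproduce the event classification, below is a minimal sketch using REMoDNaV's Python API as described in the package documentation. The file name, CSV layout, and pixel-to-degree factor are hypothetical placeholders, and the 200 Hz sampling rate is an assumption based on the Neon glasses' nominal gaze rate; only the EyegazeClassifier calls come from the package itself.

    import numpy as np
    from remodnav.clf import EyegazeClassifier

    # Hypothetical gaze recording: a CSV with 'x' and 'y' columns in pixels.
    data = np.genfromtxt('gaze.csv', delimiter=',', names=True)

    clf = EyegazeClassifier(
        px2deg=0.027,         # assumed pixel-to-degree conversion for this setup
        sampling_rate=200.0,  # assumed; Neon reports gaze at a nominal 200 Hz
    )
    pp = clf.preproc(data)    # filtering and velocity computation
    events = clf(pp)          # list of event dicts (label, times, amplitude, peak velocity)

    # Keep saccades (REMoDNaV labels major and inter-PSO saccades separately).
    saccades = [ev for ev in events if ev['label'] in ('SACC', 'ISAC')]
    amps = np.array([ev['amp'] for ev in saccades])        # degrees
    pvels = np.array([ev['peak_vel'] for ev in saccades])  # deg/s

    # Intersaccadic intervals: offset of one saccade to onset of the next.
    isi = np.array([b['start_time'] - a['end_time']
                    for a, b in zip(saccades[:-1], saccades[1:])])

    # Main-sequence check used for data validation: peak velocity should
    # scale with amplitude roughly as a power law (fit in log-log space).
    slope, intercept = np.polyfit(np.log(amps), np.log(pvels), 1)
    print(f"mean amplitude {amps.mean():.2f} deg, mean ISI {isi.mean():.3f} s, "
          f"main-sequence exponent {slope:.2f}")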
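
The condition comparison can be sketched the same way. Because the tasks were performed within subject, a paired t-test over per-participant means is a natural choice, though the abstract does not state which test was used; the arrays below are random placeholders standing in for the real per-participant means.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Placeholder per-participant mean saccade amplitudes (deg), n = 15.
    building = rng.normal(11.2, 1.5, size=15)   # LEGO-building task
    sorting = rng.normal(12.6, 1.6, size=15)    # LEGO-sorting task

    t, p = stats.ttest_rel(building, sorting)   # paired comparison across participants
    print(f"building: M = {building.mean():.3f}, SE = {stats.sem(building):.3f}")
    print(f"sorting:  M = {sorting.mean():.3f}, SE = {stats.sem(sorting):.3f}")
    print(f"t({len(building) - 1}) = {t:.2f}, p = {p:.3f}")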