Active fixational control in a real-world grooming task

Poster Presentation 0.000: Monday, May 18, 2026, 8:30 am – 12:30 pm, Pavilion
Session: Eye Movements: Natural, complex tasks

Ruitao Lin1, Jie Z. Wang1, Michele A. Cox1, Paul Jolly1, Yuanhao H. Li1, Soma Mizobuchi1, T. Scott Murdison2, Alina Neverodska1, Zhetuo Zhao1, Michele Rucci1; 1University of Rochester, 2Reality Labs

During natural visual exploration, head and eye movements cooperate to control the motion of the stimulus on the retina. Previous research with head immobilization has shown that observers regulate their fixational drifts according to task demands: retinal image motion is attenuated when fine spatial discrimination is required. Here we asked whether similar tuning of retinal motion occurs under fully natural, head-free conditions, in a dynamic task in which demands for fine spatial vision and visuomotor coordination change over time. Observers (N = 7) performed a complex visual search and motor-interaction task inspired by primate social grooming. They searched for small, colorful “fleas” (2.5-mm radius spheres) hidden in a box filled with brown crinkled paper simulating fur. Observers freely moved their head and hands, using tweezers to manipulate the “fur,” locate the fleas, and deposit them into a container. Head, eye, and hand movements were continuously and precisely recorded with a custom apparatus combining a high-precision motion-capture system with a purpose-built magnetic-induction eye tracker. Observers wore scleral coils in both eyes and a tight-fitting helmet with markers for head tracking. Tweezer position and state (open/closed) were tracked with additional markers. Retinal image motion was reconstructed from head and eye measurements using a two-nodal-point eye model. Retinal motion traces showed the characteristic alternation between saccades and drifts. Critically, intersaccadic drift exhibited systematic modulation during the task: retinal speed declined within individual fixations and was markedly attenuated in the seconds preceding the grasping of a flea. These changes increased the relative power of higher spatial frequencies both within each fixation and across the trial.
The results parallel previous findings under head immobilization and demonstrate continuous, task-dependent tuning of retinal motion during natural behavior, revealing a dynamic oculomotor strategy that adaptively reduces retinal motion to support fine spatial processing in complex, naturalistic tasks.
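The abstract does not specify the reconstruction in code. As an illustration only, the sketch below shows one way retinal image position and speed could be computed under a two-nodal-point schematic eye; the nodal distances are textbook schematic-eye values, and all function names and parameters are our assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative two-nodal-point schematic-eye parameters (assumed values, in
# meters behind the cornea); not taken from the abstract.
N1 = 7.1e-3      # anterior nodal point
N2 = 7.4e-3      # posterior nodal point
RETINA = 24.0e-3  # axial length: cornea to retina

def retinal_offset(target_world, eye_pos, gaze_dir):
    """Retinal image position (m from the fovea) of a world point.

    In a two-nodal-point model, the ray's angle at the anterior nodal
    point equals its angle leaving the posterior nodal point, so the
    image lands at (RETINA - N2) * tan(theta), where theta is the
    target's eccentricity from the gaze (optical) axis.
    """
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    n1 = eye_pos + N1 * gaze_dir                 # anterior nodal point in world
    ray = target_world - n1
    ray = ray / np.linalg.norm(ray)
    cos_t = np.clip(ray @ gaze_dir, -1.0, 1.0)
    theta = np.arccos(cos_t)                     # eccentricity of the target
    return (RETINA - N2) * np.tan(theta)

def retinal_speed(target, eye_positions, gaze_dirs, dt):
    """Finite-difference retinal speed (m/s) across tracked samples."""
    r = np.array([retinal_offset(target, e, g)
                  for e, g in zip(eye_positions, gaze_dirs)])
    return np.abs(np.diff(r)) / dt
```

In practice, `eye_pos` and `gaze_dir` would be derived per sample from the combined head-tracker and coil measurements; a foveated target on the gaze axis yields zero offset, and a 1-degree gaze rotation displaces a distant target's image by roughly 0.3 mm on the retina.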

Acknowledgements: Research supported by Reality Labs, NIH EY18363 and P30 EY001319