Tracking visual targets during simulated self-motion

Poster Presentation 26.309: Saturday, May 18, 2024, 2:45 – 6:45 pm, Banyan Breezeway
Session: Motion: Optic flow

Matt D. Anderson, Jorge Otero-Millan, Emily A. Cooper; University of California, Berkeley

Movements of the head and body create visual motion across the retina, often called optic flow. When we try to fixate on a visual target embedded in optic flow, eye movements are required to compensate for the target's retinal motion and stabilize its image on the fovea. Previous research (Niemann et al., 1998; Lappe et al., 1998), however, has demonstrated that passive tracking of targets embedded in optic flow can be undercompensatory (gain < 1): stimulus speed exceeds eye speed and the foveal image is not stable. Undercompensation was observed in experiments simulating forward self-motion over a rendered ground plane. Features on the ground plane were rendered at a fixed scale and distributed uniformly in world space, such that perspective projection created a density gradient on the retina. In this scenario, the average of perifoveal motion signals is biased towards slower speeds because feature density is maximized towards the horizon, where speed approaches zero. If perifoveal features influence tracking speed, this could explain the low gain. However, the contrast energy of natural images is, on average, scale-invariant (i.e., fractal). Thus, natural scenes should not contain the same gradient in visual feature density. We therefore predicted that passive visual tracking during forward translation over naturalistic ground textures should show a gain closer to unity. We tested this prediction by asking observers to passively view forward-translation optic flow with different ground density patterns, while measuring eye movements with a video-based eye tracker. Ground-plane textures had uniform feature density in either world or retinal coordinates. Our results suggest that tracking gain was higher for stimuli with uniform retinal density. If feature tracking during self-motion is sensitive to the spatial distribution of motion signals, then investigating the spatial properties of optic flow in natural scenes is an important element of modelling oculomotor behavior.
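The density-gradient argument above can be illustrated numerically. The following is a minimal sketch (not the authors' stimulus code) comparing the average angular speed of ground-plane features sampled uniformly in world distance versus uniformly in retinal angle; the eye height, translation speed, and distance range are hypothetical values chosen only for illustration.

```python
import numpy as np

# Hypothetical viewing geometry (illustrative values, not from the study).
h = 1.6                    # eye height above the ground plane (m)
T = 1.5                    # forward translation speed (m/s)
d_min, d_max = 2.0, 100.0  # range of ground distances ahead (m)

rng = np.random.default_rng(0)
n = 100_000

def angular_speed(d):
    # A point at ground distance d lies at depression angle
    # theta = arctan(h / d); differentiating with dd/dt = -T gives
    # d(theta)/dt = T * h / (d**2 + h**2), which -> 0 as d -> infinity.
    return T * h / (d**2 + h**2)

# Features uniform in WORLD coordinates: distances drawn uniformly in d.
# Perspective projection piles these features up near the horizon,
# where angular speed is lowest.
d_world = rng.uniform(d_min, d_max, n)
mean_world = angular_speed(d_world).mean()

# Features uniform in RETINAL coordinates: depression angles drawn
# uniformly, then converted back to ground distance d = h / tan(theta).
theta = rng.uniform(np.arctan(h / d_max), np.arctan(h / d_min), n)
d_retina = h / np.tan(theta)
mean_retina = angular_speed(d_retina).mean()

print(f"mean speed, world-uniform features:   {mean_world:.4f} rad/s")
print(f"mean speed, retinal-uniform features: {mean_retina:.4f} rad/s")
```

Under these assumed parameters, the world-uniform average is roughly an order of magnitude slower than the retinal-uniform average, consistent with the biased motion signal described above.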

Acknowledgements: This work was partially supported by Alcon.