Multisensory Exploration of Visual Motion Perception

Poster Presentation 26.477: Saturday, May 16, 2026, 2:45 – 6:45 pm, Pavilion
Session: Multisensory Processing: Recalibration, temporal

Danica Barron1, Ralph Hale2, Troy Smith3; 1University of North Georgia

Smooth pursuit eye movements are optimized for rigid motion, in which spatial relationships remain stable. Non-rigid motion, which deforms continuously, impairs pursuit by reducing gain and increasing positional errors. Koerfer et al. (2024) found that non-rigid motion cannot be tracked with smooth pursuit. Preliminary evidence from Barron et al. (2025) suggests that auditory cues may reduce corrective saccade errors during non-rigid motion. Although pursuit is typically considered a predominantly visual process, multisensory theories suggest that redundant auditory or tactile cues could enhance target localization when visual coherence is degraded. We tested whether synchronized auditory and tactile cues influence pursuit during non-rigid motion tracking. Participants tracked a vortex-style target across five conditions: rigid visual motion (baseline), non-rigid visual motion alone, and non-rigid motion paired with spatially mapped auditory cues, vibrotactile feedback, or combined audio-tactile cues. Primary measures included pursuit gain, absolute pursuit speed, and post-saccadic positional error. We predicted that non-rigid motion would reliably impair pursuit relative to the rigid baseline, and that multisensory cues would modulate tracking behavior. Auditory cues may provide supplementary velocity information through continuous modulation, whereas tactile feedback may enhance body-centered localization. The combined condition tests whether multisensory redundancy yields additive improvement or interference. Improvements in gain or reductions in corrective saccades would support a role for cross-modal influences on oculomotor tracking, whereas persistent deficits would reinforce visual dominance in oculomotor control. This study evaluates whether the traditional visual-centric characterization of smooth pursuit holds under conditions of unstable motion structure, clarifying how multisensory signals shape tracking behavior.
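Pursuit gain, the study's primary measure, is conventionally defined as eye velocity divided by target velocity over saccade-free samples. A minimal sketch of that computation, using hypothetical velocity traces (the numbers below are illustrative, not the study's data):

```python
import numpy as np

def pursuit_gain(eye_velocity, target_velocity):
    """Conventional pursuit gain: mean eye velocity divided by
    mean target velocity, computed over saccade-free samples."""
    return np.mean(eye_velocity) / np.mean(target_velocity)

# Hypothetical traces in deg/s (not real data):
target = np.full(100, 10.0)       # target moving at 10 deg/s
eye_rigid = np.full(100, 9.5)     # near-veridical pursuit of rigid motion
eye_nonrigid = np.full(100, 6.0)  # reduced gain under non-rigid deformation

print(pursuit_gain(eye_rigid, target))     # ~0.95
print(pursuit_gain(eye_nonrigid, target))  # ~0.60
```

A gain near 1.0 indicates that the eye matches target velocity; the reduced-gain pattern sketched here is the kind of deficit the rigid-versus-non-rigid contrast is designed to detect.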