Humans optimally integrate vision and proprioception during continuous movement

Poster Presentation 53.438: Tuesday, May 19, 2026, 8:30 am – 12:30 pm, Pavilion
Session: Action: Pointing, tracking

Jonathan Tsay1, Pam Villavicencio2, Mabel Ziman1, Dominik Straub3, Matthias Will4, Cris de la Malla2; 1Carnegie Mellon University, 2University of Barcelona, 3University of Cambridge, 4University of California Berkeley

Imagine swatting an annoying fly—tracking its erratic visual motion while continuously sensing your arm’s movement through proprioception. Success in such situations hinges on effective multisensory integration of vision and proprioception. Decades of work using static position judgments show that these cues are combined in a statistically optimal manner. However, whether this optimality principle holds during dynamic, continuous movements remains untested. To address this gap, we employed a continuous psychophysics paradigm in which participants used their left hand to track, for 20 seconds along a constrained forward–backward axis, either a visual cursor moving randomly (Brownian motion), a proprioceptive signal from a passively moved right hand controlled by a motorized manipulandum, or both signals concurrently. We manipulated visual uncertainty (clear vs. Gaussian-blurred cursor) and visuo-proprioceptive alignment (0-cm vs. 5-cm translational shift). Strikingly, the results closely matched the predictions of optimality in several ways. First, visual tracking degraded under high visual uncertainty. Second, visual tracking was worse than proprioceptive tracking, consistent with greater uncertainty in vision arising from slower neural transmission during movement. Third, when cues were aligned (0-cm shift), multisensory tracking exceeded both visual and proprioceptive unimodal tracking performance, demonstrating optimal cue integration for continuous movements. Fourth, when a visuo-proprioceptive discrepancy (5-cm shift) was introduced, multisensory tracking fell between the two cues but was systematically biased away from vision (the more uncertain cue) and toward proprioception (the less uncertain cue). Finally, after exposure to the discrepancy, vision shifted strongly toward proprioception, whereas proprioception showed minimal shift toward vision—revealing that recalibration, like integration, followed optimality principles.
Together, these behavioral findings, supported by preliminary Bayesian modeling, show that vision and proprioception are integrated and recalibrated optimally during continuous movement, extending multisensory optimality from static positional judgments to continuous real-world movements.
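The optimality predictions above follow from the standard reliability-weighted (minimum-variance) cue-combination rule. A minimal sketch with hypothetical numbers (not the authors' model or data), where a 5-cm visuo-proprioceptive offset and a more uncertain visual cue yield a combined estimate biased toward proprioception, with lower variance than either cue alone:

```python
import math

def integrate_cues(x_v, sigma_v, x_p, sigma_p):
    """Reliability-weighted combination of visual and proprioceptive cues.

    Weights are proportional to each cue's reliability (inverse variance);
    the combined estimate has lower variance than either cue alone.
    """
    r_v, r_p = 1 / sigma_v**2, 1 / sigma_p**2   # reliabilities
    w_v = r_v / (r_v + r_p)                      # visual weight
    x_hat = w_v * x_v + (1 - w_v) * x_p          # combined position estimate
    sigma_hat = math.sqrt(1 / (r_v + r_p))       # combined uncertainty
    return x_hat, sigma_hat

# Hypothetical values: cues offset by 5 cm, vision twice as uncertain.
x_hat, sigma_hat = integrate_cues(x_v=5.0, sigma_v=2.0, x_p=0.0, sigma_p=1.0)
# x_hat = 1.0 cm: biased away from vision, toward proprioception.
# sigma_hat ≈ 0.89 cm: below both unimodal uncertainties (1.0 and 2.0).
```

The same weighting predicts the recalibration asymmetry: the less reliable cue (vision) should shift more toward the more reliable cue (proprioception) than vice versa.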

Acknowledgements: CM is funded by grant PID2023-150883NB-I00 from the MCIN/AEI/10.13039/501100011033. PV is supported by grant PRE2021-097890 funded by MICIU/AEI/10.13039/501100011033 and by the FSE+.