Visuo-motor gain adaptation for visual stability during horizontal head translation with and without binocular disparity
Poster Presentation 16.309: Friday, May 15, 2026, 3:45 – 6:00 pm, Banyan Breezeway
Session: Multisensory Processing: Motor
Nobuyoshi Takase1, Hiroaki Shigemasu2, Michiteru Kitazaki1; 1Toyohashi University of Technology, 2Kochi University of Technology
Perceiving a stable visual world despite head movements requires accurate visuo-motor coordination. Kim et al. (Front. VR, 2021) showed that during horizontal linear head oscillations, increasing visuo-motor gain conflict makes the scene look more unstable. Kitazaki (i-Perception, 2013) demonstrated that visuo-motor adaptation of visual stability emerges after two minutes of yaw-axis rotation and is not strictly specific to retinal location. However, for linear head movements, it remains unclear how prolonged exposure alters the visuo-motor gain that observers perceive as yielding a stationary scene, and how other aspects of self-motion perception and posture respond to gain manipulation (Teng et al., Sci. Rep., 2025). The present study investigated how the visuo-motor gain for stable perception adapts to different gains, with or without binocular disparity. Twenty-seven participants viewed a virtual cloud of random spheres while their head movements were tracked with a 3D-tracker-embedded head-mounted display. The virtual viewpoint was updated according to the measured horizontal head translation with a gain of 0.5, 1.0, or 2.0. Each session consisted of pre-test, adaptation, and post-test trials. In the pre- and post-tests, participants adjusted the visuo-motor gain during left–right head movements so that the visual environment appeared stable. During the 5-min adaptation phase, they made left–right head movements (30 cm, 0.33 Hz) while viewing the spheres at a fixed gain (0.5, 1.0, or 2.0) either with or without binocular disparity. Each participant completed six sessions (3 gains × 2 disparity conditions) in random order. We found a significant main effect of gain (F(2, 52)=5.38, p=.008), but no main effect of disparity and no interaction. The visuo-motor gain that produced a stable scene in the post-test shifted toward the adaptation gain, both with and without binocular disparity.
These results suggest that visuo-motor gain adaptation for stable perception occurs for linear head movements and is largely independent of binocular disparity.
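The gain manipulation described above can be illustrated with a minimal sketch (not the authors' code; the function name and reference-pose parameter are hypothetical): the virtual camera's lateral position is the tracked head displacement scaled by the visuo-motor gain, so a gain of 1.0 reproduces natural self-motion while 0.5 and 2.0 attenuate or exaggerate it.

```python
def update_viewpoint(head_x_m: float, reference_x_m: float, gain: float) -> float:
    """Return the virtual camera's lateral position (metres).

    Hypothetical helper: the head's displacement from a reference pose
    is scaled by `gain`, so gain=2.0 renders twice the physical
    translation and gain=0.5 renders half of it.
    """
    return reference_x_m + gain * (head_x_m - reference_x_m)

# Example: head moved 0.15 m to the right of the reference position.
print(update_viewpoint(0.15, 0.0, 2.0))  # exaggerated self-motion
print(update_viewpoint(0.15, 0.0, 0.5))  # attenuated self-motion
```

In the stability-adjustment trials, participants would effectively tune `gain` until the rendered scene appeared stationary during their own head oscillation.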
Acknowledgements: Supported by JSPS KAKENHI Grant Number JP22KK0158 and the Murata Science and Education Foundation