A Novel VR-EEG Framework for Studying Adaptive Processes in Natural Vision

Poster Presentation 36.334: Sunday, May 17, 2026, 2:45 – 6:45 pm, Banyan Breezeway
Session: Spatial Vision: Natural images, texture

Nawal Yahiaoui1, Peter Neri1,2; 1École Normale Supérieure, PSL University & CNRS, Paris, France; 2Italian Institute of Technology, Erzelli campus, Genoa, Italy

The visual system maintains stable perceptual representations despite constant changes in sensory input produced by self-motion and environmental dynamics. In humans, perceptual stability has been proposed to arise from adaptive mechanisms that flexibly encode sequences of sensory observations rather than relying purely on spatial maps. Yet how these mechanisms unfold over time and interact with behavior to support persistent visual representations remains unclear. To address this question, we developed a novel experimental framework combining Virtual Reality (VR) with frequency-tagged wireless EEG to examine visual invariance during naturalistic exploration at both the behavioral and neural levels. Participants navigated a controlled, achromatic 3D environment in which objects differed only in position, orientation, and size. Motor and vestibular cues were minimized by restricting physical movement. Exploration occurred either actively, through self-initiated teleportation, or passively, by following replays of each participant's own exploration trajectory; this design ensured that the two interaction modes provided nearly identical visual stimulation. Following exploration, participants were teleported to a 2D bird's-eye view of the scene and completed a memory-based decision task, identifying the object to which a transformation had been applied. Our framework enabled robust decoding of stimulus-specific information via frequency tagging, despite the complexity and noise inherent to VR. Model-based decoding reliably predicted stimulus identity from EEG, and prediction accuracy differed between active and passive exploration. This difference was mirrored in behavioral performance, demonstrating that exploration mode modulates the encoding of scene features and the formation of invariant representations.
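To illustrate the frequency-tagging logic behind the decoding, the sketch below assigns each object a distinct flicker frequency and predicts stimulus identity from the spectral power of a single-channel EEG epoch at the candidate tagging frequencies. This is a minimal sketch under assumed parameters (sampling rate, tagging frequencies, single synthetic channel), not the authors' actual pipeline or model-based decoder.

```python
# Minimal sketch of frequency-tagged (SSVEP-style) stimulus decoding.
# All names, frequencies, and parameters are illustrative assumptions,
# not the pipeline used in the study.
import numpy as np

FS = 500.0                     # sampling rate in Hz (assumed)
TAG_FREQS = [6.0, 7.5, 10.0]   # hypothetical per-object tagging frequencies

def decode_tagged_stimulus(eeg, fs=FS, tag_freqs=TAG_FREQS):
    """Predict which tagged stimulus dominates a single-channel epoch
    by comparing spectral power at the candidate tagging frequencies."""
    n = len(eeg)
    # Hann window reduces spectral leakage before the FFT
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # power at the frequency bin nearest each tagging frequency
    power = [spectrum[np.argmin(np.abs(freqs - f))] for f in tag_freqs]
    return int(np.argmax(power))  # index of the predicted stimulus

# Synthetic demo: a 2 s epoch dominated by the 7.5 Hz tag plus noise
rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1.0 / FS)
epoch = np.sin(2 * np.pi * 7.5 * t) + 0.5 * rng.standard_normal(t.size)
print(decode_tagged_stimulus(epoch))  # → 1 (the 7.5 Hz stimulus)
```

In a real recording the same idea is typically applied per channel (or after spatial filtering) and compared against chance via permutation tests; the point here is only that each tagged stimulus leaves a narrowband spectral signature that survives broadband noise.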
These effects could not be attributed to differences in memory performance or attentional engagement, suggesting that exploration mode modulates perceptual encoding rather than post-perceptual processes. Together, these findings highlight the potential of combining VR and EEG to explore the adaptive mechanisms underlying visual invariance and perceptual persistence in naturalistic conditions.