Predictive scene memory enhances peripheral color awareness during active vision
Poster Presentation 53.336: Tuesday, May 19, 2026, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Scene Perception: Virtual reality
Ben Chamberlain Zivsak1, Anna Mynick1, Caroline Robertson1, Michael Cohen2; 1Dartmouth College, 2Amherst College
During active real-world viewing, observers frequently fail to notice when color is removed from most of their visual periphery, revealing profound limits on perceptual awareness (Cohen et al., 2020). Yet our perceptual experience feels richly saturated across the visual field. What explains this contradiction? Here, we ask whether memory for familiar environments supplies the color that perception itself fails to deliver (Cohen et al., 2024), building on recent work showing that scene memory selectively speeds judgments about expected adjacent views during planned head movements (Mynick et al., 2025). Participants (N=12) studied five immersive real-world scenes presented in head-mounted virtual reality (VR). In a subsequent task, before executing a head turn to a target scene viewpoint, participants viewed either a 45° prime from a studied scene that was spatially adjacent to the upcoming target view (valid prime condition, N=6) or a grey neutral prime carrying no relevant visual information (neutral prime condition, N=6). Critically, on the final trial, the target image was completely desaturated except for a central 25 DVA circular aperture. Participants were then immediately asked a series of questions to assess their awareness of the absence of peripheral color. Participants who received familiar scene primes were significantly more likely to report noticing the loss of peripheral color (33% of trials) than participants in the neutral prime group (0%), despite identical visual input on the target trials. These results suggest that visuospatial memory for familiar environments may support the subjective impression of a richly colored visual world during active scene viewing.
Taken together, these findings suggest that visual awareness in naturalistic settings is actively constructed from predictive memory representations of the surrounding environment (Mynick et al., 2025), shaping both what we expect to see and what we become aware of when those expectations fail.