Neural interpolation of dynamic visual information in natural scenes

Poster Presentation 26.466: Saturday, May 18, 2024, 2:45 – 6:45 pm, Pavilion
Session: Scene Perception: Neural mechanisms


Lu-Chun Yeh1, Max Bardelang1, Daniel Kaiser1,2; 1Mathematical Institute, Department of Mathematics and Computer Science, Physics, Geography, Justus Liebig University Gießen, 2Center for Mind, Brain and Behavior (CMBB), Philipps University Marburg and Justus Liebig University Gießen

Adaptive natural vision requires our brain to interpolate missing information about occluded objects in the environment. Previous studies suggest that this process is supported by visual cortex "filling in" occluded parts of scene images. However, we live in a dynamic world, and objects keep moving in and out of occlusion (e.g., when trains move through tunnels). Here, we used multivariate pattern analysis on time-frequency-resolved EEG data to track neural representations during dynamic occlusion. Participants watched 4-second videos of a person walking across a scene (either left-to-right or right-to-left) while performing an unrelated fixation task. The videos featured three conditions: The person walking across a blank background (isolated condition), across the scene without occlusion (visible condition), or across the scene while being dynamically occluded between 1.5 and 3 seconds (occluded condition). We trained linear classifiers on EEG response patterns to discriminate rightward- and leftward-walking in the isolated condition and tested them on the visible and occluded conditions. Classifiers trained on time-locked broadband responses, as well as on alpha (8–12 Hz) and beta (13–30 Hz) rhythms, successfully discriminated walking direction in the visible condition. However, only classifiers trained on alpha rhythms could discriminate walking direction in the occluded condition. Critically, we introduced an additional condition during which the person stopped in front of a natural obstacle (e.g., a river). We found that alpha dynamics tracked the termination of motion in this condition, even when it was hidden by the occluder. Together, our results provide evidence for an automatic interpolation of information during dynamic occlusion. The alpha dynamics that mediate this interpolation may constitute a neural correlate of top-down processes that "fill in" missing information based on context.
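The cross-condition decoding logic described above (train a linear classifier on the isolated condition, test on the occluded condition) can be sketched as follows. This is a minimal illustration with synthetic data, not the authors' pipeline: the channel counts, effect sizes, and the use of scikit-learn's `LogisticRegression` are all assumptions for demonstration.

```python
# Illustrative sketch of cross-condition decoding of walking direction
# from band-limited EEG power patterns. All data here are synthetic;
# this is NOT the authors' analysis code.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_trials, n_channels = 100, 64  # hypothetical trial/channel counts

def simulate_condition(direction_pattern):
    """Simulate alpha-power patterns for leftward (0) vs. rightward (1) trials."""
    labels = rng.integers(0, 2, n_trials)
    # Each trial carries a direction-specific spatial pattern plus noise.
    signal = np.outer(labels * 2 - 1, direction_pattern)
    X = signal + rng.normal(scale=1.0, size=(n_trials, n_channels))
    return X, labels

# Shared spatial pattern; assume a weaker version survives during occlusion.
pattern = rng.normal(size=n_channels)
X_isolated, y_isolated = simulate_condition(pattern)
X_occluded, y_occluded = simulate_condition(0.5 * pattern)

# Train on the isolated condition, test on the occluded condition.
clf = LogisticRegression().fit(X_isolated, y_isolated)
acc = accuracy_score(y_occluded, clf.predict(X_occluded))
print(f"cross-condition decoding accuracy: {acc:.2f}")
```

In the actual study this scheme would be applied in a time-resolved fashion (one classifier per time point or time-frequency bin), so that above-chance generalization during the occlusion window indicates interpolated direction information.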