Representations of imagined scenes in the alpha band

Poster Presentation 26.346: Saturday, May 18, 2024, 2:45 – 6:45 pm, Banyan Breezeway
Session: Visual Memory: Working memory and neural mechanisms

Rico Stecher1, Daniel Kaiser1,2; 1Mathematical Institute, Justus-Liebig-University Gießen, 2Center for Mind, Brain and Behavior (CMBB), Philipps-University Marburg and Justus-Liebig-University Gießen

Our conscious experience is enriched by our brain's capacity to visualize a myriad of different worlds. How does the brain create such complex mental images? Previous research suggests that the brain generates mental images of individual objects by recalling their visual contents via top-down alpha rhythms. Building on this notion, we investigated whether more complex visual contents, such as natural scenes, are also encoded in the alpha band during visual imagery. In our first EEG experiment, participants imagined 16 natural scenes according to detailed three-sentence descriptions and viewed images of the scenes in a separate task. Using multivariate decoding on neural rhythms, we show that imagined scenes and their properties are represented in cortical alpha activity and that these representations are partly shared with late scene perception. In a second EEG experiment, we aimed to further characterize these alpha-band scene representations in individual participants. Here, we tested a few participants extensively, having them imagine 16 natural scenes according to short prompts across a total of 10 recording sessions. We then used a latent text-to-image diffusion model to synthesize scene images from the same prompts that were shown to our participants. By comparing layer activations of a scene-trained DNN in response to these AI-generated images with neural scene representations in the alpha band, we show that the contents encoded in the imagery-related alpha dynamics of individual participants can be approximated by images "dreamt up" by generative text-to-image models. Overall, our results indicate that the brain creates mental images of complex natural environments by recalling scene-related visual contents via alpha rhythms.
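The multivariate decoding of imagined scenes from alpha activity can be illustrated with a minimal sketch. Everything here is an illustrative assumption rather than the study's actual pipeline: the data are simulated (16 scenes, 64 channels, 20 trials each, standing in for alpha-band power patterns), and the classifier is a simple correlation-to-class-mean decoder, a common baseline in EEG decoding work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 16 scenes, 64 EEG channels, 20 trials per scene.
n_scenes, n_channels, n_trials = 16, 64, 20

# Simulated alpha-band (8-12 Hz) power patterns: each scene has a
# characteristic spatial pattern plus trial-to-trial noise.
patterns = rng.normal(size=(n_scenes, n_channels))
data = patterns[:, None, :] + 0.8 * rng.normal(size=(n_scenes, n_trials, n_channels))

# Split trials into train/test halves; decode each test trial by
# correlating it with the scene-wise mean training pattern.
train, test = data[:, :10], data[:, 10:]
train_means = train.mean(axis=1)  # shape (16, 64)

def zscore(x):
    # Standardize each pattern across channels so the dot product
    # below equals the Pearson correlation.
    return (x - x.mean(-1, keepdims=True)) / x.std(-1, keepdims=True)

corr = zscore(test.reshape(-1, n_channels)) @ zscore(train_means).T / n_channels
pred = corr.argmax(axis=1)                  # predicted scene per test trial
true = np.repeat(np.arange(n_scenes), 10)   # ground-truth scene labels
accuracy = (pred == true).mean()
print(f"decoding accuracy: {accuracy:.2f} (chance = {1/16:.3f})")
```

Above-chance accuracy in such an analysis (on real data, typically cross-validated and run per time point or frequency band) is what licenses the claim that scene identity is represented in the measured signal.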

Acknowledgements: D.K. is supported by the Deutsche Forschungsgemeinschaft (DFG; SFB/TRR 135, project no. 222641018), an ERC Starting Grant (ERC-2022-STG 101076057), as well as "The Adaptive Mind", funded by the Hessian Ministry of Higher Education, Science, Research and Art.