Dissociable Decoding of Predictive Sensory Processing from EEG Oscillations

Poster Presentation 23.355: Saturday, May 18, 2024, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Multisensory Processing: Neural coding

Soukhin Das1,2, Dr. Mingzhou Ding3, Dr. George (Ron) Mangun1,2,4; 1Center for Mind and Brain, 2Department of Psychology, University of California, Davis, 3Pruitt Family Department of Biomedical Engineering, University of Florida, 4Department of Neurology, University of California, Davis

Studies have established that attention operates across diverse sensory modalities, such as vision and audition, and plays a pivotal role in our ability to integrate and process multisensory information. Despite this, the neural mechanisms that underlie cross-modal attention remain largely elusive. In this investigation, we used electroencephalography (EEG) to probe the neural basis of cross-modal attention with a 2x2 cue-target design. Auditory cues (HEAR or SEE) or visual cues (H or S) indicated the modality (visual or auditory) of the to-be-attended target. After a random delay, auditory tones or visual gratings were presented as targets in the cued modality on 80% of trials; on the remaining 20% of trials, targets appeared in the uncued modality (invalid trials). Participants (n=32) were instructed to discriminate the spatial frequency (wide versus narrow) of the visual gratings or the pitch (high versus low) of the auditory tones on all trials, regardless of cue validity. Decoding alpha power with a support vector machine (SVM) revealed distinct patterns at early and late latencies during the cue-to-target period, with alpha oscillations exhibiting distinct cortical topographies depending on the to-be-attended modality. We found robust decoding of the to-be-attended modality over the corresponding sensory regions: central electrodes for audition and parieto-occipital electrodes for vision. Temporal generalization further illustrated how these alpha patterns evolved over time. For both modalities, the results indicated sustained, serially evolving representations of the attended sensory modality across the cortical hierarchy, consistent with the maintenance of predictive processing. Furthermore, an alignment between cortical alpha patterns during stimulus processing and the response window suggests a link between prediction signals and decision-making. These findings advance our understanding of the role of alpha oscillations in cross-modal attentional control and extend the current framework for decoding the neural mechanisms of cross-modal attention.
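
As a methods illustration only (not the authors' analysis code), the time-resolved SVM decoding and temporal-generalization analyses described above can be sketched with MNE-Python and scikit-learn. The sketch assumes epoched EEG in an mne.Epochs object named epochs, with placeholder condition labels attend_visual and attend_auditory; electrode selections, band limits, and labels are illustrative assumptions.

# Illustrative sketch (assumptions noted above): decode the to-be-attended
# modality from single-trial alpha-band power during the cue-to-target period.
import numpy as np
import mne
from mne.decoding import SlidingEstimator, GeneralizingEstimator, cross_val_multiscore
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Single-trial alpha (8-12 Hz) power via band-pass filtering + Hilbert envelope.
epochs_alpha = epochs.copy().load_data().filter(8.0, 12.0, fir_design="firwin")
epochs_alpha.apply_hilbert(envelope=True)

X = epochs_alpha.get_data()  # shape: trials x channels x time points
# 1 = attend visual, 0 = attend auditory (event labels are placeholders).
y = (epochs_alpha.events[:, 2] == epochs_alpha.event_id["attend_visual"]).astype(int)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))

# Time-resolved decoding: one classifier per time point, 5-fold cross-validation.
time_decod = SlidingEstimator(clf, scoring="roc_auc", n_jobs=-1)
scores = cross_val_multiscore(time_decod, X, y, cv=5, n_jobs=-1).mean(axis=0)

# Temporal generalization: train at each time point, test at every other time point.
gen = GeneralizingEstimator(clf, scoring="roc_auc", n_jobs=-1)
gen_scores = cross_val_multiscore(gen, X, y, cv=5, n_jobs=-1).mean(axis=0)

In a sketch of this kind, the SlidingEstimator yields the decoding time course (early versus late cue-period latencies), while the GeneralizingEstimator produces the train-time by test-time matrix used to characterize how alpha-band patterns evolve and generalize over the cue-to-target interval.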

Acknowledgements: NIMH117991