Multimodal metaperception: insights from multisensory integration

Poster Presentation: Saturday, May 18, 2024, 8:30 am – 12:30 pm, Pavilion
Session: Decision Making: Perceptual decision making 1

Nicola Domenici1, Pascal Mamassian1; 1École Normale Supérieure, PSL University, CNRS, Paris, France

Trusting our perception is crucial for interacting with the external world. Although most research on the topic has focused on a single sense at a time, in daily life the brain is bombarded by different sensory stimulations that are often fused into unitary percepts. To date, however, it is still unclear how the brain evaluates the trustworthiness of multimodal representations. To investigate this, we developed an ensemble of visual, auditory, and audiovisual temporal bisection tasks. Stimuli consisted of three sequential events, and participants had to report whether the second event was closer in time to the first or to the last. Stimuli could be purely visual (a 100 ms flash), purely auditory (a 100 ms pink-noise burst), or a combination of the two. When both sensory cues were presented, they were either synchronous or separated by a 100 ms asynchrony. Introducing such bimodal stimulations was essential to increase sensitivity and to create consistent perceptual biases. Four stimulus difficulties were chosen, leading to probabilities of 0.15, 0.35, 0.65, and 0.85 of responding ‘closer to the last event’. These four difficulty levels were then embedded in a confidence-forced-choice design: 15 participants performed two consecutive perceptual decisions and reported which one they felt was more likely to be correct. Notably, participants were instructed to base their confidence on visual information only, which made it possible to investigate how bimodal conflicts were mirrored at the confidence level. To better capture the dynamics of multimodal metaperception, we then used a confidence generative model to compare different predictions. Our results indicated that confidence evidence was generated from the individual unisensory cues and subsequently combined according to their sensory reliability. Surprisingly, participants computed confidence from their unimodal representations even after fusing them into an integrated percept. This suggests that confidence does not necessarily develop after multisensory integration, but retains access to the original unisensory evidence.
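The model comparison described above can be sketched in simulation. The snippet below is a minimal illustration, not the authors' implementation: the noise levels, the reliability-weighted fusion rule, and the confidence read-outs are all assumed for the sake of the example. It contrasts a model that reads confidence from the fused percept with one that builds confidence from the individual unisensory cues weighted by their reliabilities, inside a confidence-forced-choice trial pair.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical internal-noise parameters (not taken from the abstract).
SIGMA_V, SIGMA_A = 1.0, 1.5
# Reliability weight for vision under standard optimal cue combination.
W_V = SIGMA_A**2 / (SIGMA_V**2 + SIGMA_A**2)

def trial(offset):
    """Simulate one bisection trial; `offset` is the true temporal offset of
    the middle event (positive = closer to the last event).
    Returns (fused percept, decision, conf_fused, conf_unisensory)."""
    ev_v = offset + rng.normal(0, SIGMA_V)   # noisy visual evidence
    ev_a = offset + rng.normal(0, SIGMA_A)   # noisy auditory evidence
    fused = W_V * ev_v + (1 - W_V) * ev_a    # reliability-weighted integration
    decision = fused > 0                     # respond 'closer to last event'?
    conf_fused = abs(fused)                  # model 1: confidence from fused percept
    # Model 2: confidence built from unisensory cues, scaled by reliability.
    conf_unisensory = W_V * abs(ev_v) + (1 - W_V) * abs(ev_a)
    return fused, decision, conf_fused, conf_unisensory

def confidence_forced_choice(offset1, offset2, model):
    """Run two trials and report which one the observer is more confident in."""
    t1, t2 = trial(offset1), trial(offset2)
    idx = 2 if model == "fused" else 3
    return 1 if t1[idx] >= t2[idx] else 2

# Sanity check: an easy trial (large offset) should attract the confidence
# choice more often than a hard one, under either model.
choices = [confidence_forced_choice(2.0, 0.1, "unisensory") for _ in range(2000)]
print(sum(c == 1 for c in choices) / len(choices))  # proportion choosing the easy trial
```

Distinguishing the two models empirically requires conditions where they diverge, such as the asymmetric bimodal conflicts used in the study, since with congruent cues both read-outs rank trials similarly.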

Acknowledgements: ANR grant no. ANR-18-CE28-0015