Perceptual Confidence for Auditory-Induced Visual Position Biases

Poster Presentation 26.473: Saturday, May 16, 2026, 2:45 – 6:45 pm, Pavilion
Session: Multisensory Processing: Recalibration, temporal

Alejandro Lorca Vyhmeister¹, Pascal Mamassian¹·²; ¹École Normale Supérieure Paris, ²Centre national de la recherche scientifique (CNRS)

To efficiently interact with the world, we constantly integrate different sensory sources and engage in metaperceptual processes to judge the validity of the resulting multisensory percepts. However, most research in metaperception has focused on unisensory stimuli. Here, we are interested in the metaperception of multisensory percepts. We designed an audiovisual paradigm that biased the perceived position of a visual flash by manipulating the timing of an auditory tone. Visual stimuli were three Gaussian blobs presented successively at three vertically aligned positions. Auditory stimuli were pure tones. All flashes and tones lasted 33 ms. The first and third visual and auditory stimuli were synchronized and separated by 467 ms. The second flash was presented at one of five possible positions near the spatial midpoint, and always at the temporal midpoint (233 ms after trial onset). The second tone could start early, on time, or late relative to the second flash (at 167, 233, or 300 ms after trial onset). A no-sound condition was added as a baseline. Participants reported the perceived position of the second flash with the computer mouse. After this perceptual decision, they judged how confident they were in the accuracy of their perceptual report on a continuous scale. As expected, reported positions of the second flash followed physical positions, with some regression to the mean of the five possible positions, and additional biases when the tone was presented asynchronously. Importantly, early tones created a spatial bias towards the first flash, and late tones the reverse. For each flash position condition, confidence judgments were higher for perceptual reports closer to the median position for that condition, thereby revealing metacognitive sensitivity. Finally, no differences in metacognitive sensitivity were observed across sound conditions. These results suggest that, in our experiment, participants cannot detect visual spatial biases induced by a sound.
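The stimulus schedule described above can be sketched as follows. This is a minimal illustration of the trial timing only, not the authors' experimental code; all names and the dictionary layout are our own, and only the millisecond values come from the abstract.

```python
# Illustrative sketch of the audiovisual trial timing (all values in ms).
# Assumed structure, not the authors' implementation.

FLASH_DURATION = 33           # each flash and tone lasts 33 ms
FLASH_ONSETS = (0, 233, 467)  # three flashes; the second at the temporal midpoint

# Onset of the second tone in each sound condition; the first and third
# tones are always synchronized with the first and third flashes.
SECOND_TONE_ONSET = {
    "early":    167,
    "on_time":  233,
    "late":     300,
    "no_sound": None,  # baseline: no tones presented
}

def tone_onsets(condition):
    """Return the tone onset times (ms) for a given sound condition."""
    second = SECOND_TONE_ONSET[condition]
    if second is None:
        return ()
    return (FLASH_ONSETS[0], second, FLASH_ONSETS[2])
```

For example, `tone_onsets("early")` yields tones at 0, 167, and 467 ms, so the second tone leads the second flash by 66 ms, while `tone_onsets("no_sound")` yields no tones at all.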

Acknowledgements: MSCA Doctoral Network "CODE" (grant number: 101119647)