Semantic relationships between sounds and images modulate attention even when the stimuli are task-irrelevant

Poster Presentation 33.419: Sunday, May 19, 2024, 8:30 am – 12:30 pm, Pavilion
Session: Attention: Features, objects 1

Kira Wegner-Clemens, Dwight J. Kravitz, Sarah Shomstein; George Washington University

Semantic information plays an important but poorly understood role in guiding attention in naturalistic scenes. Semantic relationships among visual objects have been shown to modulate attentional priority, even in tasks where object identity is irrelevant. In an audiovisual context, semantically related sounds can speed search for visual targets, with the benefit scaling with the degree of semantic relatedness. However, prior research has focused almost exclusively on targets defined by their identity, meaning that the visual semantic information was task-relevant. It is therefore unclear whether crossmodal semantic relationships influence attention only when they are task-relevant, or whether those relationships play a more general role in attentional selection. In the present study, we investigated whether an audiovisual semantic benefit exists when both the image's and the sound's semantic information are task-irrelevant. Participants were presented with two images and a sound, then with two Gabor patches at the image locations, and were asked to report whether the target Gabor was tilted clockwise or counterclockwise. On valid trials, when the sound matched the image at the location where the target Gabor subsequently appeared, participants responded significantly faster than on invalid trials, when the sound matched the image at the location where a distractor appeared. The size of this validity benefit was modulated by the degree of semantic relatedness between the sound and the other, unmatched image. In a mixed-effects model with relatedness and validity as fixed effects and subject, target location, and rotation direction as random effects, there was a significant interaction between semantic relatedness and validity: stronger semantic relatedness between the unmatched image and the sound resulted in a smaller validity effect.
These results show that crossmodal semantic relationships guide attention even when task-irrelevant, suggesting that semantic relatedness plays a general role in guiding attentional selection.
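The mixed-effects analysis described above can be sketched roughly as follows. This is an illustrative reconstruction only, using statsmodels on synthetic data; the authors' actual software, variable names, and data are not stated in the abstract, so every column name and coefficient below is an assumption. Note that statsmodels does not support fully crossed random effects, so target location and rotation direction are approximated here as variance components within subject.

```python
# Hypothetical sketch of the reported mixed-effects model (assumed
# column names; synthetic data stands in for the real trial data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, n_trials = 20, 40
n = n_subjects * n_trials
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), n_trials),
    "validity": rng.choice(["valid", "invalid"], n),
    # semantic relatedness between the sound and the unmatched image
    "relatedness": rng.uniform(0, 1, n),
    "target_loc": rng.choice(["left", "right"], n),
    "rotation": rng.choice(["cw", "ccw"], n),
})
# Simulated reaction times: an invalid-trial cost that shrinks as the
# sound's relatedness to the unmatched image grows (the interaction).
df["rt"] = (600
            + 30 * (df["validity"] == "invalid")
            - 25 * df["relatedness"] * (df["validity"] == "invalid")
            + rng.normal(0, 40, n))

# Fixed effects: relatedness, validity, and their interaction.
# Random effects: subject intercepts, with target location and
# rotation direction as variance components within subject.
model = smf.mixedlm(
    "rt ~ relatedness * validity",
    data=df,
    groups="subject",
    vc_formula={"target_loc": "0 + C(target_loc)",
                "rotation": "0 + C(rotation)"},
)
result = model.fit()
print(result.summary())
```

The key term in the output is the relatedness × validity interaction coefficient, which in the reported study was significant.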

Acknowledgements: NIH F31EY034030 to KWC; NSF BCS 1921415 to SSS; NSF BCS 2022572 to SSS & DJK