Musically induced microvalences in high-level visual processing of everyday scenes

Poster Presentation 53.344: Tuesday, May 23, 2023, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Scene Perception: Categorization, memory, cognition

Elizabeth Galbo¹, Nathan Lincoln-DeCusatis¹, Elissa M. Aminoff¹; ¹Fordham University

Our understanding of a scene, although typically described only within the visual domain, can be influenced by other modalities. Here, we examine the link between visual and auditory cognitive processing, or cross-modal processing, using an affective priming paradigm. Although current theories do not typically incorporate scene affect into models of scene understanding, we explore how the affect associated with a scene can be modulated by music. In the current experiment, participants (N = 39) rated both how much they enjoyed musical excerpts and how much they enjoyed images of everyday, neutral scenes. A novel musical stimulus dataset was created for the present study to ensure that the musical examples carried no semantic associations, so that any observed effects could be attributed to their affective influence. This dataset comprised sixty-four original miniature piano compositions whose musical features were controlled along six binary dimensions. Participants listened to a brief musical excerpt, reported an affect rating ranging from "really dislike" to "really like," and then viewed and rated a neutral scene (e.g., a dining room). Scene affect ratings differed significantly depending on whether participants had just heard music they liked or disliked, at both the participant level and the individual-scene level. These results imply that auditory processing plays a role in scene understanding. Not only were participants' scene ratings modulated by the affect of the musical stimuli, but the same scene was rated more positively or negatively depending on the affect of the preceding musical example. Cross-modal processing occurs between music and scene perception, and our results demonstrate how one can affect the perception of the other.