Dyad arrangement affects perceived emotional intensity

Poster Presentation 56.465: Tuesday, May 23, 2023, 2:45 – 6:45 pm, Pavilion
Session: Face Perception: Social cognition

Katie L.H. Gray1, Zoe St Louis-King1, Richard Cook2, Mahsa Barzy1; 1University of Reading, 2Birkbeck, University of London

Social perception research has tended to focus on the visual perception of individual faces and bodies; however, there is growing interest in the visual perception of social interactions viewed from a third-person perspective. Previous research has found that the facial and bodily expression of one interactant can influence the perceived expression of another when the two are presented face-to-face, but not when they are presented back-to-back. This suggests that the arrangement of an interaction affects the perception of emotional expression. Whilst it is important to understand how the presence of one interactant influences the perception of another, this approach says little about how the interaction as a whole is perceived, which could be critical when deciding whether to approach or avoid groups of people encountered in everyday life. In the present study, we explored whether the arrangement of expressive dyads influenced their perceived emotional intensity. In an online study (N = 75), we presented emotionally expressive dyads in which both individuals expressed the same basic emotion (happy-happy, angry-angry, or neutral-neutral). Participants categorised the overall emotion expressed by the dyad and rated its emotional intensity. Emotion-categorisation accuracy and response times did not differ significantly between facing and non-facing dyads; however, participants perceived the expressions to be more intense when presented face-to-face than back-to-back. This finding has implications for how multiple people are visually perceived, and may help to inform how we make approach and avoidance decisions when we encounter groups of people in everyday life.

Acknowledgements: This research was funded by an award from the Leverhulme Trust to KG (RPG-2019-394). RC is supported by an award from the European Research Council (ERC-STG-715824).