Revealing the Mechanisms of Real-World Action Perception: Behavioral and EEG Evidence from Live vs. Video Actions

Poster Presentation 23.480: Saturday, May 16, 2026, 8:30 am – 12:30 pm, Pavilion
Session: Action: Miscellaneous

Elif Ahsen Çakmakci1, Sezan Oral2, Burcu A. Ürgen1,3; 1Bilkent University, Ankara, Turkey, 2University of Luxembourg, Esch-sur-Alzette, Luxembourg, 3Aysel Sabuncu Brain Research Center, Ankara, Turkey

Perceiving others’ actions is essential for survival, interaction, and communication. Yet, despite decades of research, we remain far from developing neurally inspired computational models that approach human performance in action perception. A major limitation is that most neuroscience studies rely on 2D images or videos, which lack the real-time presence and social affordances of actual actions. As a result, the fundamental mechanisms of real-world action perception remain poorly understood. Here, we investigate the behavioral and neural mechanisms underlying the perception of real (live) versus video-based actions. Using a novel experimental setup (Pekçetin et al., 2023), we conducted a two-session EEG study (N = 26) in which participants observed peripheral actions presented either live or via video while performing a central task under low vs. high attentional load. We analyzed the data using mass univariate ERPs, time-frequency analyses, and time-resolved representational similarity analysis (RSA). Behaviorally, central-task performance was consistently lower for Real than for Video stimuli, with the largest Real–Video difference emerging under high attentional load. ERP analyses revealed reliable differences between live and video conditions during the 150–450 ms window following action onset. Time-frequency analyses over occipital and parietal regions showed that video actions evoked substantially weaker alpha (8–12 Hz) and beta (15–25 Hz) suppression than live actions, indicating reduced perceptual engagement. Complementing these findings, time-resolved RSA robustly distinguished live from video actions within the 250–750 ms window. Taken together, our results demonstrate that live actions engage perceptual systems differently, and more powerfully, than their video-based counterparts. These findings underscore the limitations of screen-mediated paradigms and highlight the need for more ecologically grounded approaches in social and action perception research.

Acknowledgements: This study was funded by The Scientific and Technological Research Council of Turkey (TÜBİTAK), Project No. 122K915.