Neural Responses to Natural Versus AI-generated Affective Images
Poster Presentation 23.331: Saturday, May 16, 2026, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Scene Perception: Models, natural image statistics
Yujun Chen¹, Ruogu Fang¹, Andreas Keil², Mingzhou Ding¹; ¹Biomedical Engineering, University of Florida, ²Psychology, University of Florida
The International Affective Picture System (IAPS) contains 1,182 well-characterized photographs depicting natural scenes that vary in affective content. These pictures are used extensively in affective neuroscience to investigate the neural correlates of emotional processing. Recently, in an effort to augment this dataset, we have begun generating synthetic emotional images by combining IAPS pictures with diffusion-based AI models. The goal of this study is to compare the neural responses to IAPS pictures and matching AI-generated images. The stimulus set consisted of 60 IAPS pictures (20 pleasant, 20 neutral, 20 unpleasant) and 60 matching AI-generated images (20 pleasant, 20 neutral, 20 unpleasant). In each recording session, 30 IAPS pictures and 30 matching AI-generated images were presented in random order; each image was displayed for 3 seconds, with successive images separated by intervals of 2.8 to 3.5 seconds. Each experiment consisted of 10 recording sessions spread over two days. The fMRI data were recorded on a 3T Siemens Prisma scanner. Pupil responses to image presentation were monitored with an MRI-compatible EyeLink 1000 eye tracker. Our preliminary analysis of the fMRI data (N=28) showed that (1) both the IAPS pictures and the matching AI-generated images activated higher-order visual areas such as OTJ, as well as components of the anterior emotion network such as the insula, and (2) in retinotopic visual cortex, including V1, multivariate pattern analysis (MVPA) classifiers trained to decode emotional categories from voxel patterns evoked by IAPS pictures could decode emotional categories from voxel patterns evoked by AI-generated images, and vice versa, suggesting that the IAPS pictures and the matching AI-generated images evoked similar neural responses. Recruitment of additional participants is underway to confirm these findings.
Analyses are also being expanded to compare additional measures, such as functional connectivity and pupillometry, across the two image types.
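The cross-decoding logic described in finding (2) can be sketched as follows: train a classifier on voxel patterns evoked by one image set and test it on the other. This is a minimal illustrative sketch using synthetic data and a linear SVM; the abstract does not specify the classifier, the features, or the voxel counts, so all of those choices here are assumptions.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_per_class, n_voxels = 20, 500             # 20 images per emotion category (as in the study);
                                            # 500 voxels is an arbitrary illustrative choice
labels = np.repeat([0, 1, 2], n_per_class)  # pleasant / neutral / unpleasant

# Synthetic stand-ins for trial-wise voxel patterns from the two stimulus
# sets; a shared class-specific signal plays the role of a common neural code.
signal = rng.normal(size=(3, n_voxels))
X_iaps = signal[labels] + rng.normal(scale=2.0, size=(60, n_voxels))
X_ai   = signal[labels] + rng.normal(scale=2.0, size=(60, n_voxels))

def cross_decode(X_train, y_train, X_test, y_test):
    """Train a linear classifier on one stimulus set, score it on the other."""
    clf = make_pipeline(StandardScaler(), LinearSVC())
    clf.fit(X_train, y_train)
    return clf.score(X_test, y_test)

acc_iaps_to_ai = cross_decode(X_iaps, labels, X_ai, labels)
acc_ai_to_iaps = cross_decode(X_ai, labels, X_iaps, labels)
print(f"IAPS -> AI: {acc_iaps_to_ai:.2f}, AI -> IAPS: {acc_ai_to_iaps:.2f}")
```

Above-chance accuracy in both directions (chance = 1/3 for three categories) is the signature of a shared representational code, which is the inference the abstract draws for the IAPS and AI-generated images.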
Acknowledgements: MH125615, MH112558, and NSF grant BCS2318984