Pattern of associations across categories in visual recognition

Poster Presentation 56.449: Tuesday, May 21, 2024, 2:45 – 6:45 pm, Pavilion
Session: Object Recognition: Structure of categories

Laura Soen1, Céline Gillebert1, Hans Op de Beeck1; 1Brain and Cognition, Leuven Brain Institute, Faculty of Psychology & Educational Sciences, KU Leuven

Localized brain damage can result in relatively specific deficits in mental faculties, including deficits in visual recognition. Why certain deficits, and certain combinations of deficits, occur in visual recognition is still a matter of debate. Recent advances in neuroscience have revealed the complexity of the system that supports visual recognition: in addition to category-selective regions, studies have found distinct activation profiles for different categories based on multivoxel selectivity patterns. Our project investigates the interrelations between visual recognition of distinct object categories, faces, and words, focusing on the categories for which the most selectivity has been observed in neuroimaging studies. The objective is to explore potential patterns, that is, associations and dissociations, in visual recognition abilities. To this end, we developed a new test battery that measures visual recognition of 10 object categories: faces, words, bodies, hands, houses, tools, food, animals (cats & dogs), musical instruments, and cars. Notably, the test battery uses 3D stimuli, challenging participants to recognize objects in various orientations. In an initial study with 250 healthy participants, an exploratory factor analysis of the accuracies for the different object categories revealed two factors: one related to inanimate categories and another to animate categories. Moreover, accuracies within the inanimate categories correlated more strongly with one another, as did accuracies within the animate categories, than accuracies between animate and inanimate categories. This suggests that visual recognition abilities cluster according to the degree of animacy. In subsequent stages, our goal is to demonstrate how the performance associations on our test battery relate to neural overlap in the brain, representational overlap in Deep Neural Networks (DNNs), and brain lesions in posterior stroke patients.
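For illustration only, the following is a minimal Python sketch of the kind of analysis described above: a two-factor exploratory factor analysis on a participants-by-categories accuracy matrix, plus a comparison of within- versus between-animacy correlations. The data are simulated, and the assignment of categories to animate and inanimate groups is an assumption made for the example; this is not the authors' code or data.

```python
# Illustrative sketch (not the authors' analysis): two-factor EFA on a
# participants x categories accuracy matrix and a within- vs. between-animacy
# correlation comparison. All data below are simulated.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

categories = ["faces", "words", "bodies", "hands", "houses",
              "tools", "food", "animals", "instruments", "cars"]
animate = {"faces", "bodies", "hands", "animals"}  # assumed grouping for the example

# Simulate accuracies for 250 participants from two latent abilities
# (animate / inanimate recognition) plus noise, as a stand-in for real data.
n = 250
latent = rng.normal(size=(n, 2))
true_loadings = np.array([[0.7, 0.1] if c in animate else [0.1, 0.7]
                          for c in categories])
acc = latent @ true_loadings.T + 0.5 * rng.normal(size=(n, len(categories)))

# Exploratory factor analysis with two factors and varimax rotation.
z = (acc - acc.mean(axis=0)) / acc.std(axis=0)
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(z)
for cat, load in zip(categories, fa.components_.T):
    print(f"{cat:12s} loadings: {load.round(2)}")

# Compare mean correlations within vs. between animacy groups.
corr = np.corrcoef(acc, rowvar=False)
idx_a = [i for i, c in enumerate(categories) if c in animate]
idx_i = [i for i, c in enumerate(categories) if c not in animate]

def mean_offdiag(block):
    # Average of off-diagonal entries (all entries for rectangular blocks).
    if block.shape[0] == block.shape[1]:
        return block[~np.eye(block.shape[0], dtype=bool)].mean()
    return block.mean()

print("within animate:   ", round(mean_offdiag(corr[np.ix_(idx_a, idx_a)]), 2))
print("within inanimate: ", round(mean_offdiag(corr[np.ix_(idx_i, idx_i)]), 2))
print("between groups:   ", round(mean_offdiag(corr[np.ix_(idx_a, idx_i)]), 2))
```

On simulated data of this form, the factor loadings separate along the animate/inanimate split and the within-group mean correlations exceed the between-group mean, mirroring the pattern reported in the abstract.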

Acknowledgements: Research Foundation - Flanders