Object Affordances Through the Window of Verb Usage Patterns and Behavior

Poster Presentation 36.472: Sunday, May 19, 2024, 2:45 – 6:45 pm, Pavilion
Session: Action: Reach, grasp, track

Maryam Vaziri-Pashkam1, Ka-Chin Lam2, Natalia Pallis-Hassani2, Aida Mirebrahimi3, Aryan Zoroufi4, Francisco Pereira2, Chris Baker2; 1Department of Psychological and Brain Sciences, University of Delaware, 2National Institute of Mental Health, 3Carnegie Mellon University, 4Massachusetts Institute of Technology

When we see objects, we immediately know how to interact with them, yet little research has examined what information people glean from objects about the interactions they support. Here, we first used language as a means to tap into humans’ knowledge of which actions can be performed with an object. Using a large database of ~1,850 object categories (the THINGS database) and ~5,000 verbs, we identified instances in a large text corpus where each verb was applied to each object. We then used these data to embed each object in a space whose dimensions correspond to verbs that apply to similar objects. In behavioral experiments, we showed that these extracted embedding dimensions are meaningful to human observers. Next, to reveal people’s understanding of the potential actions toward objects, we conducted online behavioral experiments in which we presented images of individual objects from the THINGS database and asked participants which actions they associate with each object and which body parts they would use to interact with it. Many objects, both tools and non-tools, had strong action associations. Although the hand was the most commonly implicated body part, other body parts were also reported to be heavily involved in interacting with objects. Together, these results reveal strong object-action associations, evident both in text corpora and in people’s reports when viewing pictures of objects. They uncover the richness of object interactions and argue for moving beyond simple hand grasps, and beyond the specific category of tools, in future behavioral and neuroscientific experiments.
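
For illustration only: the abstract does not specify the exact embedding method, so the sketch below shows one plausible way verb-based object embeddings could be constructed from corpus-mined verb-object pairs, using a simple count matrix and a truncated SVD. The toy data, variable names, and the SVD choice are assumptions for demonstration, not the authors' actual pipeline.

```python
# Minimal sketch (assumed method, not the authors' pipeline): embed objects by
# the verbs that co-occur with them, using a toy set of (verb, object) pairs.
# In the study, such pairs were mined from a large text corpus over ~1,850
# THINGS object categories and ~5,000 verbs.
import numpy as np

# Hypothetical toy data standing in for corpus-mined verb applications.
pairs = [
    ("cut", "knife"), ("cut", "scissors"), ("slice", "knife"),
    ("throw", "ball"), ("kick", "ball"), ("throw", "frisbee"),
    ("sit", "chair"), ("sit", "bench"),
]

verbs = sorted({v for v, _ in pairs})
objects = sorted({o for _, o in pairs})
v_index = {v: i for i, v in enumerate(verbs)}
o_index = {o: i for i, o in enumerate(objects)}

# Object-by-verb count matrix: rows are objects, columns are verbs.
counts = np.zeros((len(objects), len(verbs)))
for v, o in pairs:
    counts[o_index[o], v_index[v]] += 1

# Low-rank factorization: each object gets a vector in a space whose
# dimensions group verbs that tend to apply to the same objects.
U, S, Vt = np.linalg.svd(counts, full_matrices=False)
k = 3  # number of embedding dimensions to keep
object_embeddings = U[:, :k] * S[:k]

for obj, vec in zip(objects, object_embeddings):
    print(obj, np.round(vec, 2))
```

In this sketch, objects used with the same verbs (e.g., knife and scissors) end up close together in the embedding space, which is the property the behavioral experiments then probe for meaningfulness to human observers.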