Gaze scan-paths affect recall strategy in context-dependent memory.

Poster Presentation 43.313: Monday, May 20, 2024, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Visual Memory: Encoding, retrieval


Neomi Mizrachi1, Ehud Ahissar1; 1Weizmann Institute of Science

Motor-sensory dynamics are an essential component of visual perception. During an episode, the eyes' scanning patterns shape the information that is acquired. To assess the role of eye movements in context-dependent memory, we designed a recall task in virtual reality (VR) environments. Two groups of participants explored a virtual room containing 15 everyday virtual objects; 20 minutes later, they were asked to recall the objects' names in an environment that was either similar to (SIM group) or different from (DIFF group) the encoding environment. Only in the similar context (SIM group) were recall gaze scan-paths spatially-temporally similar to encoding scan-paths, with the same locations visited at the same relative times. Gaze spatial-temporal similarity also dictated recall strategy: SIM-group participants retrieved object names in the order of their gaze fixations at encoding, rather than in a random order or by semantic association. These results suggest that gaze scan-path dynamics play a significant role in context-dependent memory.
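
The abstract does not specify how spatial-temporal scan-path similarity was quantified. The sketch below is one illustrative (not the authors') way to compute such a measure: resample each scan-path onto a common relative-time axis and compare gaze positions at matching relative times. All function and variable names are assumptions for illustration.

```python
# Hypothetical sketch: spatial-temporal scan-path similarity between an
# encoding scan-path and a recall scan-path, each given as fixation
# coordinates (x, y) with timestamps t.
import numpy as np

def resample_scanpath(xy, t, n_bins=50):
    """Resample a scan-path onto n_bins points of relative time
    (0 = episode start, 1 = episode end)."""
    t = np.asarray(t, dtype=float)
    rel_t = (t - t[0]) / (t[-1] - t[0])          # normalize time to [0, 1]
    grid = np.linspace(0.0, 1.0, n_bins)
    xy = np.asarray(xy, dtype=float)
    x = np.interp(grid, rel_t, xy[:, 0])
    y = np.interp(grid, rel_t, xy[:, 1])
    return np.column_stack([x, y])

def spatiotemporal_distance(xy_enc, t_enc, xy_rec, t_rec, n_bins=50):
    """Mean spatial distance between encoding and recall gaze positions at
    the same relative times; smaller values mean more similar scan-paths."""
    p_enc = resample_scanpath(xy_enc, t_enc, n_bins)
    p_rec = resample_scanpath(xy_rec, t_rec, n_bins)
    return float(np.mean(np.linalg.norm(p_enc - p_rec, axis=1)))

# Toy example with made-up fixation data (coordinates in arbitrary room units):
xy_encoding = [(0.1, 0.2), (0.5, 0.4), (0.9, 0.8)]
t_encoding = [0.0, 5.0, 12.0]
xy_recall = [(0.15, 0.25), (0.55, 0.35), (0.85, 0.75)]
t_recall = [0.0, 3.0, 9.0]
print(spatiotemporal_distance(xy_encoding, t_encoding, xy_recall, t_recall))
```

Under this kind of measure, a lower distance for the SIM group than the DIFF group would correspond to the reported finding that the same locations were revisited at the same relative times only when the recall context matched the encoding context.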