Contextual cueing in change detection tasks

Poster Presentation 53.324: Tuesday, May 21, 2024, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Visual Memory: Working memory and attention

Courtney Turner1, Mikel Jimenez1, Anna Grubert1; 1Durham University

Visual scenes contain detailed features and objects at specific locations that together form a global visuospatial context. In attention research, visual search has been found to be faster and more accurate in constant than in random global contexts, i.e., when a target was repeatedly presented at a fixed location within a spatially constant context of distractors. Such contextual cueing effects demonstrate that global contextual information is learned implicitly to facilitate search. Here we asked whether contextual cueing can be measured directly at the level of working memory, which is used to guide selection in visual search. We measured working memory capacity K and the contralateral delay activity (CDA) of the event-related potential under low- and high-load conditions of a visual change detection task. In different blocks, memory displays contained two or four coloured squares. Participants had to retain these colours at their spatial locations for 1,000 ms and then compare them to a colour set in a test display in which the colours were either identical (50% no-change trials) or one of them had changed (50% change trials). The critical manipulation concerned the global context of the memory displays: in half of the trials, the memory displays showed the same colour squares at fixed locations (constant context); in the other half, the colours and locations of the squares were selected completely at random (random context). K values and CDA amplitudes (measured during the retention period) were substantially larger for constant than for random contexts, in both the low and high memory load conditions. These converging behavioural and electrophysiological findings suggest that implicitly learned global stimulus configurations directly affected visuospatial working memory capacity.
More broadly, this observation suggests that previously reported contextual cueing effects in visual search may depend on learning effects in visuospatial working memory.
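The abstract does not state which estimator was used for the capacity measure K. A minimal sketch, assuming Pashler's whole-display formula K = N × (H − FA) / (1 − FA), which is commonly applied when the full colour set is re-presented in the test display, as it is here:

```python
def pashler_k(set_size, hit_rate, fa_rate):
    """Estimate working memory capacity K from change detection performance.

    Assumes Pashler's (1988) whole-display formula:
        K = N * (H - FA) / (1 - FA)
    where N is the memory set size, H the hit rate (correct 'change'
    responses on change trials) and FA the false alarm rate ('change'
    responses on no-change trials). NOTE: this estimator is an assumption
    for illustration; the poster does not specify the formula used.
    """
    if not 0.0 <= fa_rate < 1.0:
        raise ValueError("false alarm rate must lie in [0, 1)")
    return set_size * (hit_rate - fa_rate) / (1.0 - fa_rate)

# Hypothetical illustration: a load-4 block with 90% hits and 10% false alarms
print(pashler_k(4, 0.90, 0.10))
```

With these hypothetical rates the estimate is about 3.6 items; per the abstract, constant contexts would yield higher K than random contexts at the same load.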

Acknowledgements: This work was funded by a research grant from the Leverhulme Trust (RPG-2020-319).