2016 Young Investigator – Nicholas Turk-Browne

Nicholas Turk-Browne

Associate Professor and Associate Chair, Department of Psychology, Princeton University

Nicholas Turk-Browne is the 2016 winner of the Elsevier/VSS Young Investigator Award. Trained at the University of Toronto and then at Yale University, Nicholas Turk-Browne was awarded a PhD in Cognitive Psychology in 2009 under the supervision of Marvin Chun and Brian Scholl. Following his PhD, Nick took up a position at Princeton University, where he is currently an associate professor.

In the seven years since his PhD, Nick has established an active and dynamic lab that uses multidisciplinary methods to advance our understanding of the neural circuits that mediate visual cognition. Nick combines behavior, brain imaging, and computational modeling to bridge key areas of visual cognition: visual learning, memory, and attention. His pioneering work on visual statistical learning has demonstrated that our ability to extract perceptual regularities relies on interactions between the hippocampus and visual cortex. Nick has shown that this circuit supports predictive representations based on implicitly learned associations. Further, his work shows that statistical learning, although implicit, can be modulated by task demands and that, in turn, learned regularities automatically draw attention. Nick’s contributions extend to groundbreaking methodological developments that combine neuroimaging and machine learning to understand the brain dynamics that support visual cognition. Finally, his recent work using neural fluctuations as feedback during real-time fMRI to train attention has strong potential for translational clinical applications.

Elsevier/Vision Research Article

Attention and perception in memory systems

Monday, May 16, 12:30 pm, Talk Room 2

The labeling of brain structures by function, such as the “visual” system, “attention” networks, and “memory” systems, reinforces an appealing division of cognitive labor across the brain. At the same time, neural representations can be widely distributed, and real-world behaviors require the coordination of much of the brain. An alternative way to think about brain function is in terms of the computations that different brain regions and networks perform, and to try to understand when and how these computations participate in different cognitive processes. In this presentation, I will discuss recent findings from my lab that illustrate this perspective, particularly concerning the involvement of memory systems in attention and perception. First, I will show that goal-directed attention modulates the state of the hippocampus, the canonical memory system in the brain, and thereby determines what aspects of visual experience we remember. Second, I will show that pattern completion, a core computation of the hippocampus, supports predictive coding in early visual cortex. These and other studies highlight the broad reach of vision science in the mind and brain.