Dynamic Processes in Vision

Friday, May 8, 3:30 – 5:30 pm, Royal Ballroom 4-5

Organizer: Jonathan D. Victor (Weill Medical College of Cornell University)

Presenters: Sheila Nirenberg (Dept. of Physiology and Biophysics, Weill Medical College of Cornell University), Diego Contreras (Dept. of Neuroscience, University of Pennsylvania School of Medicine), Charles E. Connor (Dept. of Neuroscience, The Johns Hopkins University School of Medicine), Jeffrey D. Schall (Department of Psychology, Vanderbilt University)

Symposium Description

The theme of the symposium is the importance of analyzing the time course of neural activity for understanding behavior. Given the obviously spatial nature of vision, it is tempting to ignore dynamics and focus on spatial processing and maps. As the speakers in this symposium will show, dynamics are in fact crucial: even for processes that appear to be intrinsically spatial, the underlying mechanism often resides in the time course of neural activity. The symposium brings together prominent scientists who will present recent studies that exemplify this unifying theme. Their topics will cover the spectrum of VSS, both anatomically and functionally (retinal ganglion cell population coding, striate cortical mechanisms of contrast sensitivity regulation, extrastriate cortical analysis of shape, and frontal and collicular gaze control mechanisms). Their work uses sophisticated physiological techniques, ranging from large-scale multineuronal ex vivo recording to intracellular in vivo recording, and employs a breadth of analytical approaches, ranging from information theory to dynamical systems.

Because of the mechanistic importance of dynamics and the broad range of the specific topics and approaches, it is anticipated that the symposium will be of interest to physiologists and non-physiologists alike, and that many VSS members will find specific relevance to their own research.

Abstracts

How neural systems adjust to different environments: an intriguing role for gap junction coupling

Sheila Nirenberg

The nervous system has an impressive ability to self-adjust – that is, as it moves from one environment to another, it can adjust itself to accommodate the new conditions. For example, when it encounters new stimuli, it can shift its attention; if the stimuli are low contrast, it can adjust its contrast sensitivity; if the signal-to-noise ratio is low, it can change its spatial and temporal integration properties. How the nervous system makes these shifts isn't clear. Here we show a case where it was possible to obtain an answer. It's a simple case, but one of the best-known examples of a behavioral shift – the shift in visual integration time that accompanies the switch from day to night vision. Our results show that the shift is produced by a mechanism in the retina – an increase in coupling among horizontal cells. Since coupling produces a shunt, the increase causes a substantial shunting of horizontal cell current, which effectively inactivates the cells. Since these cells play a critical role in shaping integration time (they provide feedback to photoreceptors that keeps integration time short), inactivating them causes integration time to become longer. Thus, a change in the coupling of horizontal cells serves as a mechanism to shift the visual system from short to long integration times. The results raise a new, and possibly generalizable, idea: that a neural system can be shifted from one state to another by changing the coupling of one of its cell classes.
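A back-of-the-envelope feedback model may help fix the logic of this abstract. In the Python sketch below (our illustration; the functional form and all parameter values are assumptions, not the study's model), horizontal-cell feedback shortens the photoreceptor's integration time, and increased gap-junction coupling shunts that feedback away, lengthening it again.

```python
import numpy as np

def integration_time(coupling_conductance, tau_photoreceptor_ms=200.0,
                     feedback_gain=4.0, leak_conductance=1.0):
    """Toy feedback model of the day/night shift in integration time.

    Horizontal-cell feedback with effective gain g shortens the
    photoreceptor's integration time to roughly tau / (1 + g).
    Gap-junction coupling acts as a shunt on the horizontal cells,
    dividing their effective feedback gain; as coupling grows, the
    feedback is effectively silenced and integration time relaxes
    back toward the photoreceptor's intrinsic value. The functional
    form and parameter values are illustrative assumptions only.
    """
    shunt_factor = 1.0 + coupling_conductance / leak_conductance
    g_effective = feedback_gain / shunt_factor
    return tau_photoreceptor_ms / (1.0 + g_effective)

# Integration time lengthens monotonically as coupling increases.
for g_c in (0.0, 1.0, 5.0, 20.0):
    print(f"coupling = {g_c:4.1f}:  integration time ≈ {integration_time(g_c):5.0f} ms")
```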

Cortical network dynamics and response gain

Diego Contreras

The transformation of synaptic input into spike output by single neurons is a key process underlying the representation of information in sensory cortex. The slope, or gain, of this input-output function determines neuronal sensitivity to stimulus parameters and provides a measure of the contribution of single neurons to the local network. Neuronal gain is not constant and may be modulated by changes in multiple stimulus parameters. Gain modulation is a common neuronal phenomenon that modifies response amplitude without changing selectivity.  Computational and in vitro studies have proposed cellular mechanisms of gain modulation based on the postsynaptic effects of background synaptic activation, but these mechanisms have not been studied in vivo.  Here we used intracellular recordings from cat primary visual cortex to measure neuronal gain while changing background synaptic activity with visual stimulation.  We found that increases in the membrane fluctuations associated with increases in synaptic input do not obligatorily result in gain modulation in vivo.  However, visual stimuli that evoked sustained changes in resting membrane potential, input resistance, and membrane fluctuations robustly modulated neuronal gain.  The magnitude of gain modulation depended critically on the spatiotemporal properties of the visual stimulus.  Gain modulation in vivo may thus be determined on a moment-to-moment basis by sensory context and the consequent dynamics of synaptic activation.
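To make the notion of gain concrete, the sketch below (an illustrative toy, not the analysis used on these recordings) treats the input-output relation as a sigmoid whose slope is the neuronal gain; a multiplicative change in that slope rescales response amplitude without moving the curve's midpoint, i.e. without altering selectivity.

```python
import numpy as np

def firing_rate(input_drive, gain=1.0, threshold=0.2, r_max=80.0):
    """Toy sigmoidal input-output function (spikes/s).

    'gain' scales the slope of the curve; all parameter values are
    illustrative, not fits to the recordings described in the abstract.
    """
    return r_max / (1.0 + np.exp(-(input_drive - threshold) / (0.1 / gain)))

drive = np.linspace(0.0, 1.0, 101)

# Gain modulation changes how steeply rate grows with input
# (the slope at the curve's midpoint) without shifting the midpoint itself.
for g in (0.5, 1.0, 2.0):
    rates = firing_rate(drive, gain=g)
    slope_at_mid = np.gradient(rates, drive)[np.argmin(np.abs(drive - 0.2))]
    print(f"gain = {g:.1f}: slope at midpoint ≈ {slope_at_mid:6.1f} spikes/s per unit drive")
```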

Dynamic integration of object structure information in primate visual cortex

Charles E. Connor

Object perception depends on extensive processing of visual information through multiple stages in the ventral pathway of visual cortex.  We use neural recording to study how information about object structure is processed in intermediate and higher-level ventral pathway cortex of macaque monkeys.  We find that neurons in area V4 (an intermediate stage) represent object boundary fragments by means of basis function tuning for position, orientation, and curvature.  At subsequent stages in posterior, central, and anterior inferotemporal cortex (PIT/CIT/AIT), we find that neurons integrate information about multiple object fragments and their relative spatial configurations.  The dynamic nature of this integration process can be observed in the evolution of neural activity patterns across time following stimulus onset.  At early time points, neurons are responsive to individual object fragments, and their responses to combined fragments are linearly additive.  Over the course of approximately 60 ms, responses to individual object fragments decline and responses to specific fragment combinations increase.  This evolution toward nonlinear selectivity for multi-fragment configurations involves both shifts in response properties within neurons and shifts in population activity levels between primarily linear and primarily nonlinear neurons.  This pattern is consistent with a simple network model in which the strength of feedforward and recurrent inputs varies continuously across neurons.
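The linear-to-nonlinear transition can be caricatured with a simple interpolation. In the sketch below (hypothetical numbers, not the recorded data), the response to a two-fragment stimulus starts as the sum of the single-fragment responses and, over a time constant of roughly 60 ms, shifts toward a configuration-specific value.

```python
import numpy as np

def pair_response(t_ms, r_a=12.0, r_b=9.0, r_config=30.0, tau_ms=60.0):
    """Toy model of the linear-to-nonlinear transition described above.

    At stimulus onset the response to a two-fragment stimulus is the
    linear sum of the single-fragment responses (r_a + r_b); over
    roughly 'tau_ms' it shifts toward a configuration-specific,
    nonlinearly selective response (r_config). All values are
    illustrative assumptions, not data from the study.
    """
    w = 1.0 - np.exp(-t_ms / tau_ms)   # weight of the nonlinear component
    return (1.0 - w) * (r_a + r_b) + w * r_config

# Early responses are approximately additive; later responses are
# dominated by selectivity for the specific fragment configuration.
for t in (0, 20, 40, 60, 100, 150):
    print(f"t = {t:3d} ms:  response ≈ {pair_response(t):5.1f} spikes/s")
```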

Timing of selection for the guidance of gaze

Jeffrey D. Schall

Time is of the essence in the execution of visually guided behavior in dynamic environments.  We have been investigating how the visual system responds to unexpected changes of the image when a saccade is being planned.  Performance of stop signal or double-step tasks can be explained as the outcome of a race between a process that produces the saccade and a process that interrupts the preparation.  Neural correlates of dynamic target selection and these race processes have been identified in the frontal eye field and superior colliculus.  The time course of these processes can provide useful leverage for understanding how early visual processing occurs.
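The race model invoked here is straightforward to simulate. The sketch below (illustrative parameters; finishing times are assumed normal and independent, which is our simplification, not a claim about the authors' model) estimates the probability that a planned saccade is cancelled as a function of stop-signal delay, tracing out the standard inhibition function.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_cancel(stop_signal_delay_ms, n_trials=10_000,
             go_mean=250, go_sd=50, stop_mean=100, stop_sd=20):
    """Independent race model: a GO process and a STOP process race to finish.

    The saccade is cancelled when the STOP process (launched at the
    stop-signal delay) finishes before the GO process. Finishing times
    are drawn from normal distributions with arbitrary illustrative values.
    """
    go_finish = rng.normal(go_mean, go_sd, n_trials)
    stop_finish = stop_signal_delay_ms + rng.normal(stop_mean, stop_sd, n_trials)
    return np.mean(stop_finish < go_finish)

# The probability of cancelling falls as the stop signal arrives later.
for ssd in (0, 50, 100, 150, 200):
    print(f"SSD = {ssd:3d} ms  ->  P(cancel) = {p_cancel(ssd):.2f}")
```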