Implications of Microscopic Eye Movements for Retinal Encoding
63.435, Wednesday, May 15, 8:30 am - 12:30 pm, Orchid Ballroom
John George (1), Jennifer Schei (1), Peter Schultz (2), Garrett Kenyon (1,2); (1) Los Alamos National Laboratory, (2) New Mexico Consortium
The eye is constantly moving. In addition to voluntary saccades, "fixational" eye movements, drift and tremor on the scale of individual photoreceptors, overlap the frequencies of oscillations observed in the LGN and cortex. Such eye movements are implicated in the perception of fine spatial detail and other perceptual tasks. We set out to explore the mechanisms underlying the perceptual consequences of microscopic eye movements using computational models and retinal electrophysiology. We postulate that eye movements temporally modulate the visual response, and that precisely timed, spatially coherent neural population activation enhances the detection and learning of visual features and may encode relationships between features. We employed a model of the outer retina developed by van Hateren (expanded to a 32x32 array of photoreceptors, with electrical coupling between horizontal cells), coupled to spiking models of the inner retina and primary visual cortex and implemented in our package Petavision. We employed a range of stimuli: illuminated points, noisy Gabor gratings, and still images. Stimulus patterns were randomly displaced. Movement orthogonal to the orientation of a grating blurred structure and reduced the contrast encoded by spike rate, ultimately obliterating spatial detail. In contrast, orthogonal movements enhanced reconstructions based on temporal covariance. This effect might enhance the detection of extended spatial features encoded by synchronized firing. In electrophysiological studies of isolated retina (tiger salamander), we simulated eye movements by jittering the visual stimulus; responses were recorded with multi-electrode arrays. As predicted, synthetic eye movements elicited a strong periodic response at the jitter frequency from individual cells in the salamander retina. Our models suggest that microscopic eye movements might enhance the representation of features in visual imagery encoded by correlation within a population of neurons.
These predictions are testable by functional optical imaging and electrode-array measurements in isolated retina, or by psychophysical investigation of the detection of perceptual targets perturbed by microscopic displacements.
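The core predictions, that jitter orthogonal to a grating blurs the rate-coded contrast while producing a strong temporal response at the jitter frequency and preserving correlations across the population, can be sketched numerically. The following is a minimal toy model, not the PetaVision implementation: a row of linear "photoreceptors" (the grating period, jitter amplitude, and jitter frequency are illustrative values, not the study's parameters) views a sinusoidal grating displaced sinusoidally along the axis orthogonal to its orientation.

```python
import numpy as np

# Toy linear-receptor sketch (illustrative parameters, not the authors' model).
n = 32                       # receptors per side, as in the 32x32 model array
k = 2 * np.pi / 8            # grating spatial frequency (period: 8 receptors)
x = np.arange(n)

t = np.arange(200)           # time steps
f_jitter = 0.05              # jitter frequency, cycles per time step
jitter = 1.5 * np.sin(2 * np.pi * f_jitter * t)   # orthogonal displacement

# Linear response of each receptor to the jittered grating: shape (time, space).
resp = np.cos(k * (x[None, :] + jitter[:, None]))

# 1) Time-averaged response ("spike rate") loses contrast relative to the
#    static grating (static peak contrast would be 1.0).
rate = resp.mean(axis=0)
print(f"time-averaged contrast: {rate.max():.2f}")

# 2) A single receptor's temporal spectrum peaks at the jitter frequency.
sig = resp[:, 2] - resp[:, 2].mean()
spectrum = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(len(t))
peak = freqs[np.argmax(spectrum)]
print(f"peak temporal frequency: {peak:.3f} (jitter frequency: {f_jitter})")

# 3) Receptors one full spatial period apart are driven in phase, so temporal
#    correlation preserves the grating's structure that the rate code blurs.
c_inphase = np.corrcoef(resp[:, 2], resp[:, 10])[0, 1]
c_antiphase = np.corrcoef(resp[:, 2], resp[:, 6])[0, 1]
print(f"in-phase correlation: {c_inphase:+.2f}, anti-phase: {c_antiphase:+.2f}")
```

In this sketch the time average washes out contrast, while the fundamental of each receptor's frequency-modulated response sits exactly at the jitter frequency, mirroring the periodic response recorded from salamander retinal cells, and pairwise temporal correlations remain at full strength.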