Dynamical neural model of lightness computation and perceptual fading of retinally stabilized images

Poster Presentation: Tuesday, May 21, 2024, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Color, Light and Materials: Lightness, brightness

Michael Rudd1, Idris Shareef1; 1University of Nevada, Reno

We recently proposed a neural model that accounts to within <6% error for lightness matches made to Staircase Gelb and simultaneous contrast displays comprising real illuminated surfaces. Here, we demonstrate how the model accounts for the perceptual fading that occurs when images are stabilized on the retina (Troxler, 1804; Riggs et al., 1953). In the model, cortical lightness computations are derived from transient ON and OFF cell responses in the early visual system that are generated in the course of fixational eye movements. The ON and OFF responses are sorted by eye movement direction in visual cortex to produce a set of spatiotopic maps of ON and OFF activations. Activations within these maps trigger spatial filling-in of lightness and darkness within independent ON and OFF networks, which are combined at the final modeling stage to compute perceived reflectance (lightness). We elaborate these mechanisms to produce a more detailed neurophysiological theory. We propose how early temporal responses of ON and OFF cells are read out (decoded) in visual cortex to trigger lightness and darkness induction signals, and we explicitly model cortical magnification, which further improves the fit to psychophysical data. Two key takeaways are: 1) the model accounts for multiple lightness phenomena, including fading of stabilized images, with high quantitative precision and in a biologically plausible way; 2) estimated rates of the fixational eye movements known as microsaccades (Martinez-Conde et al., 2004) are too low to explain the dynamics of lightness phenomenology. We suggest that the higher-rate eye movements known as tremor can better account for the perceptual data within the context of an otherwise identical neural framework. Correspondences between the model's processing stages and cortical neurophysiology will be discussed, and the computations performed at different model stages will be illustrated through a combination of still images and movies.
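The core dynamics described above, in which edge-locked ON and OFF transients trigger filling-in and vanish when the image is stabilized, can be illustrated with a minimal one-dimensional sketch. This is hypothetical illustrative code, not the authors' implementation: the function name and the simple cumulative filling-in rule are our assumptions.

```python
import numpy as np

def lightness_1d(luminance, moving=True):
    """Sketch of lightness recovery from edge-locked ON/OFF transients.

    `moving=True` stands in for fixational eye movements that sweep edges
    across receptive fields, generating transient responses; `moving=False`
    models a retinally stabilized image, in which the transients (and hence
    the induction signals that drive filling-in) are absent and the percept
    fades to a featureless field.
    """
    lum = np.asarray(luminance, dtype=float)
    if not moving:
        # Stabilized image: no ON/OFF transients, nothing to fill in.
        return np.zeros_like(lum)
    # Transients arise at log-luminance steps (edges).
    edges = np.diff(np.log(lum), prepend=np.log(lum[0]))
    on = np.maximum(edges, 0.0)    # ON map: local luminance increments
    off = np.maximum(-edges, 0.0)  # OFF map: local luminance decrements
    # Filling-in: integrate lightness/darkness induction inward from edges
    # in the separate ON and OFF networks, then combine the two networks
    # into a single lightness (relative log reflectance) signal.
    return np.cumsum(on) - np.cumsum(off)
```

In this toy version, combining the filled-in ON and OFF signals recovers log luminance relative to the leftmost point when the eyes move, while the stabilized case returns a contrast-free (faded) output.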
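The cortical magnification stage can likewise be sketched. Below is one standard parameterization of human V1 linear magnification, M(E) = M0 / (E + E2); the constants are the widely cited Horton and Hoyt (1991) estimates, used here only as placeholders, not the values fitted in the poster's model.

```python
def cortical_magnification(ecc_deg, m0=17.3, e2=0.75):
    """Linear cortical magnification M(E), in mm of V1 per degree of
    visual angle, following M(E) = m0 / (E + e2).

    m0 = 17.3 mm and e2 = 0.75 deg are the standard Horton & Hoyt (1991)
    estimates, assumed here for illustration only.
    """
    return m0 / (ecc_deg + e2)
```

Because M(E) falls steeply with eccentricity, a fixed retinal jitter produces much larger cortical displacements near fixation, which is one way a magnification stage can reshape the spatial profile of eye-movement-driven transients.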