Seeing through another’s eyes: Modeling and correcting for individual differences in color appearance

Poster Presentation 43.351: Monday, May 20, 2024, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Color, Light and Materials: Appearance, categories


Camilla Simoncelli1, Michael Webster1; 1University of Nevada

Individual differences in color vision arise at many levels, from the spectral sensitivities of the cones to how individuals judge or label color appearance. Peripheral sensitivity differences are routinely corrected (e.g. to control for equiluminance), and there are growing efforts to calibrate displays and create standards that account for the spectral sensitivities of individual users. However, these sensitivity differences do not predict, and therefore cannot correct for, the substantial differences that also occur in color appearance. For example, it is well established that differences in color appearance do not depend on factors such as the density of preretinal screening pigments or the cone ratios, which strongly impact sensitivity. We developed a procedure that directly adjusts images for the varied color percepts of different observers, based on previous measurements of variations in hue scaling (Emery et al. PNAS 2023) and on a new task in which we measured unique and binary hues as well as the achromatic point. Chromaticities in the image are first mapped onto the average scaling function. The corresponding hue percepts are then used to estimate the chromatic axis that would produce the same hue percept in an individual, based on that individual's scaling function. Such images should have the property that two observers – each looking at a different image tailored to their specific hue percepts – should describe the colors in the images in more similar ways. We use this procedure to visualize the range of phenomenal color experience when different observers are looking at the same physical stimulus. The correction we developed is similar in principle to correcting for low-level visual differences in sensitivity (e.g. for observer metamerism) but instead compensates for high-level differences in color perception, and could be used to factor out potential perceptual differences in applications and analyses of tasks like color communication or data visualization.
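The two-step remapping described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: it assumes each observer's hue scaling is summarized as a monotonic mapping from stimulus hue angle (degrees in a chromatic plane) to perceived hue angle, and all function names and example data are hypothetical.

```python
import numpy as np

def correct_hue_angles(stim_angles, avg_angles, avg_percepts,
                       obs_angles, obs_percepts):
    """Remap stimulus hue angles so an individual observer's percepts
    match those of the average observer.

    Step 1: map each stimulus hue angle through the average scaling
            function to get the average observer's hue percept.
    Step 2: invert the individual's scaling function to find the
            chromatic axis that evokes that same percept for them.
    Inversion by swapping interpolation axes is valid only because the
    scaling functions are assumed monotonic.
    """
    percepts = np.interp(stim_angles, avg_angles, avg_percepts)   # step 1
    corrected = np.interp(percepts, obs_percepts, obs_angles)     # step 2
    return corrected

# Toy example: an observer whose perceived hues are rotated +10 degrees
# relative to the average observer (average percept == stimulus angle).
angles = np.arange(0.0, 361.0, 45.0)
avg_percepts = angles.copy()
obs_percepts = np.clip(angles + 10.0, 0.0, 360.0)

out = correct_hue_angles(np.array([90.0]), angles, avg_percepts,
                         angles, obs_percepts)
# Presenting ~80 deg to this observer yields the percept an average
# observer has at 90 deg (80 + 10 = 90).
```

In this toy case the correction simply counter-rotates the stimulus; with measured scaling functions the remapping would differ across the hue circle, which is what tailors each image to a specific observer.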

Acknowledgements: Supported by EY-010834