Predicting the affective connotation of colormap data visualizations from estimated emotional associations of constituent colors
Poster Presentation 26.457: Saturday, May 16, 2026, 2:45 – 6:45 pm, Pavilion
Session: Color, Light and Materials: Affect, cognition
Halle C. Braun1, Kushin Mukherjee2, Seth R. Gorelik3, Karen B. Schloss1; 1University of Wisconsin-Madison, 2Stanford University, 3Woodwell Climate Research Center
Research in affective visualization design aims to predict the affective connotation of information visualizations. Previous work quantified the effects of hue, saturation, and lightness on the affective connotation of monochromatic colormaps (Braun et al., 2026), but that simple regression approach could not easily extend to complex, polychromatic colormaps. We propose a new approach that predicts the affective connotation of whole colormaps from the affective connotations of their constituent colors (e.g., the 256 colors in standard color scales, like Viridis). Although it may seem unlikely that the “whole” could be well predicted by the sum of its “parts,” prior evidence supported the feasibility of our approach on a smaller scale: color-emotion associations for color pairs were well predicted by their two constituent colors (Ou et al., 2024). In this study, we first generated colormap data visualizations from 40 color scales (256 constituent colors each) and had participants (n=80) rate associations between each colormap and five emotions (angry, sad, happy, disgusted, and fearful). Collecting human association ratings for each constituent color (c=256) within each color scale (s=40) for each emotion (e=5) was infeasible (51,200 color-emotion pairs). Instead, we estimated the associations computationally using color space regression models (Schloss et al., 2018) trained on existing color-emotion association data for 71 colors systematically sampled across CIELAB color space (Mukherjee et al., VSS-2022). For all five emotions, the aggregated association estimates across all colors within each color scale were strongly correlated with participant color-emotion association ratings for whole colormap data visualizations (angry r=0.88, sad r=0.84, happy r=0.61, fearful r=0.68, disgusted r=0.73).
Thus, affective connotation of whole colormaps was well predicted by the sum of their parts, despite having estimated the parts computationally rather than through direct judgments. Our approach has the potential to scale to predict affective connotation of colors in virtually any kind of visualization.
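The aggregation-and-correlation pipeline described above can be sketched in a few lines. This is a minimal illustration with synthetic stand-in numbers, not the authors' code: it assumes the colormap-level prediction is a simple mean of per-color estimates, and the random "estimates" and "ratings" below are hypothetical placeholders for the model outputs and participant data.

```python
import math
import random

def predict_colormap_association(color_estimates):
    """Aggregate per-color emotion-association estimates (e.g., 256 values
    for a standard color scale) into one colormap-level prediction.
    A simple mean is one plausible reading of "sum of its parts"."""
    return sum(color_estimates) / len(color_estimates)

def pearson_r(xs, ys):
    """Pearson correlation between predicted and human-rated associations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic stand-in data: 40 color scales x 256 per-color association
# estimates for one emotion (real estimates would come from color-space
# regression models fit to human color-emotion ratings).
random.seed(0)
scales = [[random.random() for _ in range(256)] for _ in range(40)]
predictions = [predict_colormap_association(s) for s in scales]

# Hypothetical whole-colormap ratings, loosely tied to the predictions.
ratings = [p + random.gauss(0, 0.005) for p in predictions]
print(round(pearson_r(predictions, ratings), 2))
```

In the actual study this correlation would be computed once per emotion, across the 40 colormaps, yielding the five r values reported above.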
Acknowledgements: NSF grant BCS-2419493 to K.B.S.