The psychophysics of style

Poster Presentation 26.430: Saturday, May 18, 2024, 2:45 – 6:45 pm, Pavilion
Session: Object Recognition: Visual preference

Tal Boger1, Chaz Firestone1; 1Johns Hopkins University

Images vary not only in content, but also in style. When viewing a Monet painting, for example, we see both the scenery it depicts (lilies dotting a pond) and the manner in which it does so (broken brushstrokes, blended colors, etc.). Parsing images in this way is a remarkable perceptual achievement, akin to separating illumination and reflectance to achieve color constancy, or disentangling letter-identities from typefaces when reading. What is the nature of this process, and what are its psychophysical signatures? Here, 9 experiments reveal 3 new phenomena of style perception. (1) Style tuning. Using neural style-transfer models, we rendered natural scenes in the styles of famous artists. Then, inspired by ‘font tuning’ (wherein text is easier to read in a single typeface than multiple typefaces), we asked observers to scan arrays of images and enumerate all scenes of one type (e.g., mountains). Observers were faster and more accurate in same-style arrays than mixed-style arrays [E1–E2]. Such tuning accumulated over time [E3] and survived controls for color and luminance [E4]. (2) Style discounting. Analogous to ‘discounting the illuminant’ in color constancy, we find that vision ‘discounts’ style. Changes to a scene’s content (e.g., Monet-pond → Monet-building) were more easily detected than changes to its style (Monet-pond → Klimt-pond; E5), even when low-level image statistics predicted the opposite [E6]. (3) Style extrapolation. After viewing items in a given style (e.g., a fork and knife from one cutlery set), observers misremembered seeing additional items from that style (the spoon from that set; E7), even with low-level similarity equated across lures [E8–E9]. Such errors suggest spontaneous representation of the unseen items, as if mentally ‘rendering’ objects in newly learned styles. While we typically associate style with more qualitative approaches, our work explores how tools from vision research can illuminate its psychological basis.

Acknowledgements: NSF BCS 2021053, NSF GRFP 2023351964