Ensemble Size Perception Across Depth in Real vs. Virtual Environments

Poster Presentation 33.318: Sunday, May 17, 2026, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Perceptual Organization: Ensembles

Katrín Fjóla Aspelund1, William Gaudreau1, Burke Zimmer1, Ömer Dağlar Tanrikulu1; 1University of New Hampshire

The present study examined how viewing distance influences single and ensemble size perception in real-world (RW) and virtual reality (VR) environments. Previous work using 2D displays or stereoscopes has shown that ensemble judgments follow perceived size after size constancy, rather than retinal size. Here, we asked how this process operates with more natural stimuli and richer depth cues. In the RW experiment, participants viewed handmade paper cubes and completed two size judgment tasks. In the single condition, one cube of variable size appeared at one of three depth planes (0.5, 1, or 2 m), and participants matched its physical size by adjusting a probe cube at 1 m. In the ensemble condition, eight cubes of varying sizes appeared at a single depth plane, and participants matched their perceived average size. In the VR experiment, different groups performed single and ensemble 2AFC tasks in a virtual supermarket, judging which of two flour packs (or two groups of eight packs) at 2.5 and 5 m contained more flour. Points of subjective equality (PSEs) were estimated with a staircase procedure. In the RW experiment, observers underestimated the size of near cubes and overestimated that of far cubes, in both single and ensemble displays, even though far cubes projected smaller retinal images. This pattern replicates earlier screen-based findings and suggests that a biased internal depth map feeds both individual and ensemble size constancy. Ensemble judgments tended to be more accurate than single-object judgments, consistent with improved depth estimates when multiple items are present. In VR, PSEs were again larger for the far plane in both tasks, with slightly larger overestimation for single than for ensemble displays. Together, the findings indicate that ensemble size perception relies on the same 3D size–distance machinery as single-object perception and that pooling across items can partially mitigate, but does not eliminate, over-constancy.
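The staircase-based PSE estimation mentioned above can be illustrated with a minimal sketch. This is not the authors' code: the simulated observer, the 1-up/1-down rule, the step size, and the reversal-averaging criterion are all illustrative assumptions, chosen only to show how a staircase converges on the point of subjective equality in a 2AFC size-comparison task.

```python
# Illustrative sketch (not the authors' implementation): a 1-up/1-down
# staircase estimating a PSE in a 2AFC size comparison. All parameter
# values and the simulated observer are assumptions for demonstration.
import random

def simulated_observer(probe_size, pse=1.2, noise=0.1):
    """Return True if the observer judges the probe larger than the reference."""
    return probe_size + random.gauss(0, noise) > pse

def run_staircase(start=2.0, step=0.1, n_reversals=12, seed=0):
    random.seed(seed)
    probe = start
    last_response = None
    reversals = []
    while len(reversals) < n_reversals:
        response = simulated_observer(probe)
        # A reversal occurs whenever the response direction flips.
        if last_response is not None and response != last_response:
            reversals.append(probe)
        # 1-up/1-down: shrink the probe after "larger", grow it after "smaller".
        probe += -step if response else step
        last_response = response
    # Estimate the PSE as the mean probe size over the last 8 reversals.
    return sum(reversals[-8:]) / len(reversals[-8:])

print(round(run_staircase(), 2))
```

A 1-up/1-down rule converges on the 50% point of the psychometric function, which for a 2AFC size comparison is the PSE; averaging over late reversals discards the initial descent from the starting value.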