Automatic biases in visual and auditory perception

Poster Presentation 16.340: Friday, May 15, 2026, 3:45 – 6:00 pm, Banyan Breezeway
Session: Temporal Processing: Duration and timing perception

Linda Garami¹, József Fiser¹·²; ¹Central European University, ²Center for Cognitive Computation

The brain relies on various internally generated categories and stereotypes, formed through biased processing of inputs, to make rapid predictions. We investigated the hypothesis that applying such biases (identifying and focusing on particular groups and patterns that prioritize fast, goal-relevant efficiency over veridical accuracy) is not limited to higher-order cognitive processes. Instead, this strategy is applied automatically and recursively from the earliest levels of sensory processing. We focused on identifying basic chunking principles in the visual and auditory modalities. Behaviorally, chunking consistently leads to decreased accuracy in detecting perceptual changes at perceived chunk boundaries compared to changes within a segment. Using this measure in stimulus sequences with increasing internal structure across different modalities, we tested whether automatically applied temporal grouping principles are shared between vision and audition. We implemented a stream-segregation paradigm in which we manipulated the duration structure of the elements: 200- and 600-ms simple tones or basic visual objects were presented in a continuously repeating three-element pattern, such as Short–Short–Long or Short–Long–Long. Our results (N = 35) show that (1) participants exhibit similar sensitivity patterns across modalities: as soon as a repeating pattern emerges, sensitivity to environmental changes becomes biased by the same chunking principles, and boundaries in both modalities are marked by longer-duration sensory objects that function as closing elements; (2) this bias is automatic and often consciously unavailable to participants, even when attention is directed toward the stream; and (3) this pattern representation is highly resilient to absolute changes in element duration, as long as the relative durations support a consistent repeating pattern.
These results suggest that the perceptual system organizes sequences of elements with varying absolute durations into chains of tokens based on rapidly formed categories that remain remarkably stable even in noisy environments.

Acknowledgements: HORIZON-MSCA-2023-PF-01 No.101155302