Temporal Processing

Talk Session: Saturday, May 20, 2023, 5:15 – 6:45 pm, Talk Room 2
Moderator: Pascal Mamassian, Ecole Normale Supérieure and CNRS

Talk 1, 5:15 pm, 25.21

Event-based warping: An illusory distortion of time within events

Rui Zhe Goh1, Hanbei Zhou1, Chaz Firestone1, Ian Phillips1; 1Johns Hopkins University

Objects are fundamental units of perception that structure our experience of space. A striking finding about object representation is that objecthood warps our perception of spatial distance, such that two dots perceived within an object appear farther apart than two dots perceived in empty space, an illusion known as “Object-based Warping” (Vickery & Chun, 2010). However, just as objects are fundamental to our experience of space, events are fundamental to our experience of time. Does spatial object-based warping have a temporal, event-based counterpart? Here, we show that it does: just as two dots within an *object* are perceived as farther apart in *space*, two probes within an *event* are perceived as farther apart in *time*, a phenomenon we term “Event-based Warping”. In our first experiment, subjects judged the duration between two auditory probes (i.e., two tones) against a standard reference duration. There were two types of trials: in event trials, the probes were presented during an auditory event (a brief period of noise in an otherwise silent soundtrack); in non-event trials, the probes were not presented during any auditory event but simply played in silence. Subjects judged probes within an event to be farther apart in time than probes not within an event, showing that event representations warp perceived duration. Crucially, we further demonstrate that this temporal warping also arises in vision. In a second, cross-modal experiment, observers judged the duration between two visual probes (i.e., two flashes) presented either within or outside an auditory event. Again, subjects judged probes on event trials as farther apart in time than probes on non-event trials, showing that event-based warping occurs cross-modally. We suggest that object-based warping and event-based warping are instances of a more general phenomenon in which structural representations give rise to perceptual distortions.
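
For readers who want to see the logic of such a duration-comparison analysis, the sketch below (not the authors' code) fits a cumulative-Gaussian psychometric function to simulated "longer than the standard" responses and compares the points of subjective equality (PSEs) for event and non-event trials; the gap levels, parameter values, and generative assumptions are all illustrative.

```python
# Minimal sketch, not the authors' code: compare PSEs for duration judgments
# on event vs. non-event trials. All data here are simulated for illustration.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(x, mu, sigma):
    """Cumulative-Gaussian psychometric function."""
    return norm.cdf(x, loc=mu, scale=sigma)

def fit_pse(gaps, responses):
    """Fit p('longer than standard') vs. probe gap; mu is the PSE."""
    levels = np.unique(gaps)
    props = np.array([responses[gaps == g].mean() for g in levels])
    (mu, sigma), _ = curve_fit(cum_gauss, levels, props,
                               p0=[levels.mean(), 100.0])
    return mu, sigma

rng = np.random.default_rng(0)
gaps = np.repeat(np.linspace(200, 600, 9), 40)   # probe gaps in ms (assumed)
# Hypothetical generative model: event trials feel longer, so the gap that
# matches the standard (the PSE) is physically shorter than on non-event trials.
resp_event = rng.binomial(1, cum_gauss(gaps, 360, 80))
resp_nonevent = rng.binomial(1, cum_gauss(gaps, 400, 80))

pse_event, _ = fit_pse(gaps, resp_event)
pse_nonevent, _ = fit_pse(gaps, resp_nonevent)
print(f"PSE event: {pse_event:.0f} ms, non-event: {pse_nonevent:.0f} ms")
```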

Acknowledgements: National Science Foundation BCS #2021053, Hopkins Office for Undergraduate Research

Talk 2, 5:30 pm, 25.22

Interocular binding of chromatic signals across time

Benjamin M Chin1, Johannes Burge1; 1University of Pennsylvania

The temporal dynamics of visual processing depend on the properties of the stimulus being processed. Luminance, contrast, spatial frequency, and color all influence the speed of processing and the duration of temporal integration. But it is not well understood how the visual system combines complementary signals that, when isolated, are processed with different temporal dynamics. Are the processing dynamics of the combination determined by the more rapidly processed component, by the more sluggishly processed component, or by some compromise between them? Psychophysics and electrophysiology have established that S-cone signals are processed with longer delays and longer temporal integration periods than L-cone signals. Here, we use a classic visual illusion, the Pulfrich effect, to examine how L- and S-cone-isolating signals are combined across time. Six observers viewed a dichoptically presented oscillating bar. One eye's stimulus caused pure L-cone modulation. The other eye's stimulus caused both L- and S-cone modulations, with S-cone contrast titrated across conditions. In two experiments, the tasks were to report the direction of motion and the orientation of the stimulus' perceived trajectory in depth. Seven-level psychometric functions were measured. The anomalous Pulfrich effect, a consequence of different temporal integration periods in the two eyes, is characterized by a motion trajectory that appears oriented left- or right-side back relative to the true direction of motion (Chin & Burge, 2022). Observers perceived anomalous Pulfrich effects consistent with the L+S stimulus being processed with a longer delay and a longer temporal integration period. Processing delay and integration duration increased with the contrast of the S-cone component of the combined stimulus. The present results demonstrate both that the combined signal is processed more sluggishly than its fastest constituent, and that interocular color differences spanning only a fraction of the full color gamut cause processing differences that can lead to substantial misperceptions of motion in depth.
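
As a rough illustration of the underlying geometry (not the authors' model), the sketch below computes the effective disparity that a pure interocular processing delay imparts to a laterally moving target, which is the classic account of why Pulfrich stimuli appear displaced in depth; the speed and delay values are arbitrary assumptions.

```python
# Sketch of the classic Pulfrich geometry, not the authors' model: a pure
# interocular processing delay turns lateral motion into an effective
# binocular disparity, displacing the apparent trajectory in depth.
lateral_speed = 4.0   # deg/s, hypothetical bar speed
extra_delay = 0.010   # s, hypothetical extra delay for the L+S eye

effective_disparity = lateral_speed * extra_delay   # deg, at each instant
print(f"Effective disparity: {effective_disparity * 60:.1f} arcmin")

# A constant delay difference tilts the perceived trajectory in depth; a
# difference in temporal *integration* between the eyes instead produces the
# "anomalous" trajectories, oriented left- or right-side back, described in
# the abstract (Chin & Burge, 2022).
```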

Acknowledgements: This work was supported by NIH grant R01-EY028571 from the National Eye Institute and the Office of Social and Behavioral Science

Talk 3, 5:45 pm, 25.23

The temporal sensitivity of visual cortex reflects an eccentricity-dependent variation in surround inhibition

Carlyn Patterson Gentile1,2, Manuel Spitschan3,4, Huseyin Taskin1, Andrew Bock1, Geoffrey Aguirre1; 1University of Pennsylvania Perelman School of Medicine, Philadelphia, PA, 2Children's Hospital of Philadelphia, Philadelphia, PA, 3Translational Sensory & Circadian Neuroscience, Max Planck Institute for Biological Cybernetics, Tübingen, Germany, 4Chronobiology & Health, TUM Department of Sport and Health Sciences (TUM SG), Technical University of Munich, Munich, Germany

We characterized eccentricity variation in the temporal sensitivity of human primary visual cortex (V1) to flicker that targeted the post-receptoral channels (luminance, red-green, blue-yellow). These responses were modeled as a transformation of the signals that arise in the different classes of retinal ganglion cells (RGCs). We measured 7T BOLD fMRI from two participants viewing a high-contrast, flickering, wide (~150°) uniform field. Stimulus frequency varied logarithmically between 2 and 64 Hz and targeted L+M+S, L-M, and S-[L+M] cone combinations. We obtained the average LGN response and V1 responses across eccentricity bands. These data were fit with a temporal sensitivity model based on electrophysiologic responses of macaque midget, parasol, and bistratified RGCs to chromatic and achromatic flicker at multiple eccentricities (Yeh et al. 1995; Solomon et al. 2002, 2005). Low-pass filtering, surround inhibition, and multiplicative gain were applied to the fixed RGC responses to fit the LGN and V1 data. Model comparison and inference were accomplished by fitting across bootstrap resamplings of the imaging acquisitions. fMRI responses reflected known properties of the visual system, including higher peak temporal sensitivity to achromatic vs. chromatic stimuli and low-pass filtering between LGN and V1. Peak temporal sensitivity decreased at greater V1 eccentricities, a finding not predicted by retinal responses. A model that accounted for these data in terms of RGC signals had the following elements: 1) low-pass filtering between the retina and LGN, and between the LGN and cortex, 2) delayed surround suppression applied to LGN and V1 responses pooled by post-receptoral channel, strongest at the fovea and weaker towards the periphery, and 3) multiplicative gain that amplified retinal signals at low eccentricities, roughly 10x greater for blue-yellow than for luminance, and roughly 10x greater for luminance than for red-green. Delayed surround suppression helps explain differences in flicker temporal sensitivity between RGCs and V1.
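
The sketch below is a minimal frequency-domain illustration of the three model stages named in the abstract (low-pass filtering, delayed surround suppression, multiplicative gain); it is not the authors' implementation, and the filter time constants, surround weights, delay, gain, and eccentricity pairings are illustrative assumptions. It shows qualitatively how weakening the surround toward the periphery lowers the peak of the temporal sensitivity curve.

```python
# Minimal frequency-domain sketch of the three model stages; not the authors'
# implementation. All parameter values are illustrative assumptions.
import numpy as np

def low_pass(f_hz, tau_s, order=1):
    """Exponential low-pass filter stage(s)."""
    return 1.0 / (1.0 + 1j * 2 * np.pi * f_hz * tau_s) ** order

def delayed_surround(f_hz, weight, delay_s):
    """Delayed subtractive surround; stronger weight gives more band-pass behavior."""
    return 1.0 - weight * np.exp(-1j * 2 * np.pi * f_hz * delay_s)

freqs = np.logspace(np.log10(2), np.log10(64), 61)   # 2-64 Hz, as in the study
rgc = np.ones_like(freqs)      # placeholder flat RGC temporal response
gain = 1.0                     # per-channel multiplicative gain (trivially 1 here)

# Hypothetical eccentricities paired with weaker surround weights toward the
# periphery, in the direction the abstract's model proposes.
for ecc_deg, w in [(5, 0.9), (25, 0.6), (60, 0.3)]:
    v1 = np.abs(gain * rgc
                * low_pass(freqs, tau_s=0.015)              # retina -> LGN
                * low_pass(freqs, tau_s=0.015)              # LGN -> V1
                * delayed_surround(freqs, weight=w, delay_s=0.03))
    print(f"~{ecc_deg:>2} deg eccentricity: peak sensitivity near "
          f"{freqs[np.argmax(v1)]:.1f} Hz")
```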

Acknowledgements: Supported by the Research to Prevent Blindness, Lions Club International Foundation Low Vision Award to GKA, and P30 EY001583: Core Grant for Vision Research.

Talk 4, 6:00 pm, 25.24

Alpha power modulates long-lasting feature integration

Maëlan Q. Menétrey1, Michael H. Herzog1, David Pascucci1; 1Laboratory of Psychophysics, École Polytechnique Fédérale de Lausanne (EPFL)

How alpha rhythms influence visual perception has been an open question for more than a century. For example, it has been proposed that two stimuli falling within the same alpha cycle are integrated into a single percept. However, temporal integration can last much longer than a single alpha cycle. Here, we analyzed electroencephalography (EEG) recordings during the sequential metacontrast paradigm (SQM), in which information is integrated across space and time for almost half a second. In the SQM, participants discriminate the offset of a central line followed by a stream of flanking lines. Surprisingly, the offset is visible at the flanking lines even though they themselves are not offset. When a second line in the stream has an offset in the opposite direction, the two offsets integrate: only a small offset is perceived, in accordance with either the first or the second offset. Integration of the offsets is mandatory and unconscious. Using linear discriminant analysis, we first isolated electrodes carrying information about the reported offset in the stream. We then showed that pre-stimulus alpha activity at these electrodes predicts the reported percept: increases in alpha power led to more frequent reports of the first offset, and decreases to more frequent reports of the second offset. In contrast, we found no effect of alpha phase. Since the two offsets integrate before conscious perception is reached, these results suggest that alpha power affects the relative weighting of individual features during unconscious processing. We argue that paradigms with long-lasting integration help to disentangle the effects of power and phase and to better understand how alpha rhythms modulate perception.
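
A simplified sketch of the building blocks of such an analysis is shown below; it is not the authors' pipeline (which isolates informative electrodes from offset decoding before testing pre-stimulus alpha power), and the array shapes, filter settings, alpha band limits, and simulated data are assumptions for illustration only.

```python
# Simplified sketch, not the authors' pipeline: pre-stimulus alpha power per
# channel, plus an LDA-based ranking of channels carrying report information.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def alpha_power(eeg, fs, band=(8.0, 12.0)):
    """Mean alpha-band power per trial and channel (Hilbert envelope)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg, axis=-1)
    return (np.abs(hilbert(filtered, axis=-1)) ** 2).mean(axis=-1)

def rank_channels(features, labels, n_keep=8):
    """Rank channels by absolute LDA weight for decoding the reported offset."""
    lda = LinearDiscriminantAnalysis().fit(features, labels)
    return np.argsort(np.abs(lda.coef_[0]))[::-1][:n_keep]

# Hypothetical data: 200 trials x 64 channels x 1 s of pre-stimulus EEG at 256 Hz.
rng = np.random.default_rng(1)
fs = 256
eeg = rng.standard_normal((200, 64, fs))
reported = rng.integers(0, 2, 200)     # 0 = first offset reported, 1 = second

power = alpha_power(eeg, fs)
# In the actual study, informative electrodes were isolated by decoding the
# reported offset before relating their pre-stimulus alpha power to the
# percept; here both steps use the same toy data purely for illustration.
channels = rank_channels(power, reported)
diff = (power[reported == 0][:, channels].mean()
        - power[reported == 1][:, channels].mean())
print(f"Alpha-power difference (first minus second offset reports): {diff:.3f}")
```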

Acknowledgements: This work was supported by the Swiss National Science Foundation (grant number 325130_204898, “Basics of visual processing: the first half second”; grant numbers PZ00P1_179988 and PZ00P1_179988/2)

Talk 5, 6:15 pm, 25.25

Serial Dependence Biases Realistic Skin Cancer Diagnosis

Zhihang Ren1, Xinyu Li1, Dana Pietralla2, Mauro Manassi3, Stella Yu1,4, David Whitney1; 1University of California, Berkeley, 2University of Cologne, 3University of Aberdeen, 4University of Michigan, Ann Arbor

Serial dependence is a perceptual phenomenon in which the visual system biases current representations toward recent visual history (Fischer & Whitney, 2014; Cicchini et al., 2014). Studies under both laboratory and naturalistic scenarios have shown that radiologists' perception is influenced by serial dependence, which could negatively impact diagnostic accuracy (Manassi et al., 2019; Manassi et al., 2021; Ren et al., 2022). However, those studies often suffer from a dearth of expert participants; typically, fewer than 20 experts are recruited per experiment. In this study, we analyzed a large volume of diagnostic data from a commercially available mobile app. All 7,798 skin cancer images were genuine, retrieved from either public databases or medical clients, and randomly selected and randomly ordered for each participant. In total, 756,001 diagnostic judgments from 1,137 medical students and residents were analyzed. We found an effect of serial dependence, such that judgments of the current image were pulled toward the observer's previous experience. Importantly, this effect displayed both feature tuning and temporal tuning, two of the diagnostic criteria of serial dependence. With respect to feature tuning, although image order was completely random, serial dependence was stronger when two consecutive medical images happened to be similar in severity (estimated by popularity vote). Using a deep-learning-based semantic similarity metric, we also found larger serial dependence effects when two skin cancer images were, by chance, semantically similar. Shorter inter-trial intervals also led to stronger serial dependence, indicating temporal tuning of the effect. This study suggests that serial dependence may negatively impact skin cancer diagnoses in realistic diagnostic scenarios. It also hints at possible methods by which these biases could be mitigated.
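
To make the serial dependence measure concrete, the sketch below (not the authors' analysis) computes a simple attraction index: the slope of the current judgment error against the difference between the previous and current stimulus. The variable names, the simulated observer, and the use of a linear slope rather than, for example, a derivative-of-Gaussian fit are simplifying assumptions.

```python
# Minimal sketch, not the authors' analysis: a simple attraction index for
# serial dependence. Variable names and the simulated data are illustrative.
import numpy as np

def serial_dependence_slope(rating, truth):
    """Slope of the current error vs. (previous minus current) stimulus value.
    A positive slope means judgments are pulled toward the previous image."""
    error = rating - truth
    delta_prev = truth[:-1] - truth[1:]          # previous minus current
    slope, _intercept = np.polyfit(delta_prev, error[1:], 1)
    return slope

# Simulated observer whose rating is pulled 20% toward the previous image's
# severity (severity on an arbitrary 0-1 scale).
rng = np.random.default_rng(2)
truth = rng.uniform(0, 1, 5000)
rating = truth + 0.2 * (np.roll(truth, 1) - truth) + rng.normal(0, 0.05, 5000)
print(f"Serial dependence slope: {serial_dependence_slope(rating, truth):.2f}")

# Feature and temporal tuning could then be probed by computing this slope
# separately for trial pairs that are similar vs. dissimilar (in rated severity
# or in a deep-network embedding) and for short vs. long inter-trial intervals.
```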

Acknowledgements: This work has been supported by National Institutes of Health (NIH) under grant #R01CA236793.

Talk 6, 6:30 pm, 25.26

Overestimated speed at short durations creates a novel motion-position dissociation

Pascal Mamassian1; 1Ecole Normale Supérieure and CNRS

The motion and position of an object are sometimes dissociated, most famously in the motion aftereffect, where a physically stationary test stimulus is simultaneously perceived as moving and as not changing position. The study of other phenomena, such as the motion-induced position shift or the curveball illusion, has shown that motion and position can indeed be dissociated but are not fully independent. Here we are interested in the initial position and the speed of a moving object. It is well known that the initial position of a moving object is mislocalized in its direction of motion (the Fröhlich effect). We manipulated the duration of a line moving at 4 deg/sec and measured both its perceived initial position and its perceived speed. We used forced-choice psychophysical discriminations against another line that was presented either simultaneously, stationary, and with varying spatial offsets (position task), or sequentially and with varying speeds (speed task). Not too surprisingly, when the duration of the movement was brief, the spatial bias in the initial position was also small. The magnitude of the spatial bias increased with motion duration and reached a plateau of 0.4 deg at about 200 msec. More surprisingly, there was a large overestimation of perceived speed at brief durations, up to 14 deg/sec for 50 msec durations. The magnitude of the speed overestimation decreased with motion duration and also vanished after about 200 msec. Interestingly, for durations under 200 msec, the spatial bias in perceived initial location was almost as large as the full trajectory of the object over its duration, suggesting that the moving object was perceived as almost stationary. Yet, for these same short durations, perceived speed was greatly overestimated. Taken together, these results highlight a paradox: the same moving object appears both almost static and moving very fast.
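
The numbers reported in the abstract make the paradox easy to quantify; the back-of-the-envelope calculation below (not the authors' model) compares the distance actually traversed in 50 msec with the distance implied by the overestimated speed.

```python
# Back-of-the-envelope illustration using the numbers reported in the abstract
# (not the authors' model).
true_speed = 4.0          # deg/s, physical speed of the line
perceived_speed = 14.0    # deg/s, reported speed estimate at ~50 ms durations
duration = 0.050          # s

physical_travel = true_speed * duration            # 0.2 deg actually traversed
speed_implied_travel = perceived_speed * duration  # 0.7 deg if the perceived
                                                   # speed were integrated over time
print(f"Physical travel: {physical_travel:.2f} deg; "
      f"travel implied by perceived speed: {speed_implied_travel:.2f} deg")
# Yet the Fröhlich-type bias in perceived initial position is reported to be
# nearly as large as the full 0.2 deg trajectory, so the perceived path length
# is close to zero even though perceived speed is overestimated ~3.5x.
```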

Acknowledgements: ANR grant ANR-22-CE28-0025 "IntegratedTime"