Time/Room: Friday, May 15, 2015, 12:00 – 2:00 pm, Pavilion
Organizer(s): Jan Drewes and David Melcher; Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto, Italy
Presenters: Huan Luo, Ian C. Fiebelkorn, Ayelet N. Landau, Jan Drewes, Rufin VanRullen
The majority of studies in vision science treat variability across trials as noise. However, there is a long-standing idea that oscillations in attention and other brain mechanisms lead to regular oscillations in perceptual and behavioral performance (Walsh, 1952; Callaway & Yeager, 1960; Harter, 1967). The idea of oscillations in perception and behavior has recently received renewed interest (Busch et al., 2009; Drewes & VanRullen, 2009; VanRullen et al., 2011; Landau & Fries, 2012; Fiebelkorn et al., 2013; Song et al., 2014). In light of this increased interest in the study of oscillations and their manifestations in perception and behavior, we wish to bring together a diverse group of researchers to present novel results and methods aimed at the measurement, understanding, and interpretation of these oscillatory effects.
Behavioral oscillations: hidden temporal dynamics in visual attention
Speaker: Huan Luo; Institute of Biophysics, Chinese Academy of Sciences
Neuronal oscillations are widely known to contribute to various aspects of cognition, but most of the associated evidence rests on post-hoc relationships between recorded brain dynamics and behavior. It remains largely unknown whether brain oscillations causally mediate behavior and can be manifested directly in behavioral performance. Interestingly, several recent psychophysical studies employing time-resolved measurements revealed rhythmic fluctuations (Landau & Fries, 2012; Fiebelkorn et al., 2013) and even neurophysiologically relevant spectrotemporal dynamics (Song et al., 2014) directly in behavior. In this talk, I will present our recent studies in which we examined the fine temporal dynamics of behavioral performance in several classical visual paradigms. Together, the results suggest that behavioral data, far from being too sluggish to reflect underlying neuronal dynamics, actually contain rich temporal structure (i.e., ‘behavioral oscillations’; Song et al., 2014) that parallels known neurophysiology. I propose that these new ‘behavioral oscillation’ findings, in combination with well-established neuronal oscillation work, speak to an oscillation-based temporal organization mechanism in visual attention.
Rhythmic sampling at both cued and uncued locations
Speaker: Ian C. Fiebelkorn; Neurophysiology of Attention and Perception Laboratory, Princeton University
The brain directs its limited processing resources through various selection mechanisms, broadly referred to as attention. Spatial selection, one such mechanism, is sometimes likened to a spotlight, continuously highlighting regions of the visual scene for preferential processing. Evidence suggests that the operation of this spotlight is linked, at least in part, to neural oscillations. In fact, rhythmic fluctuations attributable to spatial selection have been directly observed in behavior. When spatial selection is fixed at a single target location, visual-target detection oscillates at 8 Hz. When spatial selection is split between two equally likely target locations, visual-target detection at each location instead oscillates at 4 Hz, with peaks in detection alternating between the two locations. Landau and Fries (2012) proposed that these oscillatory patterns at 8 and 4 Hz are attributable to the same neural source, either sampling a single location or alternately sampling two locations. We recently observed both patterns during an experimental task that utilized three potential target locations. A cue (80% valid) indicated the location where a visual target was most likely to occur. As predicted, visual-target detection at the cued location oscillated at 8 Hz, suggesting that participants successfully deployed spatial selection. Yet visual-target detection at each of two uncued locations oscillated at 4 Hz, with peaks in detection alternating between the uncued locations. I will argue that these behavioral data, rather than reflecting a single neural source, support the existence of two attentional spotlights that concurrently sample the visual scene: one fixed spotlight that samples the most relevant location, and a second moving spotlight that rhythmically monitors less relevant locations. We have now replicated these behavioral findings in two monkeys, demonstrating that rhythmic sampling is consistent across primate species. We will next use electrophysiological recordings to investigate the neural sources underlying these behavioral oscillations.
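The division of the sampling rhythm described above can be illustrated with a toy simulation: a single 8 Hz sampler that alternates between two locations visits each location at 4 Hz. This is a purely didactic sketch, not the authors' analysis code:

```python
# Toy illustration: one 8 Hz sampler alternating between two
# locations yields a 4 Hz visit rate at each location.
import numpy as np

f_sampler = 8.0
sample_times = np.arange(0, 2, 1 / f_sampler)   # 2 s of 8 Hz samples
loc_a = sample_times[0::2]                      # even samples -> location A
loc_b = sample_times[1::2]                      # odd samples  -> location B

rate_a = len(loc_a) / 2.0   # visits per second at location A
rate_b = len(loc_b) / 2.0   # visits per second at location B
print(rate_a, rate_b)       # 4.0 visits/s at each location
```

The same arithmetic predicts roughly 2.7 Hz per location with three equally monitored locations, which is why the observed 8 Hz at the cued location argues for a separate, fixed spotlight.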
Distributed attention is implemented through theta-rhythmic gamma modulation
Speaker: Ayelet N. Landau; Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society and Hebrew University, Jerusalem
When subjects monitor a single spatial location, target detection depends on the pre-target phase of an ~8 Hz brain rhythm. When multiple locations are monitored, performance decrements suggest a division of the 8 Hz rhythm over the number of locations. This suggests that different locations are sequentially sampled. Indeed, when subjects monitor two locations, performance benefits alternate at a 4 Hz rhythm. These performance alternations followed a reset of attention to one location. Although resets are common and important events for attention, it is unknown whether, in the absence of resets, ongoing attention operates rhythmically. Here, we examined whether spatially specific attentional sampling can be revealed by ongoing pre-target brain rhythms. Specifically, visually induced gamma-band activity plays a role in spatial attention; we therefore hypothesized that performance can be predicted by a theta-rhythmic gamma modulation. Brain rhythms were assessed with MEG while subjects monitored bilateral grating stimuli for a unilateral target. The corresponding contralateral gamma-band responses were subtracted from each other to isolate spatially selective, target-related fluctuations. The resulting lateralized gamma activity (LGA) showed opposite 4 Hz phases prior to detected versus missed targets. The 4 Hz phase of pre-target LGA accounted for a 14% modulation in performance. These findings suggest that spatial attention is an ongoing theta-rhythmic sampling process, with each sampling cycle implemented through gamma-band synchrony. This extends previous findings by demonstrating that, in the case of distributed attention, gamma-band synchrony is shaped by the slower sampling rhythm that governs performance benefits.
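The core LGA computation described above can be sketched on synthetic data roughly as follows. The sampling rate, filter settings, and signal parameters are illustrative assumptions, not the authors' MEG pipeline:

```python
# Hedged sketch of the lateralized-gamma-activity (LGA) phase analysis
# on synthetic data: subtract contralateral gamma envelopes, band-pass
# around 4 Hz, and extract the instantaneous pre-target phase.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500                      # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)   # 2 s pre-target window

rng = np.random.default_rng(0)
# Synthetic gamma-band envelopes for the two hemispheres, modulated
# at 4 Hz in antiphase (the hypothesized theta-rhythmic sampling).
theta = 2 * np.pi * 4 * t
gamma_left = 1 + 0.5 * np.sin(theta) + 0.1 * rng.standard_normal(t.size)
gamma_right = 1 + 0.5 * np.sin(theta + np.pi) + 0.1 * rng.standard_normal(t.size)

# Lateralized gamma activity: the contralateral difference isolates
# spatially selective, target-related fluctuations.
lga = gamma_left - gamma_right

# Band-pass around 4 Hz, then take the instantaneous phase (Hilbert).
b, a = butter(2, [3, 5], btype="band", fs=fs)
lga_4hz = filtfilt(b, a, lga)
phase = np.angle(hilbert(lga_4hz))

# In the real analysis, the phase at a fixed pre-target latency would
# be compared between detected and missed trials across the experiment.
pre_target_phase = phase[int(1.5 * fs)]
```

The detected-versus-missed comparison itself requires trial-wise behavioral labels, which this single-trial sketch omits.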
Oscillations in behavioral performance for rapidly presented natural scenes
Speaker: Jan Drewes; Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto, Italy
Authors: Weina Zhu (Kunming Institute of Zoology, Chinese Academy of Sciences, Kunming, China), David Melcher (Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto, Italy)
Humans are capable of rapidly extracting object and scene category information from visual scenes, raising the question of how the visual system achieves this high-speed performance. Recently, several studies have demonstrated oscillatory effects in the behavioral outcome of low-level visual tasks, hinting at a possibly cyclic nature of visual processing. Here we present evidence that these oscillatory effects may also be manifest in a more complex target discrimination task using natural scenes as stimuli. In our experiment, a stream of neutral images (containing neither vehicles nor animals) was rapidly presented centrally at 20 ms/image. Embedded in this image stream were one or two presentations of a target image randomly selected from two categories (vehicles and animals), and subjects were asked to decide the target image category. On trials with two presentations, the ISI was varied systematically from 0 to 600 ms. At a varying time prior to the first target presentation, the screen background was flashed with the intent of creating a phase reset in the visual system. When sorting trials by the temporal distance between flash and first target presentation, a strong oscillation in behavioral performance emerged, peaking at 10 Hz, consistent with previous studies showing an oscillation in detection threshold. On trials with two targets, longer ISIs between the two target presentations led to reduced detection performance, implying a temporal integration window for object category discrimination. However, the ‘animal’ trials additionally exhibited a significant oscillatory component at around 5 Hz. These findings suggest that there are alternating optimal and non-optimal time periods during which stimulus repetition and integration can improve visual recognition, perhaps due to recurrent processing in complex visual scene perception.
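The trial-sorting analysis described above can be sketched on simulated data: bin trials by the flash-to-target lag, compute detection accuracy per bin, and Fourier-transform the resulting time course. All numbers below (trial counts, baseline accuracy, modulation depth) are illustrative assumptions:

```python
# Minimal sketch of a behavioral-oscillation analysis on simulated data:
# performance sorted by flash-to-target lag, then spectrally analyzed.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 20000
lag = rng.uniform(0, 0.6, n_trials)          # flash-to-target lag (s)
# Simulated detection probability oscillating at 10 Hz around 70%.
p_detect = 0.7 + 0.15 * np.cos(2 * np.pi * 10 * lag)
detected = rng.random(n_trials) < p_detect

# Sort trials into 20 ms lag bins and compute mean accuracy per bin.
bins = np.arange(0, 0.62, 0.02)
idx = np.digitize(lag, bins) - 1
acc = np.array([detected[idx == i].mean() for i in range(len(bins) - 1)])

# FFT of the demeaned accuracy time course (bin width 20 ms -> 50 Hz).
spec = np.abs(np.fft.rfft(acc - acc.mean()))
freqs = np.fft.rfftfreq(acc.size, d=0.02)
peak = freqs[spec.argmax()]   # expected to peak near 10 Hz
```

In practice the spectral peak would be tested against a surrogate distribution (e.g., shuffled lags) rather than read off directly.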
Speaker: Rufin VanRullen; Centre de Recherche Cerveau et Cognition (CerCo), Université de Toulouse (UPS) and CNRS, Toulouse, France
Various pieces of experimental evidence using both psychophysical and physiological (EEG) measurements have led us (and others) to conclude that at least certain aspects of visual perception and attention are intrinsically rhythmic. For example, in a variety of perceptual and attentional tasks, the trial-by-trial outcome was found to depend on the precise phase of pre-stimulus EEG oscillations in specific frequency bands (between 7 and 15 Hz). This suggests that there are “good” and “bad” phases for perception and attention; in other words, perception and attention proceed as a succession of cycles. These cycles are normally invisible, but in specific situations they can be directly experienced as an illusory flicker superimposed on the static scene. The brain oscillations that drive these perceptual cycles are not strictly spontaneous, but can also be modulated by visual stimulation. Therefore, by manipulating the structure of the stimulation sequence (e.g., white noise), it is possible to control the instantaneous phase of the relevant perceptual rhythm, and thereby ensure that a given target will be perceived (if presented at the proper phase) or will go unnoticed (at the opposite phase). Better still, by taking into account individual differences in oscillatory responses, we can even tailor specific stimulus sequences with an embedded target that can only be perceived by one observer, but not another – a form of “neuro-encryption”.
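The phase-control idea above can be sketched as follows: convolve a white-noise luminance sequence with an assumed oscillatory impulse-response kernel to predict the ongoing phase of the perceptual rhythm, then select target times predicted to fall at the “good” phase. The kernel shape, refresh rate, and phase criterion are all illustrative assumptions:

```python
# Hedged sketch: predicting "good" target times from a white-noise
# stimulus sequence via an assumed ~10 Hz oscillatory impulse response.
import numpy as np
from scipy.signal import hilbert

fs = 60                                   # display refresh rate in Hz (assumed)
t = np.arange(0, 3, 1 / fs)               # 3 s stimulus sequence
rng = np.random.default_rng(2)
luminance = rng.standard_normal(t.size)   # white-noise stimulation

# Assumed observer impulse response: a decaying 10 Hz oscillation.
tk = np.arange(0, 0.5, 1 / fs)
kernel = np.exp(-tk / 0.15) * np.cos(2 * np.pi * 10 * tk)

# Predicted oscillatory response and its instantaneous phase.
predicted = np.convolve(luminance, kernel)[: t.size]
phase = np.angle(hilbert(predicted))

# Frames predicted to be near the "good" phase (taken here as 0 rad);
# a target shown on these frames would be expected to be perceived.
good_frames = np.flatnonzero(np.abs(phase) < np.pi / 6)
```

Because the kernel is observer-specific, two observers with different kernels yield different `good_frames` for the same sequence, which is the intuition behind the “neuro-encryption” idea mentioned above.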