S2 – The Brain Correlates of Perception and Action: from Neural Activity to Behavior

Time/Room: Friday, May 19, 2017, 12:00 – 2:00 pm, Pavilion
Organizer(s): Simona Monaco, Center for Mind/Brain Sciences, University of Trento & Annalisa Bosco, Department of Pharmacy and Biotechnology, University of Bologna
Presenters: J. Douglas Crawford, Patrizia Fattori, Simona Monaco, Annalisa Bosco, Jody C. Culham

Symposium Description

In recent years, neuroimaging and neurophysiology have enabled cognitive neuroscience to identify numerous brain areas involved in sensorimotor integration for action. This research has revealed cortical and subcortical structures that work in coordination to allow accurate hand and eye movements. Visual information about objects in the environment is integrated into the motor plan through a cascade of events known as visuo-motor integration. These mechanisms allow us not only to extract the visual information relevant for action, but also to continuously update this information throughout action planning and execution. As our brain evolved to act on real objects in the natural environment, studying hand and eye movements in experimental situations that resemble the real world is critical for our understanding of the action system. This aspect has been relatively neglected in the cognitive sciences, mostly because of the challenges associated with the required experimental setups and technologies. This symposium provides a comprehensive view of the neural mechanisms underlying sensory-motor integration for the production of eye and hand movements in situations that are common in real life. The topics covered by the speakers encompass the visual as well as the motor and cognitive neurosciences, and are therefore relevant to junior and senior scientists specialized in any of these areas. We bring together researchers spanning macaque neurophysiology, human neuroimaging, and behavior. The combination of work using these cutting-edge techniques offers a unique insight into effects that are detected at the neuronal level, extended to neural populations, and translated into behavior.

There will be five speakers. Doug Crawford will address the neuronal mechanisms underlying perceptual-motor integration during head-unrestrained gaze shifts in the frontal eye field and superior colliculus of macaques. Patrizia Fattori will describe how the activity of neurons in the dorsomedial visual stream of macaques is modulated by gaze and hand movement direction as well as by properties of real objects. Jody Culham will illustrate the neural representations for visually guided actions and real objects in the human brain revealed by functional magnetic resonance imaging (fMRI). Simona Monaco will describe the neural mechanisms in the human brain underlying the influence of intended action on sensory processing and the involvement of the early visual cortex in action planning and execution. Annalisa Bosco will detail the behavioral aspects of the influence exerted by action on perception in human participants.

Visual-motor transformations at the neuronal level in the gaze system

Speaker: J. Douglas Crawford, Centre for Vision Research, York University
Additional Authors: AmirSaman Sajad, Center for Integrative & Cognitive Neuroscience, Vanderbilt University and Morteza Sadeh, Centre for Vision Research, York University

The fundamental question in perceptual-motor integration is how, and at what level, sensory signals become motor signals. Does this occur between brain areas, within brain areas, or even within individual neurons? Various training or cognitive paradigms have been combined with neurophysiology and/or neuroimaging to address this question, but the visuomotor transformations for ordinary gaze saccades remain elusive. To address these questions, we developed a method for fitting visual and motor response fields against various spatial models without any special training, based on trial-to-trial variations in behavior (DeSouza et al. 2011). More recently, we have used this approach to track visuomotor transformations through time. We find that superior colliculus and frontal eye field visual responses encode target direction, whereas their motor responses encode final gaze position relative to initial eye orientation (Sajad et al. 2015; Sadeh et al. 2016). This occurs not only between neuronal populations but also within individual visuomotor cells. When a memory delay is imposed, a gradual transition through intermediate codes is observed (perhaps due to an imperfect memory loop), with a further ‘leap’ toward gaze motor coding in the final memory-motor transformation (Sajad et al. 2016). However, we found a similar spatiotemporal transition even within the brief burst of neural activity that accompanies a reactive, visually evoked saccade. These data suggest that visuomotor transformations are a network phenomenon, simultaneously observable at the level of individual neurons and distributed across different neuronal populations and structures.
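
As an illustration of this kind of model comparison (not the authors' actual pipeline), the Python sketch below fits a one-dimensional Gaussian response field to a simulated neuron's trial-by-trial firing in two candidate spatial frames, target direction versus final gaze position, and prefers the frame with the smaller fitting residual. The synthetic data, parameter values, and variable names are illustrative assumptions.

```python
# Minimal sketch (synthetic data): compare how well two candidate spatial
# models -- target direction vs. final gaze position -- explain a neuron's
# trial-by-trial firing, using least-squares fits of a Gaussian response
# field in each coordinate frame.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Synthetic trials: target direction (deg) and final gaze position (deg).
# Variable gaze errors dissociate the two variables from trial to trial.
n_trials = 200
target = rng.uniform(-30, 30, n_trials)
gaze = target * 0.9 + rng.normal(0, 3, n_trials)

def gaussian_field(x, amp, center, width, baseline):
    """1-D Gaussian response field (spikes/s) over a spatial variable x."""
    return baseline + amp * np.exp(-0.5 * ((x - center) / width) ** 2)

# Simulate a 'motor' neuron whose firing follows final gaze position.
rate = gaussian_field(gaze, amp=50, center=10, width=15, baseline=5)
spikes = rng.poisson(rate)

def fit_residual(x, y):
    """Fit the Gaussian field to (x, y) and return the residual sum of squares."""
    p0 = [y.max() - y.min(), x[np.argmax(y)], 10.0, y.min()]
    params, _ = curve_fit(gaussian_field, x, y, p0=p0, maxfev=10000)
    return np.sum((y - gaussian_field(x, *params)) ** 2)

rss_target = fit_residual(target, spikes)
rss_gaze = fit_residual(gaze, spikes)
best = "final gaze position" if rss_gaze < rss_target else "target direction"
print(f"RSS target model: {rss_target:.1f}, RSS gaze model: {rss_gaze:.1f}")
print(f"Preferred spatial model for this neuron: {best}")
```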

Neurons for eye and hand action in the monkey medial posterior parietal cortex

Speaker: Patrizia Fattori, University of Bologna
Additional Authors: Patrizia Fattori, Rossella Breveglieri, and Claudio Galletti, Department of Pharmacy and Biotechnology, University of Bologna

Over the last decades, several components of the visual control of eye and hand movements have been disentangled by studying single neurons in the brain of awake macaque monkeys. In this presentation, particular attention will be given to the influence of the direction of gaze upon the reaching activity of neurons in the dorsomedial visual stream. We recorded from the caudal part of the medial posterior parietal cortex, finding neurons sensitive to the direction and amplitude of arm reaching actions. The reaching activity of these neurons was influenced by the direction of gaze, with some neurons preferring foveal reaching and others peripheral reaching. Manipulations of eye/target position and of hand position showed that reaching activity could be expressed in an eye-centered, a head-centered, or a mixed frame of reference, depending on the neuron. We also found neurons modulated by the visual features of real objects, and neurons additionally modulated by grasping movements, such as wrist orientation and grip formation. Thus, the entire neural machinery for encoding eye and hand actions appears to be hosted in the dorsomedial visual stream. This machinery takes part in the sequence of visuomotor transformations required to encode many aspects of reach-to-grasp actions.
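
To make the reference-frame logic concrete, the following Python sketch (synthetic data, not the recorded neurons) compares a simulated cell's reach tuning curves across two gaze positions when the curves are expressed in head-centered versus eye-centered coordinates; the frame in which the curves align best across gaze shifts is taken as the neuron's preferred frame. All numbers are illustrative assumptions.

```python
# Minimal sketch: classify a reach-related neuron's reference frame by
# comparing its tuning curves across two gaze positions, expressed either in
# head-centered (target position on the screen) or eye-centered (target
# position relative to gaze) coordinates.
import numpy as np

rng = np.random.default_rng(1)
targets = np.arange(-30, 31, 10)      # reach target positions (deg, head-centered)
gaze_positions = [-15, 15]            # two fixation positions (deg)

def tuning(x, center=5, width=12, amp=40, base=5):
    """Gaussian tuning curve (spikes/s) over a spatial variable x."""
    return base + amp * np.exp(-0.5 * ((x - center) / width) ** 2)

# Simulate an eye-centered neuron: firing depends on target minus gaze.
rates = {g: tuning(targets - g) + rng.normal(0, 2, targets.size)
         for g in gaze_positions}

def alignment(frame):
    """Correlation of the two tuning curves when plotted in the given frame."""
    if frame == "head":
        curves = [rates[g] for g in gaze_positions]      # common axis: target
    else:  # eye-centered: resample onto a common target-minus-gaze axis
        axis = np.arange(-45, 46, 10)
        curves = [np.interp(axis, targets - g, rates[g]) for g in gaze_positions]
    return np.corrcoef(curves[0], curves[1])[0, 1]

scores = {frame: alignment(frame) for frame in ("head", "eye")}
print(scores, "-> preferred frame:", max(scores, key=scores.get))
```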

The role of the early visual cortex in action

Speaker: Simona Monaco, Center for Mind/Brain Sciences, University of Trento
Additional Authors: Simona Monaco, Center for Mind/Brain Sciences, University of Trento; Doug Crawford, Centre for Vision Research, York University; Luca Turella, Center for Mind/Brain Sciences, University of Trento; Jody Culham, Brain and Mind Institute, University of Western Ontario

Functional magnetic resonance imaging has recently shown that intended action modulates the sensory processing of object orientation in areas of the action network in the human brain. In particular, intended actions can be decoded from the early visual cortex using multivoxel pattern analysis before the movements are initiated, regardless of whether the target object is visible or not. In addition, the early visual cortex is re-recruited during actions in the dark towards stimuli that have previously been seen. These results suggest three main points. First, the action-driven modulation of sensory processing occurs at the neural level in a network of areas that includes the early visual cortex. Second, the role of the early visual cortex goes well beyond the processing of sensory information for perception, and this area might be the target of reentrant feedback for sensory-motor integration. Third, the early visual cortex shows action-driven modulation during both action planning and execution, suggesting a continuous exchange of information with higher-order visual-motor areas for the production of a motor output.
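
As a rough illustration of the decoding approach described above (not the study's actual analysis), the Python sketch below trains a linear classifier on synthetic planning-phase voxel patterns from a region of interest and estimates grasp-versus-reach decoding accuracy with leave-one-run-out cross-validation. The run structure, number of voxels, and effect size are assumptions.

```python
# Minimal multivoxel pattern analysis sketch (synthetic data): decode intended
# action (grasp vs. reach) from planning-phase voxel patterns in a region of
# interest, using a linear SVM and leave-one-run-out cross-validation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(2)
n_runs, trials_per_run, n_voxels = 8, 20, 100

# Synthetic single-trial beta patterns: a weak action-specific signal plus noise.
labels = np.tile(np.repeat([0, 1], trials_per_run // 2), n_runs)  # 0=grasp, 1=reach
runs = np.repeat(np.arange(n_runs), trials_per_run)
signal = rng.normal(0, 1, (2, n_voxels))          # one pattern per intended action
X = signal[labels] * 0.3 + rng.normal(0, 1, (labels.size, n_voxels))

# Leave-one-run-out cross-validation keeps training and test runs independent.
scores = cross_val_score(SVC(kernel="linear"), X, labels,
                         groups=runs, cv=LeaveOneGroupOut())
print(f"Decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```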

The influence of action execution on object size perception

Speaker: Annalisa Bosco, Department of Pharmacy and Biotechnology, University of Bologna
Additional Authors: Annalisa Bosco, Department of Pharmacy and Biotechnology, University of Bologna; Patrizia Fattori, Department of Pharmacy and Biotechnology, University of Bologna

When performing an action, our perception is focused on the visual properties of the object that enable us to execute the action successfully. The motor system, however, is also able to influence perception, yet only a few studies have reported evidence for hand-action-induced modifications of visual perception. Here, we studied feature-specific perceptual modulation before and after a reaching or grasping action. Two groups of subjects were instructed to either grasp or reach towards bars of different sizes and, before and after the action, to perform a size perception task with manual and verbal reports. Each group was tested in two experimental conditions: no prior knowledge of action type, in which subjects did not know the upcoming type of movement, and prior knowledge of action type, in which they were aware of the upcoming type of movement. In both manual and verbal size reports, we found that size perception was significantly modified after a grasping movement. Additionally, this modification was enhanced when subjects knew in advance the type of movement to be executed in the subsequent phase of the task. These data suggest that both knowledge of the action type and execution of the action shape the perception of object properties.
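
For illustration only (the numbers below are invented, not the reported results), a minimal Python sketch of the pre/post comparison could look like the following: per-subject size estimates collected before and after a grasping or reaching block are compared with paired t-tests.

```python
# Minimal sketch (synthetic numbers): compare size estimates of the same bars
# before and after a grasping vs. a reaching block with paired t-tests.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_subjects, true_size = 20, 40.0          # hypothetical bar size in mm

def size_reports(shift_mm):
    """Per-subject mean size estimates, biased by shift_mm after the action."""
    pre = true_size + rng.normal(0, 2, n_subjects)
    post = true_size + shift_mm + rng.normal(0, 2, n_subjects)
    return pre, post

pre_grasp, post_grasp = size_reports(shift_mm=1.5)   # grasping biases estimates
pre_reach, post_reach = size_reports(shift_mm=0.2)   # reaching: little change

for name, (pre, post) in {"grasp": (pre_grasp, post_grasp),
                          "reach": (pre_reach, post_reach)}.items():
    t, p = stats.ttest_rel(post, pre)
    print(f"{name}: mean change = {np.mean(post - pre):+.2f} mm, "
          f"t = {t:.2f}, p = {p:.3f}")
```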

Neuroimaging reveals the human neural representations for visually guided grasping of real objects and pictures

Speaker: Jody C. Culham, Brain and Mind Institute, University of Western Ontario
Additional Authors: Jody C. Culham, University of Western Ontario; Sara Fabbri, Radboud University Nijmegen; Jacqueline C. Snow, University of Nevada, Reno; Erez Freud, Carnegie Mellon University

Neuroimaging, particularly functional magnetic resonance imaging (fMRI), has revealed many human brain areas that are involved in the processing of visual information for the planning and guidance of actions. One area of particular interest is the anterior intraparietal sulcus (aIPS), which is thought to play a key role in processing information about object shape for the visual control of grasping. However, much fMRI research has relied on artificial stimuli, such as two-dimensional photos, and artificial actions, such as pantomimed grasping. Recent fMRI studies from our lab have used representational similarity analysis on the patterns of fMRI activation from brain areas such as aIPS to infer neural coding in participants performing real actions upon real objects. This research has revealed that visual features of the object (particularly elongation) and the type of grasp (including the number of digits and the precision required) are coded in aIPS and other regions. Moreover, this work has suggested that these neural representations are affected by the realness of the object, particularly during grasping. Taken together, these results highlight the value of using more ecologically valid paradigms to study sensorimotor control.
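
The Python sketch below gives a minimal, hypothetical example of the representational similarity analysis logic: condition-wise activation patterns from an ROI such as aIPS are converted into a representational dissimilarity matrix (RDM) and correlated with candidate model RDMs built from object elongation and grip type. The condition set, patterns, and model definitions are illustrative assumptions, not the study's design.

```python
# Minimal representational similarity analysis sketch (synthetic patterns):
# compare a neural RDM from an ROI against two candidate model RDMs.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(4)

# 8 conditions: 4 objects (2 elongated, 2 stubby) x 2 grips (precision, whole-hand)
elongation = np.array([1, 1, 0, 0, 1, 1, 0, 0])
grip       = np.array([0, 0, 0, 0, 1, 1, 1, 1])
n_voxels = 150

# Synthetic ROI patterns carrying mostly a grip-related signal plus noise.
patterns = (0.8 * np.outer(grip, rng.normal(0, 1, n_voxels))
            + 0.3 * np.outer(elongation, rng.normal(0, 1, n_voxels))
            + rng.normal(0, 1, (8, n_voxels)))

neural_rdm = pdist(patterns, metric="correlation")   # 1 - r between conditions

def model_rdm(feature):
    """Dissimilarity = 1 if two conditions differ on the feature, else 0."""
    return pdist(feature[:, None], metric="cityblock")

for name, feature in (("elongation", elongation), ("grip type", grip)):
    rho, _ = spearmanr(neural_rdm, model_rdm(feature))
    print(f"RDM correlation with {name} model: rho = {rho:.2f}")
```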
