1. Kang JU, Mooshagian E, Snyder LH. Functional organization of posterior parietal cortex circuitry based on inferred information flow. Cell Rep 2024; 43:114028. PMID: 38581681; PMCID: PMC11090617; DOI: 10.1016/j.celrep.2024.114028.
Abstract
Many studies infer the role of neurons by asking what information can be decoded from their activity or by observing the consequences of perturbing their activity. An alternative approach is to consider information flow between neurons. We applied this approach to the parietal reach region (PRR) and the lateral intraparietal area (LIP) in posterior parietal cortex. Two complementary methods imply that across a range of reaching tasks, information flows primarily from PRR to LIP. This indicates that during a coordinated reach task, LIP has minimal influence on PRR and rules out the idea that LIP forms a general-purpose spatial processing hub for action and cognition. Instead, we conclude that PRR and LIP operate in parallel to plan arm and eye movements, respectively, with asymmetric interactions that likely support eye-hand coordination. Similar methods can be applied to other areas to infer their functional relationships from information flow.
Affiliation(s)
- Jung Uk Kang: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- Eric Mooshagian: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- Lawrence H Snyder: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
2. Local field potentials in the parietal reach region reveal mechanisms of bimanual coordination. Nat Commun 2021; 12:2514. PMID: 33947840; PMCID: PMC8096826; DOI: 10.1038/s41467-021-22701-3.
Abstract
Primates use their arms in complex ways that frequently require coordination between the two arms. Yet the planning of bimanual movements has not been well-studied. We recorded spikes and local field potentials (LFP) from the parietal reach region (PRR) in both hemispheres simultaneously while monkeys planned and executed unimanual and bimanual reaches. From analyses of interhemispheric LFP-LFP and spike-LFP coherence, we found that task-specific information is shared across hemispheres in a frequency-specific manner. This shared information could arise from common input or from direct communication. The population average unit activity in PRR, representing PRR output, encodes only planned contralateral arm movements while beta-band LFP power, a putative PRR input, reflects the pattern of planned bimanual movement. A parsimonious interpretation of these data is that PRR integrates information about the movement of the left and right limbs, perhaps in service of bimanual coordination.
3. The Caudal Part of Putamen Represents the Historical Object Value Information. J Neurosci 2018; 39:1709-1719. PMID: 30573645; DOI: 10.1523/jneurosci.2534-18.2018.
Abstract
The basal ganglia, especially the circuits originating from the putamen, are essential for controlling normal body movements. Notably, the putamen receives inputs not only from motor cortical areas but also from multiple sensory cortices. However, how these sensory signals are processed in the putamen remains unclear. We recorded the activity of putative medium spiny neurons in the caudal part of the putamen while the monkey viewed many fractal objects. We found many neurons that responded to these objects, mostly in the ventral region. We called this region "putamen tail" (PUTt), as it is dorsally adjacent to the "caudate tail" (CDt). Although PUTt and CDt are mostly separated by a thin layer of white matter, their neurons shared several features. Almost all of them had receptive fields in the contralateral hemifield. Moreover, their responses were object selective (i.e., variable across objects). The object selectivity was higher in the ventral region (i.e., CDt > PUTt). Some neurons above PUTt, which we called the caudal-dorsal putamen (cdPUT), also responded to objects, but less selectively than PUTt. Next, we examined whether these visual neurons changed their responses based on reward outcome. We found that many neurons encoded the values of many objects based on long-term memory, but not based on short-term memory. Such stable value responses were stronger in PUTt and CDt than in cdPUT. These results suggest that PUTt, together with CDt, controls saccade/attention among objects with different historical values, and may control other motor actions as well.

SIGNIFICANCE STATEMENT: Although the putamen receives inputs not only from motor cortical areas but also from sensory cortical areas, how these sensory signals are processed remains unclear. Here we found that neurons in the caudal-ventral part of the putamen (putamen tail) process visual information, including spatial and object features. These neurons discriminate many objects, first by their visual features and later by their reward values as well. Importantly, the value discrimination was based on long-term memory, not on short-term memory. These results suggest that the putamen tail controls saccade/attention among objects with different historical values and might control other motor actions as well.
4. de Haan MJ, Brochier T, Grün S, Riehle A, Barthélemy FV. Real-time visuomotor behavior and electrophysiology recording setup for use with humans and monkeys. J Neurophysiol 2018; 120:539-552. PMID: 29718806; PMCID: PMC6139457; DOI: 10.1152/jn.00262.2017.
Abstract
Large-scale network dynamics across multiple visuomotor areas are of great interest for the study of eye-hand coordination in both humans and monkeys. Exploring this requires a setup that allows precise tracking of eye and hand movements, and ideally one that can also generate mechanical or visual perturbations of hand trajectories, so that eye-hand coordination can be studied under a variety of conditions. Simple solutions satisfy these requirements for hand movements performed in the horizontal plane while visual stimuli and hand feedback are presented in the vertical plane. However, this spatial dissociation requires cognitive rules for eye-hand coordination that differ from those used when eye and hand movements occur in the same space, as in most natural conditions. Here we present an innovative solution for the precise tracking of eye and hand movements in a single reference frame. Importantly, our solution allows behavioral explorations under normal and perturbed conditions in both humans and monkeys. It is based on the integration of two noninvasive, commercially available systems to achieve online control and synchronous recording of eye (EyeLink) and hand (KINARM) positions during interactive visuomotor tasks. We also present an eye calibration method, compatible with different eye trackers, that compensates for nonlinearities caused by the system's geometry. Our setup monitors the two effectors in real time with high spatial and temporal resolution and simultaneously outputs behavioral and neuronal data to an external data acquisition system using a common data format.

NEW & NOTEWORTHY We developed a new setup for studying eye-hand coordination in humans and monkeys that monitors the two effectors in real time in a common reference frame. Our eye calibration method allows us to track gaze positions relative to visual stimuli presented in the horizontal workspace of the hand movements. This method compensates for nonlinearities caused by the system's geometry and transforms kinematic signals from the eye tracker into the same coordinate system as the hand and targets.
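The calibration described in this abstract, mapping raw eye-tracker output into the hand's workspace coordinates while absorbing geometric nonlinearities, is commonly implemented as a low-order polynomial regression over known calibration targets. The sketch below is illustrative only: the quadratic model and all function names are assumptions for exposition, not the authors' software.

```python
import numpy as np

def design_matrix(pts):
    # Quadratic polynomial features for a 2-D calibration fit:
    # one row [1, x, y, x*y, x^2, y^2] per gaze sample.
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_gaze_calibration(raw_gaze, workspace_targets):
    """Least-squares fit of a mapping from raw eye-tracker samples
    (N x 2) to known target positions in the hand workspace (N x 2)."""
    A = design_matrix(raw_gaze)
    coef, *_ = np.linalg.lstsq(A, workspace_targets, rcond=None)
    return coef  # shape (6, 2): one column of coefficients per axis

def apply_calibration(coef, raw_gaze):
    # Re-express gaze samples in workspace coordinates.
    return design_matrix(np.atleast_2d(raw_gaze)) @ coef
```

A quadratic model like this can absorb smooth geometric distortions (e.g., from a tilted camera or a flat workspace viewed obliquely); stronger distortions would call for higher-order terms or a per-region piecewise fit.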
Affiliation(s)
- Marcel Jan de Haan: Institut de Neurosciences de la Timone, Centre National de la Recherche Scientifique-Aix-Marseille Université, UMR7289, Marseille, France; Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA Brain Institute I (INM-10), Forschungszentrum Jülich, Jülich, Germany
- Thomas Brochier: Institut de Neurosciences de la Timone, Centre National de la Recherche Scientifique-Aix-Marseille Université, UMR7289, Marseille, France
- Sonja Grün: Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA Brain Institute I (INM-10), Forschungszentrum Jülich, Jülich, Germany; RIKEN Brain Science Institute, Hirosawa, Wako-Shi, Saitama, Japan; Theoretical Systems Neurobiology, RWTH Aachen University, Aachen, Germany
- Alexa Riehle: Institut de Neurosciences de la Timone, Centre National de la Recherche Scientifique-Aix-Marseille Université, UMR7289, Marseille, France; Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA Brain Institute I (INM-10), Forschungszentrum Jülich, Jülich, Germany
- Frédéric V Barthélemy: Institut de Neurosciences de la Timone, Centre National de la Recherche Scientifique-Aix-Marseille Université, UMR7289, Marseille, France; Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA Brain Institute I (INM-10), Forschungszentrum Jülich, Jülich, Germany
5. Mooshagian E, Wang C, Holmes CD, Snyder LH. Single Units in the Posterior Parietal Cortex Encode Patterns of Bimanual Coordination. Cereb Cortex 2018; 28:1549-1567. PMID: 28369392; PMCID: PMC5907348; DOI: 10.1093/cercor/bhx052.
Abstract
Bimanual coordination is critical for a broad array of behaviors. Drummers, for example, must carefully coordinate movements of their 2 arms, sometimes beating on the same drum and sometimes on different ones. While coordinated behavior is well-studied, the early stages of planning are not well understood. In the parietal reach region (PRR) of the posterior parietal cortex (PPC), the presence of neurons that modulate when either arm moves by itself has been taken as evidence for a role in bimanual coordination. To test this notion, we recorded neurons during both unilateral and bimanual movements. We find that the activity that precedes an ipsilateral arm movement is primarily a sensory response to a target in the neuron's visual receptive field and not a plan to move the ipsilateral arm. In contrast, the activity that precedes a contralateral arm movement is the sum of a movement plan plus a sensory response. Despite not coding ipsilateral arm movements, about half of neurons discriminate between different patterns of bimanual movements. These results provide direct evidence that PRR neurons represent bimanual reach plans, and suggest that bimanual coordination originates in the sensory-to-motor processing stream prior to the motor cortex, within the PPC.
Affiliation(s)
- Eric Mooshagian: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- Cunguo Wang: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA
- Charles D Holmes: Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA
- Lawrence H Snyder: Department of Neuroscience, Washington University School of Medicine, St. Louis, MO 63110, USA; Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA
6. Spatial eye-hand coordination during bimanual reaching is not systematically coded in either LIP or PRR. Proc Natl Acad Sci U S A 2018; 115:E3817-E3826. PMID: 29610356; PMCID: PMC5910835; DOI: 10.1073/pnas.1718267115.
Abstract
When we reach for something, we also look at it. If we reach for two objects at once, one with each hand, we look first at one and then the other. It is not known which brain areas underlie this coordination. We studied two parietal areas known to be involved in eye and arm movements. Neither area was sensitive to the order in which the targets were looked at. This implies that coordinated saccades are driven by downstream areas and not by the parietal cortex as is commonly assumed.

We often orient to where we are about to reach. Spatial and temporal correlations in eye and arm movements may depend on the posterior parietal cortex (PPC). Spatial representations of saccade and reach goals preferentially activate cells in the lateral intraparietal area (LIP) and the parietal reach region (PRR), respectively. With unimanual reaches, eye and arm movement patterns are highly stereotyped. This makes it difficult to study the neural circuits involved in coordination. Here, we employ bimanual reaching to two different targets. Animals naturally make a saccade first to one target and then the other, resulting in different patterns of limb–gaze coordination on different trials. Remarkably, neither LIP nor PRR cells code which target the eyes will move to first. These results suggest that the parietal cortex plays at best only a permissive role in some aspects of eye–hand coordination and makes the role of LIP in saccade generation unclear.
7. Li Y, Wang Y, Cui H. Eye-hand coordination during flexible manual interception of an abruptly appearing, moving target. J Neurophysiol 2018; 119:221-234. DOI: 10.1152/jn.00476.2017.
Abstract
As a vital skill in an evolving world, interception of moving objects relies on accurate prediction of target motion. In natural circumstances, active gaze shifts often accompany hand movements when exploring targets of interest, but how eye and hand movements are coordinated during manual interception and their dependence on visual prediction remain unclear. Here, we trained gaze-unrestrained monkeys to manually intercept targets appearing at random locations and circularly moving with random speeds. We found that well-trained animals were able to intercept the targets with adequate compensation for both sensory transmission and motor delays. Before interception, the animals' gaze followed the targets with adequate compensation for the sensory delay, but not for extra target displacement during the eye movements. Both hand and eye movements were modulated by target kinematics, and their reaction times were correlated. Moreover, retinal errors and reaching errors were correlated across different stages of reach execution. Our results reveal eye-hand coordination during manual interception, yet the eye and hand movements may show different levels of prediction based on the task context.

NEW & NOTEWORTHY Here we studied the eye-hand coordination of monkeys during flexible manual interception of a moving target. Eye movements were untrained and not explicitly associated with reward. We found that the initial saccades toward the moving target adequately compensated for sensory transmission delays, but not for extra target displacement, whereas the reaching arm movements fully compensated for sensorimotor delays, suggesting that the mode of eye-hand coordination strongly depends on behavioral context.
Affiliation(s)
- Yuhui Li: Brain and Behavior Discovery Institute, Medical College of Georgia, Augusta University, Augusta, Georgia
- Yong Wang: Brain and Behavior Discovery Institute, Medical College of Georgia, Augusta University, Augusta, Georgia; State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, China
- He Cui: Brain and Behavior Discovery Institute, Medical College of Georgia, Augusta University, Augusta, Georgia; CAS Key Laboratory of Primate Neurobiology, Shanghai, China; CAS Center for Excellence in Brain Science and Intelligent Technology, Shanghai, China; Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China