1
Ziemba CM, Goris RLT, Stine GM, Perez RK, Simoncelli EP, Movshon JA. Neuronal and Behavioral Responses to Naturalistic Texture Images in Macaque Monkeys. J Neurosci 2024; 44:e0349242024. [PMID: 39197942 PMCID: PMC11484546 DOI: 10.1523/jneurosci.0349-24.2024]
Abstract
The visual world is richly adorned with texture, which can serve to delineate important elements of natural scenes. In anesthetized macaque monkeys, selectivity for the statistical features of natural texture is weak in V1, but substantial in V2, suggesting that neuronal activity in V2 might directly support texture perception. To test this, we investigated the relation between single cell activity in macaque V1 and V2 and simultaneously measured behavioral judgments of texture. We generated stimuli along a continuum between naturalistic texture and phase-randomized noise and trained two macaque monkeys to judge whether a sample texture more closely resembled one or the other extreme. Analysis of responses revealed that individual V1 and V2 neurons carried much less information about texture naturalness than behavioral reports. However, the sensitivity of V2 neurons, especially those preferring naturalistic textures, was significantly closer to that of behavior compared with V1. The firing of both V1 and V2 neurons predicted perceptual choices in response to repeated presentations of the same ambiguous stimulus in one monkey, despite low individual neural sensitivity. However, neither population predicted choice in the second monkey. We conclude that neural responses supporting texture perception likely continue to develop downstream of V2. Further, combined with neural data recorded while the same two monkeys performed an orientation discrimination task, our results demonstrate that choice-correlated neural activity in early sensory cortex is unstable across observers and tasks, untethered from neuronal sensitivity, and therefore unlikely to directly reflect the formation of perceptual decisions.
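The stimulus continuum described above runs from naturalistic texture to phase-randomized (spectrally matched) noise. The snippet below is a minimal illustrative sketch, not the authors' stimulus-generation code (which relies on a texture-synthesis model); it only shows the generic idea of phase randomization, in which the Fourier amplitude spectrum of an image is kept while its phases are scrambled. All variable names are hypothetical.

```python
import numpy as np

def phase_randomize(image, seed=None):
    """Return an image with the same amplitude spectrum but scrambled phases.
    Taking the real part of the inverse FFT is a common shortcut here."""
    rng = np.random.default_rng(seed)
    f = np.fft.fft2(image)
    random_phase = np.exp(1j * rng.uniform(0, 2 * np.pi, size=f.shape))
    return np.real(np.fft.ifft2(np.abs(f) * random_phase))

def mix_along_continuum(texture, noise, w):
    """Crude linear interpolation: w=1 gives the texture, w=0 the noise."""
    return w * texture + (1.0 - w) * noise

# Example with a synthetic stand-in for a texture image
texture = np.random.default_rng(0).standard_normal((128, 128))
noise = phase_randomize(texture, seed=1)
ambiguous_sample = mix_along_continuum(texture, noise, w=0.5)
```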
Affiliation(s)
- Corey M Ziemba
- Center for Neural Science, New York University, New York, NY
- Robbe L T Goris
- Center for Neural Science, New York University, New York, NY
- Gabriel M Stine
- Center for Neural Science, New York University, New York, NY
- Richard K Perez
- Center for Neural Science, New York University, New York, NY
- Eero P Simoncelli
- Center for Neural Science, New York University, New York, NY
- Center for Computational Neuroscience, Flatiron Institute, New York, NY
2
Laamerad P, Liu LD, Pack CC. Decision-related activity and movement selection in primate visual cortex. Sci Adv 2024; 10:eadk7214. [PMID: 38809984 PMCID: PMC11135405 DOI: 10.1126/sciadv.adk7214]
Abstract
Fluctuations in the activity of sensory neurons often predict perceptual decisions. This connection can be quantified with a metric called choice probability (CP), and there is a longstanding debate about whether CP reflects a causal influence on decisions or an echo of decision-making activity elsewhere in the brain. Here, we show that CP can reflect a third variable, namely, the movement used to indicate the decision. In a standard visual motion discrimination task, neurons in the middle temporal (MT) area of primate cortex responded more strongly during trials that involved a saccade toward their receptive fields. This variability accounted for much of the CP observed across the neuronal population, and it arose through training. Moreover, pharmacological inactivation of MT biased behavioral responses away from the corresponding visual field locations. These results demonstrate that training on a task with fixed sensorimotor contingencies introduces movement-related activity in sensory brain regions and that this plasticity can shape the neural circuitry of perceptual decision-making.
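Choice probability (CP), the metric referred to here, is conventionally computed as the area under the ROC curve comparing a neuron's response distributions on trials grouped by the animal's choice. The sketch below is an illustrative implementation under assumed inputs (per-trial firing rates already sorted by choice), not code from the study.

```python
import numpy as np

def choice_probability(rates_pref, rates_null):
    """Area under the ROC curve comparing responses grouped by choice:
    P(rate on a 'preferred-choice' trial > rate on a 'null-choice' trial),
    with ties counted as 0.5 (equivalent to a Mann-Whitney U statistic)."""
    r1 = np.asarray(rates_pref, float)[:, None]
    r2 = np.asarray(rates_null, float)[None, :]
    return (r1 > r2).mean() + 0.5 * (r1 == r2).mean()

# Toy example: slightly elevated firing before the preferred choice gives CP > 0.5
rng = np.random.default_rng(0)
cp = choice_probability(rng.poisson(22, 100), rng.poisson(20, 100))
print(round(cp, 3))
```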
Affiliation(s)
- Pooya Laamerad
- Department of Neurology and Neurosurgery, Montreal Neurological Institute, McGill University, Montreal, Canada
- Liu D. Liu
- Department of Neurology and Neurosurgery, Montreal Neurological Institute, McGill University, Montreal, Canada
3
Hacohen-Brown S, Gilboa-Schechtman E, Zaidel A. Modality-specific effects of threat on self-motion perception. BMC Biol 2024; 22:120. [PMID: 38783286 PMCID: PMC11119305 DOI: 10.1186/s12915-024-01911-3]
Abstract
BACKGROUND: Threat and individual differences in threat-processing bias perception of stimuli in the environment. Yet, their effect on perception of one's own (body-based) self-motion in space is unknown. Here, we tested the effects of threat on self-motion perception using a multisensory motion simulator with concurrent threatening or neutral auditory stimuli.
RESULTS: Strikingly, threat had opposite effects on vestibular and visual self-motion perception, leading to overestimation of vestibular, but underestimation of visual self-motions. Trait anxiety tended to be associated with an enhanced effect of threat on estimates of self-motion for both modalities.
CONCLUSIONS: Enhanced vestibular perception under threat might stem from shared neural substrates with emotional processing, whereas diminished visual self-motion perception may indicate that a threatening stimulus diverts attention away from optic flow integration. Thus, threat induces modality-specific biases in everyday experiences of self-motion.
Affiliation(s)
- Shira Hacohen-Brown
- Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, 5290002, Ramat Gan, Israel
- Eva Gilboa-Schechtman
- Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, 5290002, Ramat Gan, Israel
- Department of Psychology, Bar-Ilan University, 5290002, Ramat Gan, Israel
- Adam Zaidel
- Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, 5290002, Ramat Gan, Israel
4
Ziemba CM, Goris RLT, Stine GM, Perez RK, Simoncelli EP, Movshon JA. Neuronal and behavioral responses to naturalistic texture images in macaque monkeys. bioRxiv 2024:2024.02.22.581645. [PMID: 38464304 PMCID: PMC10925125 DOI: 10.1101/2024.02.22.581645]
Abstract
The visual world is richly adorned with texture, which can serve to delineate important elements of natural scenes. In anesthetized macaque monkeys, selectivity for the statistical features of natural texture is weak in V1, but substantial in V2, suggesting that neuronal activity in V2 might directly support texture perception. To test this, we investigated the relation between single cell activity in macaque V1 and V2 and simultaneously measured behavioral judgments of texture. We generated stimuli along a continuum between naturalistic texture and phase-randomized noise and trained two macaque monkeys to judge whether a sample texture more closely resembled one or the other extreme. Analysis of responses revealed that individual V1 and V2 neurons carried much less information about texture naturalness than behavioral reports. However, the sensitivity of V2 neurons, especially those preferring naturalistic textures, was significantly closer to that of behavior compared with V1. The firing of both V1 and V2 neurons predicted perceptual choices in response to repeated presentations of the same ambiguous stimulus in one monkey, despite low individual neural sensitivity. However, neither population predicted choice in the second monkey. We conclude that neural responses supporting texture perception likely continue to develop downstream of V2. Further, combined with neural data recorded while the same two monkeys performed an orientation discrimination task, our results demonstrate that choice-correlated neural activity in early sensory cortex is unstable across observers and tasks, untethered from neuronal sensitivity, and thus unlikely to reflect a critical aspect of the formation of perceptual decisions.
SIGNIFICANCE STATEMENT: As visual signals propagate along the cortical hierarchy, they encode increasingly complex aspects of the sensory environment and likely have a more direct relationship with perceptual experience. We replicate and extend previous results from anesthetized monkeys differentiating the selectivity of neurons along the first step in cortical vision from area V1 to V2. However, our results further complicate efforts to establish neural signatures that reveal the relationship between perception and the neuronal activity of sensory populations. We find that choice-correlated activity in V1 and V2 is unstable across different observers and tasks, and also untethered from neuronal sensitivity and other features of nonsensory response modulation.
5
Zheng Q, Gu Y. From Multisensory Integration to Multisensory Decision-Making. Adv Exp Med Biol 2024; 1437:23-35. [PMID: 38270851 DOI: 10.1007/978-981-99-7611-9_2]
Abstract
Organisms live in a dynamic environment in which sensory information from multiple sources is ever changing. A conceptually complex task for organisms is to accumulate evidence across sensory modalities and over time, a process known as multisensory decision-making. This is a relatively new concept, in the sense that previous research has largely been conducted in parallel disciplines: much effort has gone either into sensory integration across modalities using activity summed over a duration of time, or into decision-making with only one sensory modality that evolves over time. Recently, a few neurophysiological studies have emerged that examine how information from different sensory modalities is processed, accumulated, and integrated over time in decision-related areas such as the parietal or frontal lobes in mammals. In this review, we summarize and comment on these studies, which combine the two long-separate fields of multisensory integration and decision-making, and show how the new findings provide insight into the neural mechanisms mediating multisensory information processing in a more complete way.
Affiliation(s)
- Qihao Zheng
- Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China
- Yong Gu
- Systems Neuroscience, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China
6
Zeng Z, Zhang C, Gu Y. Visuo-vestibular heading perception: a model system to study multi-sensory decision making. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220334. [PMID: 37545303 PMCID: PMC10404926 DOI: 10.1098/rstb.2022.0334]
Abstract
Integrating noisy signals across time as well as across sensory modalities, a process named multi-sensory decision making (MSDM), is an essential strategy for making more accurate and sensitive decisions in complex environments. Although this field is just emerging, recent remarkable work from different perspectives, including computational theory, psychophysical behaviour and neurophysiology, has begun to shed new light on MSDM. In the current review, we focus on MSDM by using visuo-vestibular heading as a model system. Combining well-controlled behavioural paradigms on virtual-reality systems, single-unit recordings, causal manipulations and computational theory based on spiking activity, recent progress reveals that vestibular signals contain complex temporal dynamics in many brain regions, including unisensory, multi-sensory and sensory-motor association areas. This poses a challenge for the brain when integrating cues across time and across sensory modalities, such as optic flow, which mainly carries a motion velocity signal. In addition, new evidence from higher-level decision-related areas, mostly in the posterior and frontal/prefrontal regions, helps revise conventional views on how signals from different sensory modalities are processed, converged, and accumulated moment by moment through neural circuits to form a unified, optimal perceptual decision. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Zhao Zeng
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, People's Republic of China
- University of Chinese Academy of Sciences, 100049 Beijing, People's Republic of China
- Ce Zhang
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, People's Republic of China
- University of Chinese Academy of Sciences, 100049 Beijing, People's Republic of China
- Yong Gu
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, People's Republic of China
- University of Chinese Academy of Sciences, 100049 Beijing, People's Republic of China
7
Levi AJ, Zhao Y, Park IM, Huk AC. Sensory and Choice Responses in MT Distinct from Motion Encoding. J Neurosci 2023; 43:2090-2103. [PMID: 36781221 PMCID: PMC10042117 DOI: 10.1523/jneurosci.0267-22.2023]
Abstract
The macaque middle temporal (MT) area is well known for its visual motion selectivity and relevance to motion perception, but the possibility of it also reflecting higher-level cognitive functions has largely been ignored. We tested for effects of task performance distinct from sensory encoding by manipulating subjects' temporal evidence-weighting strategy during a direction discrimination task while performing electrophysiological recordings from groups of MT neurons in rhesus macaques (one male, one female). This revealed multiple components of MT responses that were, surprisingly, not interpretable as behaviorally relevant modulations of motion encoding, or as bottom-up consequences of the readout of motion direction from MT. The time-varying motion-driven responses of MT were strongly affected by our strategic manipulation, but with time courses opposite the subjects' temporal weighting strategies. Furthermore, large choice-correlated signals were represented in population activity distinct from its motion responses, with multiple phases that lagged psychophysical readout and even continued after the stimulus (but which preceded motor responses). In summary, a novel experimental manipulation of strategy allowed us to control the time course of readout to challenge the correlation between sensory responses and choices, and population-level analyses of simultaneously recorded ensembles allowed us to identify strong signals that were so distinct from direction encoding that conventional, single-neuron-centric analyses could not have revealed or properly characterized them. Together, these approaches revealed multiple cognitive contributions to MT responses that are task related but not functionally relevant to encoding or decoding of motion for psychophysical direction discrimination, providing a new perspective on the assumed status of MT as a simple sensory area.
SIGNIFICANCE STATEMENT: This study extends understanding of the middle temporal (MT) area beyond its representation of visual motion. Combining multineuron recordings, population-level analyses, and controlled manipulation of task strategy, we exposed signals that depended on changes in temporal weighting strategy, but did not manifest as feedforward effects on behavior. This was demonstrated by (1) an inverse relationship between temporal dynamics of behavioral readout and sensory encoding, (2) a choice-correlated signal that always lagged the stimulus time points most correlated with decisions, and (3) a distinct choice-correlated signal after the stimulus. These findings invite re-evaluation of MT for functions outside of its established sensory role and highlight the power of experimenter-controlled changes in temporal strategy, coupled with recording and analysis approaches that transcend the single-neuron perspective.
Affiliation(s)
- Aaron J Levi
- Center for Perceptual Systems, Departments of Neuroscience and Psychology, The University of Texas at Austin, Austin, Texas 78705
- Yuan Zhao
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York 11794
- Il Memming Park
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York 11794
- Alexander C Huk
- Center for Perceptual Systems, Departments of Neuroscience and Psychology, The University of Texas at Austin, Austin, Texas 78705
- Fuster Laboratory, University of California Los Angeles, Los Angeles, CA 90095
8
Zeng F, Zaidel A, Chen A. Contrary neuronal recalibration in different multisensory cortical areas. eLife 2023; 12:e82895. [PMID: 36877555 PMCID: PMC9988259 DOI: 10.7554/elife.82895]
Abstract
The adult brain demonstrates remarkable multisensory plasticity by dynamically recalibrating itself based on information from multiple sensory sources. After a systematic visual-vestibular heading offset is experienced, the unisensory perceptual estimates for subsequently presented stimuli are shifted toward each other (in opposite directions) to reduce the conflict. The neural substrate of this recalibration is unknown. Here, we recorded single-neuron activity from the dorsal medial superior temporal (MSTd), parietoinsular vestibular cortex (PIVC), and ventral intraparietal (VIP) areas in three male rhesus macaques during this visual-vestibular recalibration. Both visual and vestibular neuronal tuning curves in MSTd shifted, each according to their respective cues' perceptual shifts. Tuning of vestibular neurons in PIVC also shifted in the same direction as vestibular perceptual shifts (cells were not robustly tuned to the visual stimuli). By contrast, VIP neurons demonstrated a unique phenomenon: both vestibular and visual tuning shifted in accordance with vestibular perceptual shifts, such that visual tuning shifted, surprisingly, contrary to visual perceptual shifts. Therefore, while unsupervised recalibration (to reduce cue conflict) occurs in early multisensory cortices, higher-level VIP reflects only a global shift in vestibular space.
Affiliation(s)
- Fu Zeng
- Key Laboratory of Brain Functional Genomics (Ministry of Education), East China Normal University, Shanghai, China
- Adam Zaidel
- Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat Gan, Israel
- Aihua Chen
- Key Laboratory of Brain Functional Genomics (Ministry of Education), East China Normal University, Shanghai, China
9
Fine I, Park WJ. Do you hear what I see? How do early blind individuals experience object motion? Philos Trans R Soc Lond B Biol Sci 2023; 378:20210460. [PMID: 36511418 PMCID: PMC9745882 DOI: 10.1098/rstb.2021.0460]
Abstract
One of the most important tasks for 3D vision is tracking the movement of objects in space. The ability of early blind individuals to understand motion in the environment from noisy and unreliable auditory information is an impressive example of cortical adaptation that is only just beginning to be understood. Here, we compare visual and auditory motion processing, and discuss the effect of early blindness on the perception of auditory motion. Blindness leads to cross-modal recruitment of the visual motion area hMT+ for auditory motion processing. Meanwhile, the planum temporale, associated with auditory motion in sighted individuals, shows reduced selectivity for auditory motion. We discuss how this dramatic shift in the cortical basis of motion processing might influence the perceptual experience of motion in early blind individuals. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
Affiliation(s)
- Ione Fine
- Department of Psychology, University of Washington, Seattle, WA 98195-1525, USA
- Woon Ju Park
- Department of Psychology, University of Washington, Seattle, WA 98195-1525, USA
10
Causal contribution of optic flow signal in Macaque extrastriate visual cortex for roll perception. Nat Commun 2022; 13:5479. [PMID: 36123363 PMCID: PMC9485245 DOI: 10.1038/s41467-022-33245-5]
Abstract
Optic flow is a powerful cue for inferring self-motion status, which is critical for postural control, spatial orientation, locomotion and navigation. In primates, neurons in extrastriate visual cortex (MSTd) are predominantly modulated by high-order optic flow patterns (e.g., spiral), yet a functional link to direct perception is lacking. Here, we applied electrical microstimulation to selectively manipulate populations of MSTd neurons while macaques discriminated the direction of rotation around the line of sight (roll) or the direction of linear translation (heading), two tasks that were orthogonal in a 3D spiral coordinate frame, using a four-alternative forced-choice paradigm. Microstimulation frequently biased the animals' roll perception towards the labeled lines encoded by the stimulated neurons, in either context with spiral or pure-rotation stimuli. Choice frequency was also altered between roll and translation flow patterns. Our results provide direct causal evidence that roll signals in MSTd, although often mixed with translation signals, can be extracted by downstream areas for the perception of rotation relative to the gravity vertical.
11
Cortical Mechanisms of Multisensory Linear Self-motion Perception. Neurosci Bull 2022; 39:125-137. [PMID: 35821337 PMCID: PMC9849545 DOI: 10.1007/s12264-022-00916-8]
Abstract
Accurate self-motion perception, which is critical for organisms to survive, is a process involving multiple sensory cues. The two most powerful cues are visual (optic flow) and vestibular (inertial motion). Psychophysical studies have indicated that humans and nonhuman primates integrate the two cues to improve the estimation of self-motion direction, often in a statistically Bayesian-optimal way. In the last decade, single-unit recordings in awake, behaving animals have provided valuable neurophysiological data with a high spatial and temporal resolution, giving insight into possible neural mechanisms underlying multisensory self-motion perception. Here, we review these findings, along with new evidence from the most recent studies focusing on the temporal dynamics of signals in different modalities. We show that, in light of new data, conventional views about the cortical mechanisms underlying visuo-vestibular integration for linear self-motion are challenged. We propose that different temporal component signals may mediate different functions, a possibility that requires further study.
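The "statistically Bayesian-optimal" integration mentioned here is usually formalized as reliability-weighted averaging: each cue's estimate is weighted by its inverse variance, and the combined estimate has lower variance than either cue alone. A minimal sketch (with made-up numbers) follows.

```python
def combine_cues(mu_vis, var_vis, mu_vest, var_vest):
    """Reliability-weighted (inverse-variance) combination of two cues."""
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_vest)
    mu_comb = w_vis * mu_vis + (1.0 - w_vis) * mu_vest
    var_comb = 1.0 / (1.0 / var_vis + 1.0 / var_vest)
    return mu_comb, var_comb

# Toy example: the visual cue is twice as reliable as the vestibular cue
print(combine_cues(mu_vis=2.0, var_vis=1.0, mu_vest=4.0, var_vest=2.0))
```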
12
Abstract
Navigating by path integration requires continuously estimating one's self-motion. This estimate may be derived from visual velocity and/or vestibular acceleration signals. Importantly, these senses in isolation are ill-equipped to provide accurate estimates, and thus visuo-vestibular integration is an imperative. After a summary of the visual and vestibular pathways involved, the crux of this review focuses on the human and theoretical approaches that have outlined a normative account of cue combination in behavior and neurons, as well as on the systems neuroscience efforts that are searching for its neural implementation. We then highlight a contemporary frontier in our state of knowledge: understanding how velocity cues with time-varying reliabilities are integrated into an evolving position estimate over prolonged time periods. Further, we discuss how the brain builds internal models inferring when cues ought to be integrated versus segregated, a process of causal inference. Lastly, we suggest that the study of spatial navigation has not yet addressed its initial condition: self-location.
Affiliation(s)
- Jean-Paul Noel
- Center for Neural Science, New York University, New York, NY 10003, USA
- Dora E Angelaki
- Center for Neural Science, New York University, New York, NY 10003, USA
- Tandon School of Engineering, New York University, New York, NY 11201, USA
13
Zhou L, Zhu Q, Wu B, Qin B, Hu H, Qian Z. A comparison of directed functional connectivity among fist-related brain activities during movement imagery, movement execution, and movement observation. Brain Res 2021; 1777:147769. [PMID: 34971597 DOI: 10.1016/j.brainres.2021.147769]
Abstract
Brain-computer interfaces (BCIs) have been widely used in sports training and rehabilitation training. They are primarily based on action simulation, including movement imagery (MI) and movement observation (MO). However, the development of BCI technology is limited by the challenge of gaining an in-depth understanding of the brain networks involved in MI, MO, and movement execution (ME). To better understand changes in brain activity and the communication across various brain regions under MO, ME, and MI, this study conducted fist movement experiments under these three conditions. We recorded 64-channel electroencephalography (EEG) from 39 healthy subjects (25 males, 14 females, all right-handed) during fist tasks, obtained intensities and locations of sources using EEG source imaging (ESI), computed source activation modes, and finally investigated the brain networks using spectral Granger causality (GC). The brain regions involved in the three motor conditions were similar, but the degree of participation of each brain region and the network connections among the brain regions differed. MO, ME, and MI did not recruit shared brain connectivity networks. In addition, both source activation modes and brain network connectivity showed lateralization advantages.
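For readers unfamiliar with Granger causality (GC), the core idea is to ask whether the past of one signal improves prediction of another beyond that signal's own past. The study used spectral GC on EEG source time courses; the sketch below only illustrates the simpler time-domain variant with ordinary least squares, and all signals and parameters are synthetic assumptions.

```python
import numpy as np

def granger_causality(x, y, order=5):
    """GC from x to y: log ratio of residual variances of an AR model for y
    fitted without vs. with the lagged history of x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(y)
    Y = y[order:]
    lags_y = np.column_stack([y[order - k: n - k] for k in range(1, order + 1)])
    lags_x = np.column_stack([x[order - k: n - k] for k in range(1, order + 1)])

    def resid_var(predictors):
        X = np.column_stack([np.ones(len(Y)), predictors])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return np.var(Y - X @ beta)

    return np.log(resid_var(lags_y) / resid_var(np.column_stack([lags_y, lags_x])))

# Toy example: y is partly driven by the past of x, so GC(x -> y) > GC(y -> x)
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.1 * rng.standard_normal()
print(granger_causality(x, y), granger_causality(y, x))
```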
Affiliation(s)
- Lu Zhou
- Department of Biomedical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, China
- Qiaoqiao Zhu
- Department of Biomedical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, China
- Biao Wu
- Electronic Information Department, Nanjing University of Aeronautics and Astronautics, Nanjing, China
- Bing Qin
- Department of Biomedical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, China
- Haixu Hu
- Sports Training Academy, Nanjing Sport Institute, Nanjing, China
- Zhiyu Qian
- Department of Biomedical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, China
14
Zheng Q, Zhou L, Gu Y. Temporal synchrony effects of optic flow and vestibular inputs on multisensory heading perception. Cell Rep 2021; 37:109999. [PMID: 34788608 DOI: 10.1016/j.celrep.2021.109999]
Abstract
Precise heading perception requires integration of optic flow and vestibular cues, yet the two cues often carry distinct temporal dynamics that may confound the benefit of cue integration. Here, we varied the temporal offset between the two sensory inputs while macaques discriminated headings around straight ahead. We find that the best heading performance does not occur under the natural condition of synchronous inputs with zero offset, but rather when visual stimuli are artificially adjusted to lead the vestibular stimuli by a few hundred milliseconds. This amount exactly matches the lag between the vestibular acceleration and visual speed signals as measured from single-unit activity in frontal and posterior parietal cortices. Manually aligning the cues in these areas best facilitates integration, with some nonlinear gain modulation effects. These findings are consistent with predictions from a model in which the brain integrates optic flow speed with a faster vestibular acceleration signal to sense instantaneous heading direction during self-motion in the environment.
Affiliation(s)
- Qihao Zheng
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, China; University of Chinese Academy of Sciences, 100049 Beijing, China
- Luxin Zhou
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, China; University of Chinese Academy of Sciences, 100049 Beijing, China
- Yong Gu
- CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, 200031 Shanghai, China; University of Chinese Academy of Sciences, 100049 Beijing, China; Shanghai Center for Brain Science and Brain-Inspired Intelligence Technology, 201210 Shanghai, China
15
Jing R, Yang C, Huang X, Li W. Perceptual learning as a result of concerted changes in prefrontal and visual cortex. Curr Biol 2021; 31:4521-4533.e3. [PMID: 34450086 DOI: 10.1016/j.cub.2021.08.007]
Abstract
Our perceptual ability remarkably improves with training. Some studies on visual perceptual learning have shown refined neural representation of the trained stimulus in the visual cortex, whereas others have exclusively argued for improved readout and decision-making processes in the frontoparietal cortex. The mixed results have rendered the underlying neural mechanisms puzzling and hotly debated. By simultaneously recording from monkey visual area V4 and ventrolateral prefrontal cortex (PFC) implanted with microelectrode arrays, we dissected learning-induced cortical changes over the course of training the monkeys in a global form detection task. Decoding analysis dissociated two distinct components of neuronal population codes that were progressively and markedly enhanced in both V4 and PFC. One component was closely related to the target stimulus feature and was subject to task-dependent top-down modulation; it emerged earlier in V4 than PFC and its enhancement was specific to the trained configuration of the target stimulus. The other component of the neural code was entirely related to the animal's behavioral choice; it emerged earlier in PFC than V4 and its enhancement completely generalized to an untrained stimulus configuration. These results implicate two concurrent and synergistic learning processes: a perceptual process that is specific to the details of the trained stimulus feature and a cognitive process that is dependent on the total amount of learning experience in the trained task. When combined, these two learning processes were well predictive of the animal's learning behavior.
Affiliation(s)
- Rui Jing
- State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, China
- Chen Yang
- State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, China
- Xin Huang
- State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, China
- Wu Li
- State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, China; College of Life Sciences, Beijing Normal University, Beijing 100875, China
16
Quinn KR, Seillier L, Butts DA, Nienborg H. Decision-related feedback in visual cortex lacks spatial selectivity. Nat Commun 2021; 12:4473. [PMID: 34294703 PMCID: PMC8298450 DOI: 10.1038/s41467-021-24629-0]
Abstract
Feedback in the brain is thought to convey contextual information that underlies our flexibility to perform different tasks. Empirical and computational work on the visual system suggests this is achieved by targeting task-relevant neuronal subpopulations. We combine two tasks, each resulting in selective modulation by feedback, to test whether the feedback reflects the combination of both selectivities. We used visual feature-discrimination specified at one of two possible locations and uncoupled the decision formation from motor plans to report it, while recording in macaque mid-level visual areas. Here we show that although the behavior is spatially selective, using only task-relevant information, modulation by decision-related feedback is spatially unselective. Population responses reveal similar stimulus-choice alignments irrespective of stimulus relevance. The results suggest a common mechanism across tasks, independent of the spatial selectivity these tasks demand. This may reflect biological constraints and facilitate generalization across tasks. Our findings also support a previously hypothesized link between feature-based attention and decision-related activity. Feedback modulates visual neurons, thought to help achieve flexible task performance. Here, the authors show that decision-related feedback is not relayed only to task-relevant neurons, suggesting a broader mechanism and supporting a previously hypothesized link to feature-based attention.
Affiliation(s)
- Daniel A Butts
- Department of Biology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD, USA
- Hendrikje Nienborg
- Laboratory of Sensorimotor Research, National Eye Institute, National Institutes of Health, Bethesda, MD, USA
17
Smith JET, Parker AJ. Correlated structure of neuronal firing in macaque visual cortex limits information for binocular depth discrimination. J Neurophysiol 2021; 126:275-303. [PMID: 33978495 PMCID: PMC8325604 DOI: 10.1152/jn.00667.2020]
Abstract
Variability in cortical neural activity potentially limits sensory discriminations. Theoretical work shows that information required to discriminate two similar stimuli is limited by the correlation structure of cortical variability. We investigated these information-limiting correlations by recording simultaneously from visual cortical areas primary visual cortex (V1) and extrastriate area V4 in macaque monkeys performing a binocular, stereo depth discrimination task. Within both areas, noise correlations on a rapid temporal scale (20–30 ms) were stronger for neuron pairs with similar selectivity for binocular depth, meaning that these correlations potentially limit information for making the discrimination. Between-area correlations (V1 to V4) were different, being weaker for neuron pairs with similar tuning and having a slower temporal scale (100+ ms). Fluctuations in these information-limiting correlations just prior to the detection event were associated with changes in behavioral accuracy. Although these correlations limit the recovery of information about sensory targets, their impact may be curtailed by integrative processing of signals across multiple brain areas.
NEW & NOTEWORTHY: Correlated noise reduces the stimulus information in visual cortical neurons during experimental performance of binocular depth discriminations. The temporal scale of these correlations is important. Rapid (20–30 ms) correlations reduce information within and between areas V1 and V4, whereas slow (>100 ms) correlations between areas do not. Separate cortical areas appear to act together to maintain signal fidelity. Rapid correlations reduce the neuronal signal difference between stimuli and adversely affect perceptual discrimination.
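The noise correlations analyzed here are trial-by-trial correlations between pairs of simultaneously recorded neurons after the stimulus-driven component of their responses is removed. The following is an illustrative sketch with simulated spike counts, not the authors' analysis pipeline.

```python
import numpy as np

def noise_correlation(counts_a, counts_b, stimulus_ids):
    """Pearson correlation of spike counts after subtracting each neuron's
    mean response to every stimulus (so only 'noise' fluctuations remain)."""
    counts_a = np.asarray(counts_a, float)
    counts_b = np.asarray(counts_b, float)
    stimulus_ids = np.asarray(stimulus_ids)
    resid_a, resid_b = counts_a.copy(), counts_b.copy()
    for s in np.unique(stimulus_ids):
        idx = stimulus_ids == s
        resid_a[idx] -= counts_a[idx].mean()
        resid_b[idx] -= counts_b[idx].mean()
    return np.corrcoef(resid_a, resid_b)[0, 1]

# Toy example: a shared trial-to-trial gain produces a positive noise correlation
rng = np.random.default_rng(1)
stim = np.repeat(np.arange(5), 40)
shared = rng.standard_normal(stim.size)
a = 10 + 2.0 * stim + shared + rng.standard_normal(stim.size)
b = 8 + 1.5 * stim + shared + rng.standard_normal(stim.size)
print(round(noise_correlation(a, b, stim), 2))
```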
Affiliation(s)
- Jackson E T Smith
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
- Andrew J Parker
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
18
Chicharro D, Panzeri S, Haefner RM. Stimulus-dependent relationships between behavioral choice and sensory neural responses. eLife 2021; 10:e54858. [PMID: 33825683 PMCID: PMC8184215 DOI: 10.7554/elife.54858]
Abstract
Understanding perceptual decision-making requires linking sensory neural responses to behavioral choices. In two-choice tasks, activity-choice covariations are commonly quantified with a single measure of choice probability (CP), without characterizing their changes across stimulus levels. We provide theoretical conditions for stimulus dependencies of activity-choice covariations. Assuming a general decision-threshold model, which comprises both feedforward and feedback processing and allows for a stimulus-modulated neural population covariance, we analytically predict a very general and previously unreported stimulus dependence of CPs. We develop new tools, including refined analyses of CPs and generalized linear models with stimulus-choice interactions, which accurately assess the stimulus- or choice-driven signals of each neuron, characterizing stimulus-dependent patterns of choice-related signals. With these tools, we analyze CPs of macaque MT neurons during a motion discrimination task. Our analysis provides preliminary empirical evidence for the promise of studying stimulus dependencies of choice-related signals, encouraging further assessment in wider data sets.
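One of the tools mentioned, a generalized linear model with stimulus-choice interaction terms, can be sketched as follows. This is a generic Poisson-GLM illustration with simulated trials (using statsmodels), not the authors' exact model; variable names and effect sizes are assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_trials = 500
stimulus = rng.uniform(-1, 1, n_trials)          # e.g., signed stimulus strength
choice = (stimulus + 0.5 * rng.standard_normal(n_trials) > 0).astype(float)
rate = np.exp(1.0 + 0.8 * stimulus + 0.3 * choice + 0.4 * stimulus * choice)
spikes = rng.poisson(rate)

# Design matrix: intercept, stimulus, choice, and the stimulus x choice interaction
X = sm.add_constant(np.column_stack([stimulus, choice, stimulus * choice]))
fit = sm.GLM(spikes, X, family=sm.families.Poisson()).fit()
print(fit.params)   # [intercept, stimulus, choice, stimulus x choice]
```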
Affiliation(s)
- Daniel Chicharro
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
- Department of Neurobiology, Harvard Medical School, Boston, United States
- Stefano Panzeri
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
- Ralf M Haefner
- Brain and Cognitive Sciences, Center for Visual Science, University of Rochester, Rochester, United States
19
Dynamics of Heading and Choice-Related Signals in the Parieto-Insular Vestibular Cortex of Macaque Monkeys. J Neurosci 2021; 41:3254-3265. [PMID: 33622780 DOI: 10.1523/jneurosci.2275-20.2021]
Abstract
Perceptual decision-making is increasingly being understood to involve an interaction between bottom-up sensory-driven signals and top-down choice-driven signals, but how these signals interact to mediate perception is not well understood. The parieto-insular vestibular cortex (PIVC) is an area with prominent vestibular responsiveness, and previous work has shown that inactivating PIVC impairs vestibular heading judgments. To investigate the nature of PIVC's contribution to heading perception, we recorded extracellularly from PIVC neurons in two male rhesus macaques during a heading discrimination task, and compared findings with data from previous studies of dorsal medial superior temporal (MSTd) and ventral intraparietal (VIP) areas using identical stimuli. By computing partial correlations between neural responses, heading, and choice, we find that PIVC activity reflects a dynamically changing combination of sensory and choice signals. In addition, the sensory and choice signals are more balanced in PIVC, in contrast to the sensory dominance in MSTd and choice dominance in VIP. Interestingly, heading and choice signals in PIVC are negatively correlated during the middle portion of the stimulus epoch, reflecting a mismatch in the polarity of heading and choice signals. We anticipate that these results will help unravel the mechanisms of interaction between bottom-up sensory signals and top-down choice signals in perceptual decision-making, leading to more comprehensive models of self-motion perception.
SIGNIFICANCE STATEMENT: Vestibular information is important for our perception of self-motion, and various cortical regions in primates show vestibular heading selectivity. Inactivation of the macaque vestibular cortex substantially impairs the precision of vestibular heading discrimination, more so than inactivation of other multisensory areas. Here, we record for the first time from the vestibular cortex while monkeys perform a forced-choice heading discrimination task, and we compare results with data collected previously from other multisensory cortical areas. We find that vestibular cortex activity reflects a dynamically changing combination of sensory and choice signals, with both similarities and notable differences with other multisensory areas.
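The partial-correlation analysis referred to here separates heading-related from choice-related firing by correlating responses with one variable while linearly regressing out the other. A minimal sketch with simulated data (not the study's code) is shown below.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after linearly regressing z out of both."""
    Z = np.column_stack([np.ones(len(z)), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Toy example: firing rate depends on heading and, more weakly, on choice
rng = np.random.default_rng(0)
heading = rng.uniform(-10, 10, 400)              # deg, relative to straight ahead
choice = (heading + 4 * rng.standard_normal(400) > 0).astype(float)
rate = 20 + 0.8 * heading + 2.0 * choice + rng.standard_normal(400)
print(partial_corr(rate, choice, heading))       # choice signal, heading removed
print(partial_corr(rate, heading, choice))       # heading signal, choice removed
```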
20
Roe AW, Chen G, Xu AG, Hu J. A roadmap to a columnar visual cortical prosthetic. Curr Opin Physiol 2020. [DOI: 10.1016/j.cophys.2020.06.009]
21
Krug K. Coding Perceptual Decisions: From Single Units to Emergent Signaling Properties in Cortical Circuits. Annu Rev Vis Sci 2020; 6:387-409. [PMID: 32600168 DOI: 10.1146/annurev-vision-030320-041223]
Abstract
Spiking activity in single neurons of the primate visual cortex has been tightly linked to perceptual decisions. Any mechanism that reads out these perceptual signals to support behavior must respect the underlying neuroanatomy that shapes the functional properties of sensory neurons. Spatial distribution and timing of inputs to the next processing levels are critical, as conjoint activity of precursor neurons increases the spiking rate of downstream neurons and ultimately drives behavior. I set out how correlated activity might coalesce into a micropool of task-sensitive neurons signaling a particular percept to determine perceptual decision signals locally and for flexible interarea transmission depending on the task context. As data from more and more neurons and their complex interactions are analyzed, the space of computational mechanisms must be constrained based on what is plausible within neurobiological limits. This review outlines experiments to test the new perspectives offered by these extended methods.
Affiliation(s)
- Kristine Krug
- Lehrstuhl für Sensorische Physiologie, Institut für Biologie, Otto-von-Guericke-Universität Magdeburg, 39120 Magdeburg, Germany
- Leibniz-Institut für Neurobiologie, 39118 Magdeburg, Germany
- Department of Physiology, Anatomy, and Genetics, Oxford University, Oxford OX1 3PT, United Kingdom
22
Zhou Y, Freedman DJ. Posterior parietal cortex plays a causal role in perceptual and categorical decisions. Science 2019; 365:180-185. [PMID: 31296771 DOI: 10.1126/science.aaw8347]
Abstract
Posterior parietal cortex (PPC) activity correlates with monkeys' decisions during visual discrimination and categorization tasks. However, recent work has questioned whether decision-correlated PPC activity plays a causal role in such decisions. That study focused on PPC's contribution to motor aspects of decisions (deciding where to move), but not sensory evaluation aspects (deciding what you are looking at). We employed reversible inactivation to compare PPC's contributions to motor and sensory aspects of decisions. Inactivation affected both aspects of behavior, but preferentially impaired decisions when visual stimuli, rather than motor response targets, were in the inactivated visual field. This demonstrates a causal role for PPC in decision-making, with preferential involvement in evaluating attended task-relevant sensory stimuli compared with motor planning.
Affiliation(s)
- Yang Zhou
- Department of Neurobiology, The University of Chicago, Chicago, IL 60637, USA
- David J Freedman
- Department of Neurobiology, The University of Chicago, Chicago, IL 60637, USA
23
Choice (-history) correlations in sensory cortex: cause or consequence? Curr Opin Neurobiol 2019; 58:148-154. [PMID: 31581052 DOI: 10.1016/j.conb.2019.09.005]
Abstract
One challenge in neuroscience, as in other areas of science, is to make inferences about the underlying causal structure from correlational data. Here, we discuss this challenge in the context of choice correlations in sensory neurons, that is, trial-by-trial correlations, unexplained by the stimulus, between the activity of sensory neurons and an animal's perceptual choice. Do these choice correlations reflect feedforward signalling, feedback signalling, both, or neither? We highlight recent results from correlational and causal examinations of choice and choice-history signals in sensory, and in part sensorimotor, cortex, and address formal statistical frameworks for inferring causal interactions from data.
24
Predicting Perceptual Decisions Using Visual Cortical Population Responses and Choice History. J Neurosci 2019; 39:6714-6727. [PMID: 31235648 DOI: 10.1523/jneurosci.0035-19.2019]
Abstract
Our understanding of the neural basis of perceptual decision making has been built in part on relating co-fluctuations of single neuron responses to perceptual decisions on a trial-by-trial basis. The strength of this relationship is often compared across neurons or brain areas, recorded in different sessions, animals, or variants of a task. We sought to extend our understanding of perceptual decision making in three ways. First, we measured neuronal activity simultaneously in early [primary visual cortex (V1)] and midlevel (V4) visual cortex while macaque monkeys performed a fine orientation discrimination perceptual task. This allowed a direct comparison of choice signals in these two areas, including their dynamics. Second, we asked how our ability to predict animals' decisions would be improved by considering small simultaneously-recorded neuronal populations rather than individual units. Finally, we asked whether predictions would be improved by taking into account the animals' choice and reward histories, which can strongly influence decision making. We found that responses of individual V4 neurons were weakly predictive of decisions, but only in a brief epoch between stimulus offset and the indication of choice. In V1, few neurons showed significant decision-related activity. Analysis of neuronal population responses revealed robust choice-related information in V4 and substantially weaker signals in V1. Including choice- and reward-history information improved performance further, particularly when the recorded populations contained little decision-related information. Our work shows the power of using neuronal populations and decision history when relating neuronal responses to the perceptual decisions they are thought to underlie.
SIGNIFICANCE STATEMENT: Decades of research have provided a rich description of how visual information is represented in the visual cortex. Yet how cortical responses relate to visual perception remains poorly understood. Here we relate fluctuations in small neuronal population responses, recorded simultaneously in primary visual cortex (V1) and area V4 of monkeys, to perceptual reports in an orientation discrimination task. Choice-related signals were robust in V4, particularly late in the behavioral trial, but not in V1. Models that include both neuronal responses and choice-history information were able to predict a substantial portion of decisions. Our work shows the power of integrating information across neurons and including decision history in relating neuronal responses to perceptual decisions.
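The population-plus-history decoding described here can be illustrated with a cross-validated logistic regression whose regressors are the simultaneously recorded responses plus the previous trial's choice and reward. The sketch below uses simulated data and assumed variable names; it is not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_neurons = 600, 30
population = rng.standard_normal((n_trials, n_neurons))   # simulated responses
prev_choice = rng.integers(0, 2, n_trials)                # choice on trial t-1
prev_reward = rng.integers(0, 2, n_trials)                # reward on trial t-1
# Simulated choices: weakly driven by one neural dimension plus a history bias
drive = population[:, 0] + 0.8 * (prev_choice - 0.5)
choice = (drive + rng.standard_normal(n_trials) > 0).astype(int)

X = np.column_stack([population, prev_choice, prev_reward])
decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, X, choice, cv=5).mean()
print(f"cross-validated choice-prediction accuracy: {accuracy:.2f}")
```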
25
Abstract
Detecting the state of self-motion, such as the instantaneous heading direction, the traveled trajectory, and the traveled distance or time, is critical for efficient spatial navigation. Numerous psychophysical studies have indicated that the vestibular system, originating from the otolith organs and semicircular canals in our inner ears, provides robust signals for different aspects of self-motion perception. In addition, vestibular signals interact with other sensory signals such as visual optic flow to facilitate natural navigation. These behavioral results are consistent with recent findings in neurophysiological studies. In particular, vestibular activity in response to translation or rotation of the head/body in darkness has been revealed in a growing number of cortical regions, many of which are also sensitive to visual motion stimuli. The temporal dynamics of vestibular activity in the central nervous system can vary widely, ranging from acceleration-dominant to velocity-dominant. Signals with different temporal dynamics may be decoded by higher-level areas for different functions. For example, acceleration signals during translation of the body in the horizontal plane may be used by the brain to estimate heading direction. Although translation and rotation signals arise from independent peripheral organs, that is, the otoliths and canals, respectively, they frequently converge onto single neurons in the central nervous system, including both the brainstem and the cerebral cortex. These convergent neurons typically exhibit stronger responses during a combined curved motion trajectory, which may serve as a neural correlate of complex path perception. During spatial navigation, traveled distance or time may be encoded by different populations of neurons in multiple regions, including the hippocampal-entorhinal system, posterior parietal cortex, and frontal cortex.
Affiliation(s)
- Zhixian Cheng
- Department of Neuroscience, Yale School of Medicine, New Haven, CT, United States
- Yong Gu
- Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China