1. Héroux ME, Fisher G, Axelson LH, Butler AA, Gandevia SC. How we perceive the width of grasped objects: Insights into the central processes that govern proprioceptive judgements. J Physiol 2024. PMID: 38734987. DOI: 10.1113/jp286322.
Abstract
Low-level proprioceptive judgements involve a single frame of reference, whereas high-level proprioceptive judgements are made across different frames of reference. The present study systematically compared low-level (grasp → grasp) and high-level (vision → grasp, grasp → vision) proprioceptive tasks, and quantified the consistency of the grasp → vision task and the possible reciprocal nature of related high-level proprioceptive tasks. Experiment 1 (n = 30) compared performance across vision → grasp, grasp → vision and grasp → grasp tasks. Experiment 2 (n = 30) compared performance on the grasp → vision task between hands and over time. Participants were accurate (mean absolute error 0.27 cm [0.20 to 0.34]; mean [95% CI]) and precise (R² = 0.95 [0.93 to 0.96]) for grasp → grasp judgements, with a strong correlation between outcomes (r = -0.85 [-0.93 to -0.70]). Accuracy and precision decreased in the two high-level tasks (R² = 0.86 and 0.89; mean absolute error = 1.34 and 1.41 cm), with most participants overestimating perceived width for the vision → grasp task and underestimating it for the grasp → vision task. There was minimal correlation between accuracy and precision for these two tasks. Converging evidence indicated that performance was largely reciprocal (inverse) between the vision → grasp and grasp → vision tasks. Performance on the grasp → vision task was consistent between dominant and non-dominant hands, and across repeated sessions a day or week apart. Overall, there are fundamental differences between low- and high-level proprioceptive judgements that reflect differences in the cortical processes that underpin these perceptions.
Moreover, the central transformations that govern high-level proprioceptive judgements of grasp are personalised, stable and reciprocal for reciprocal tasks.
KEY POINTS: Low-level proprioceptive judgements involve a single frame of reference (e.g. indicating the width of a grasped object by selecting from a series of objects of different width), whereas high-level proprioceptive judgements are made across different frames of reference (e.g. indicating the width of a grasped object by selecting from a series of visible lines of different length). We highlight fundamental differences in the precision and accuracy of low- and high-level proprioceptive judgements. We provide converging evidence that the neural transformations between frames of reference that govern high-level proprioceptive judgements of grasp are personalised, stable and reciprocal for reciprocal tasks. This stability is likely key to precise judgements and accurate predictions in high-level proprioception.
Affiliation(s)
- Martin E Héroux
- Neuroscience Research Australia, Randwick, Australia
- University of New South Wales, Sydney, Australia
- Georgia Fisher
- Neuroscience Research Australia, Randwick, Australia
- Australian Institute of Health Innovation, Macquarie University, Macquarie Park, Australia
- Annie A Butler
- Neuroscience Research Australia, Randwick, Australia
- University of New South Wales, Sydney, Australia
- Simon C Gandevia
- Neuroscience Research Australia, Randwick, Australia
- University of New South Wales, Sydney, Australia
2. Schnepel P, Paricio-Montesinos R, Ezquerra-Romano I, Haggard P, Poulet JFA. Cortical cellular encoding of thermotactile integration. Curr Biol 2024; 34:1718-1730.e3. PMID: 38582078. DOI: 10.1016/j.cub.2024.03.018.
Abstract
Recent evidence suggests that primary sensory cortical regions play a role in the integration of information from multiple sensory modalities. How primary cortical neurons integrate different sources of sensory information is unclear, partly because non-primary sensory input to a cortical sensory region is often weak or modulatory. To address this question, we take advantage of the robust representation of thermal (cooling) and tactile stimuli in mouse forelimb primary somatosensory cortex (fS1). Using a thermotactile detection task, we show that the perception of threshold-level cool or tactile stimuli is enhanced when the two are presented simultaneously, compared with either presented alone. To investigate the cortical cellular correlates of thermotactile integration, we performed in vivo extracellular recordings from fS1 in awake resting and anesthetized mice during unimodal and bimodal stimulation of the forepaw. Unimodal stimulation evoked thermal- or tactile-specific excitatory and inhibitory responses of fS1 neurons. The most prominent features of combined thermotactile stimulation are the recruitment of unimodally silent fS1 neurons, non-linear integration features, and response dynamics that favor longer response durations with additional spikes. Together, we identify quantitative and qualitative changes in cortical encoding that may underlie the improvement in perception of thermotactile surfaces during haptic exploration.
Affiliation(s)
- Philipp Schnepel
- Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
- Ricardo Paricio-Montesinos
- Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
- Ivan Ezquerra-Romano
- Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany; Institute of Cognitive Neuroscience, University College London (UCL), London WC1N 3AZ, UK
- Patrick Haggard
- Institute of Cognitive Neuroscience, University College London (UCL), London WC1N 3AZ, UK
- James F A Poulet
- Max-Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC), Berlin-Buch, Robert-Rössle-Strasse 10, 13125 Berlin, Germany; Neuroscience Research Center, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
3. Mazo C, Baeta M, Petreanu L. Auditory cortex conveys non-topographic sound localization signals to visual cortex. Nat Commun 2024; 15:3116. PMID: 38600132. PMCID: PMC11006897. DOI: 10.1038/s41467-024-47546-4.
Abstract
Spatiotemporally congruent sensory stimuli are fused into a unified percept. The auditory cortex (AC) sends projections to the primary visual cortex (V1), which could provide signals for binding spatially corresponding audio-visual stimuli. However, whether AC inputs in V1 encode sound location remains unknown. Using two-photon axonal calcium imaging and a speaker array, we measured the auditory spatial information transmitted from AC to layer 1 of V1. AC conveys information about the location of ipsilateral and contralateral sound sources to V1. Sound location could be accurately decoded by sampling AC axons in V1, providing a substrate for making location-specific audiovisual associations. However, AC inputs were not retinotopically arranged in V1, and audio-visual modulations of V1 neurons did not depend on the spatial congruency of the sound and light stimuli. The non-topographic sound localization signals provided by AC might allow the association of specific audiovisual spatial patterns in V1 neurons.
Affiliation(s)
- Camille Mazo
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Margarida Baeta
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Leopoldo Petreanu
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
4. Oude Lohuis MN, Marchesi P, Olcese U, Pennartz CMA. Triple dissociation of visual, auditory and motor processing in mouse primary visual cortex. Nat Neurosci 2024; 27:758-771. PMID: 38307971. DOI: 10.1038/s41593-023-01564-5.
Abstract
Primary sensory cortices respond to crossmodal stimuli; for example, auditory responses are found in primary visual cortex (V1). However, it remains unclear whether these responses reflect sensory inputs or behavioral modulation through sound-evoked body movement. We address this controversy by showing that sound-evoked activity in V1 of awake mice can be dissociated into auditory and behavioral components with distinct spatiotemporal profiles. The auditory component began at approximately 27 ms, was found in superficial and deep layers and originated from auditory cortex. Sound-evoked orofacial movements correlated with V1 neural activity starting at approximately 80-100 ms and explained auditory frequency tuning. Visual, auditory and motor activity were expressed by different laminar profiles and largely segregated subsets of neuronal populations. During simultaneous audiovisual stimulation, visual representations remained dissociable from auditory-related and motor-related activity. This three-fold dissociability of auditory, motor and visual processing is central to understanding how distinct inputs to visual cortex interact to support vision.
Affiliation(s)
- Matthijs N Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Champalimaud Neuroscience Programme, Champalimaud Foundation, Lisbon, Portugal
- Pietro Marchesi
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
- Cyriel M A Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, Faculty of Science, University of Amsterdam, Amsterdam, Netherlands
- Research Priority Area Brain and Cognition, University of Amsterdam, Amsterdam, Netherlands
5. Ku Y, Zhou Y. Crossmodal Associations and Working Memory in the Brain. Adv Exp Med Biol 2024; 1437:91-100. PMID: 38270855. DOI: 10.1007/978-981-99-7611-9_6.
Abstract
Crossmodal associations between stimuli from different sensory modalities can emerge in non-synesthetic people and be stored in working memory to guide goal-directed behaviors. This chapter reviews a plethora of studies in this field to summarize where, when, and how crossmodal associations and working memory are processed. In brain regions traditionally considered unimodal primary sensory areas, neural activity can be influenced by crossmodal sensory signals at a very early stage of information processing, a phenomenon that cannot be attributed to feedback projections from higher-level associative areas. Processing then proceeds in associative cortical areas, including the posterior parietal cortex and prefrontal cortex. Neural oscillations in multiple frequency bands may reflect brain activity during crossmodal associations, and neural synchrony is likely related to the underlying neural mechanisms. Primary sensory areas and associative areas coordinate through neural synchrony to form crossmodal associations and to guide working memory performance.
Affiliation(s)
- Yixuan Ku
- Department of Psychology, Center for Brain and Mental Well-being, Sun Yat-sen University, Guangzhou, China
- Peng Cheng Laboratory, Shenzhen, China
- Yongdi Zhou
- School of Psychology, Shenzhen University, Shenzhen, China
6. Landelle C, Caron-Guyon J, Nazarian B, Anton J, Sein J, Pruvost L, Amberg M, Giraud F, Félician O, Danna J, Kavounoudias A. Beyond sense-specific processing: decoding texture in the brain from touch and sonified movement. iScience 2023; 26:107965. PMID: 37810223. PMCID: PMC10551894. DOI: 10.1016/j.isci.2023.107965.
Abstract
Texture, a fundamental object attribute, is perceived through multisensory information, including touch and auditory cues. Coherent perception may rely on texture representations that are shared across the different senses in the brain. To test this hypothesis, we delivered haptic textures coupled with a sound synthesizer to generate real-time textural sounds. Participants completed roughness estimation tasks with haptic, auditory, or bimodal cues in an MRI scanner. Somatosensory, auditory, and visual cortices were all activated during haptic and auditory exploration, challenging the traditional view that primary sensory cortices are sense-specific. Furthermore, audio-tactile integration was found in secondary somatosensory (S2) and primary auditory cortices. Multivariate analyses revealed shared spatial activity patterns in primary motor and somatosensory cortices for discriminating texture across both modalities. This study indicates that primary areas and S2 have a versatile representation of multisensory textures, which has significant implications for how the brain processes multisensory cues to interact more efficiently with our environment.
Affiliation(s)
- C. Landelle
- McGill University, McConnell Brain Imaging Centre, Department of Neurology and Neurosurgery, Montreal Neurological Institute, Montreal, QC, Canada
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- J. Caron-Guyon
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- University of Louvain, Institute for Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), Louvain Bionics Center, Crossmodal Perception and Plasticity Laboratory, Louvain-la-Neuve, Belgium
- B. Nazarian
- Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- J.L. Anton
- Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- J. Sein
- Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- L. Pruvost
- Aix-Marseille Université, CNRS, Perception, Représentations, Image, Son, Musique, PRISM UMR 7061, Marseille, France
- M. Amberg
- Université Lille, Laboratoire d'Electrotechnique et d'Electronique de Puissance, EA 2697-L2EP, Lille, France
- F. Giraud
- Université Lille, Laboratoire d'Electrotechnique et d'Electronique de Puissance, EA 2697-L2EP, Lille, France
- O. Félician
- Aix Marseille Université, INSERM, Institut des Neurosciences des Systèmes, INS UMR 1106, Marseille, France
- J. Danna
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- Université de Toulouse, CNRS, Laboratoire Cognition, Langues, Langage, Ergonomie, CLLE UMR5263, Toulouse, France
- A. Kavounoudias
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
7. Pennartz CMA, Oude Lohuis MN, Olcese U. How 'visual' is the visual cortex? The interactions between the visual cortex and other sensory, motivational and motor systems as enabling factors for visual perception. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220336. PMID: 37545313. PMCID: PMC10404929. DOI: 10.1098/rstb.2022.0336.
Abstract
The definition of the visual cortex is primarily based on the evidence that lesions of this area impair visual perception. However, this does not exclude the possibility that the visual cortex processes more than information of retinal origin alone, or that other brain structures contribute to vision. Indeed, research across the past decades has shown that non-visual information, such as neural activity related to reward expectation and value, locomotion, working memory and other sensory modalities, can modulate primary visual cortical responses to retinal inputs. Nevertheless, the function of this non-visual information is poorly understood. Here we review recent evidence, coming primarily from studies in rodents, arguing that non-visual and motor effects in visual cortex play a role in visual processing itself, for instance by disentangling direct auditory effects on visual cortex from the effects of sound-evoked orofacial movement. These findings are placed in a broader framework casting vision in terms of predictive processing under control of frontal, reward- and motor-related systems. In contrast to the prevalent notion that vision is exclusively constructed by the visual cortical system, we propose that visual percepts are generated by a larger network, the extended visual system, spanning other sensory cortices, supramodal areas and frontal systems. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Cyriel M. A. Pennartz
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Matthijs N. Oude Lohuis
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Champalimaud Research, Champalimaud Foundation, 1400-038 Lisbon, Portugal
- Umberto Olcese
- Cognitive and Systems Neuroscience Group, Swammerdam Institute for Life Sciences, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Science Park 904, 1098XH Amsterdam, The Netherlands
8. Post S, Mol W, Abu-Wishah O, Ali S, Rahmatullah N, Goel A. Multimodal Temporal Pattern Discrimination Is Encoded in Visual Cortical Dynamics. eNeuro 2023; 10:ENEURO.0047-23.2023. PMID: 37487713. PMCID: PMC10368206. DOI: 10.1523/eneuro.0047-23.2023.
Abstract
Discriminating between temporal features in sensory stimuli is critical to complex behavior and decision-making. However, how sensory cortical circuit mechanisms contribute to discrimination between subsecond temporal components in sensory events is unclear. To elucidate the mechanistic underpinnings of timing in primary visual cortex (V1), we recorded from V1 using two-photon calcium imaging in awake-behaving mice performing a go/no-go discrimination timing task composed of patterns of subsecond audiovisual stimuli. In both go and no-go conditions, activity during the early stimulus period was temporally coordinated with the preferred stimulus. However, while network activity increased in the preferred condition, network activity was increasingly suppressed in the nonpreferred condition over the stimulus period. Multiple levels of analyses suggest that discrimination between subsecond intervals that are contained in rhythmic patterns can be accomplished by local neural dynamics in V1.
Affiliation(s)
- Sam Post
- Department of Psychology, University of California, Riverside, Riverside, California 92521
- William Mol
- Department of Psychology, University of California, Riverside, Riverside, California 92521
- Omar Abu-Wishah
- Department of Psychology, University of California, Riverside, Riverside, California 92521
- Shazia Ali
- Department of Psychology, University of California, Riverside, Riverside, California 92521
- Noorhan Rahmatullah
- Department of Psychology, University of California, Riverside, Riverside, California 92521
- Anubhuti Goel
- Department of Psychology, University of California, Riverside, Riverside, California 92521
9. Gurariy G, Randall R, Greenberg AS. Neuroimaging evidence for the direct role of auditory scene analysis in object perception. Cereb Cortex 2023; 33:6257-6272. PMID: 36562994. PMCID: PMC10183742. DOI: 10.1093/cercor/bhac501.
Abstract
Auditory Scene Analysis (ASA) refers to the grouping of acoustic signals into auditory objects. Previously, we have shown that the perceived musicality of auditory sequences varies with high-level organizational features. Here, we explore the neural mechanisms mediating ASA and auditory object perception. Participants performed musicality judgments on randomly generated pure-tone sequences and on manipulated versions of each sequence containing low-level changes (amplitude; timbre). Low-level manipulations affected auditory object perception, as evidenced by changes in musicality ratings. fMRI was used to measure neural activation to the sequences rated most and least musical and to the altered versions of each sequence. Next, we generated two partially overlapping networks: (i) a music processing network (music localizer) and (ii) an ASA network (base sequences vs. ASA-manipulated sequences). Using Representational Similarity Analysis, we correlated the functional profiles of each ROI to a model generated from behavioral musicality ratings as well as to models corresponding to low-level feature processing and music perception. Within overlapping regions, areas near primary auditory cortex correlated with low-level ASA models, whereas right IPS correlated with musicality ratings. The shared neural mechanisms that correlate with behavior and underlie both ASA and music perception suggest that low-level features of auditory stimuli play a role in auditory object perception.
Affiliation(s)
- Gennadiy Gurariy
- Department of Biomedical Engineering, Medical College of Wisconsin and Marquette University, 8701 W Watertown Plank Rd, Milwaukee, WI 53233, United States
- Richard Randall
- School of Music and Neuroscience Institute, Carnegie Mellon University, Pittsburgh, PA 15213, United States
- Adam S Greenberg
- Department of Biomedical Engineering, Medical College of Wisconsin and Marquette University, 8701 W Watertown Plank Rd, Milwaukee, WI 53233, United States
10. Sciortino P, Kayser C. Steady state visual evoked potentials reveal a signature of the pitch-size crossmodal association in visual cortex. Neuroimage 2023; 273:120093. PMID: 37028733. DOI: 10.1016/j.neuroimage.2023.120093.
Abstract
Crossmodal correspondences describe our tendency to associate sensory features from different modalities with each other, such as the pitch of a sound with the size of a visual object. While such crossmodal correspondences (or associations) are described in many behavioural studies, their neurophysiological correlates remain unclear. Under the current working model of multisensory perception, both a low- and a high-level account seem plausible. That is, the neurophysiological processes shaping these associations could commence in low-level sensory regions, or may predominantly emerge in high-level association regions of semantic and object identification networks. We exploited steady-state visual evoked potentials (SSVEP) to directly probe this question, focusing on the associations between pitch and the visual features of size, hue or chromatic saturation. We found that SSVEPs over occipital regions are sensitive to the congruency between pitch and size, and a source analysis pointed to an origin around primary visual cortices. We speculate that this signature of the pitch-size association in low-level visual cortices reflects the successful pairing of congruent visual and acoustic object properties and may contribute to establishing causal relations between multisensory objects. Our study also provides a paradigm that can be exploited to study other crossmodal associations involving visual stimuli in the future.
11. Xiao YJ, Wang L, Liu YZ, Chen J, Zhang H, Gao Y, He H, Zhao Z, Wang Z. Excitatory Crossmodal Input to a Widespread Population of Primary Sensory Cortical Neurons. Neurosci Bull 2022; 38:1139-1152. PMID: 35429324. PMCID: PMC9554107. DOI: 10.1007/s12264-022-00855-4.
Abstract
Crossmodal information processing in sensory cortices has been reported in sparsely distributed neurons under normal conditions and can undergo experience- or activity-induced plasticity. Given the potential role in brain function indicated by previous reports, crossmodal connectivity in the sensory cortex needs to be further explored. Using perforated whole-cell recording in anesthetized adult rats, we found that almost all neurons recorded in the primary somatosensory, auditory, and visual cortices exhibited significant membrane-potential responses to crossmodal stimulation, recorded while brain activity was pharmacologically down-regulated under light anesthesia. These crossmodal cortical responses were excitatory and subthreshold, and appeared to be relayed primarily by the sensory thalamus, but not the sensory cortex, of the stimulated modality. Our experiments indicate a sensory cortical presence of widespread excitatory crossmodal inputs, which might play roles in brain functions involving crossmodal information processing or plasticity.
Affiliation(s)
- Yuan-Jie Xiao
- Institute and Key Laboratory of Brain Functional Genomics of the Chinese Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics, School of Life Sciences, East China Normal University, Shanghai, 200062, China
- Lidan Wang
- Institute and Key Laboratory of Brain Functional Genomics of the Chinese Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics, School of Life Sciences, East China Normal University, Shanghai, 200062, China
- Yu-Zhang Liu
- Institute and Key Laboratory of Brain Functional Genomics of the Chinese Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics, School of Life Sciences, East China Normal University, Shanghai, 200062, China
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, 15260, USA
- Jiayu Chen
- Institute and Key Laboratory of Brain Functional Genomics of the Chinese Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics, School of Life Sciences, East China Normal University, Shanghai, 200062, China
- Haoyu Zhang
- Institute and Key Laboratory of Brain Functional Genomics of the Chinese Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics, School of Life Sciences, East China Normal University, Shanghai, 200062, China
- Yan Gao
- Institute and Key Laboratory of Brain Functional Genomics of the Chinese Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics, School of Life Sciences, East China Normal University, Shanghai, 200062, China
- Hua He
- Department of Neurosurgery, Third Affiliated Hospital of the Navy Military Medical University, Shanghai, 200438, China
- Zheng Zhao
- Institute and Key Laboratory of Brain Functional Genomics of the Chinese Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics, School of Life Sciences, East China Normal University, Shanghai, 200062, China
- Zhiru Wang
- Institute and Key Laboratory of Brain Functional Genomics of the Chinese Ministry of Education, Shanghai Key Laboratory of Brain Functional Genomics, School of Life Sciences, East China Normal University, Shanghai, 200062, China
12. Skirzewski M, Molotchnikoff S, Hernandez LF, Maya-Vetencourt JF. Multisensory Integration: Is Medial Prefrontal Cortex Signaling Relevant for the Treatment of Higher-Order Visual Dysfunctions? Front Mol Neurosci 2022; 14:806376. PMID: 35110996. PMCID: PMC8801884. DOI: 10.3389/fnmol.2021.806376.
Abstract
In the mammalian brain, information processing in sensory modalities and global mechanisms of multisensory integration facilitate perception. Emerging experimental evidence suggests that the contribution of multisensory integration to sensory perception is far more complex than previously expected. Here we review how associative areas such as the prefrontal cortex, which receive and integrate inputs from diverse sensory modalities, can affect information processing in unisensory systems via downstream signaling. We focus our attention on the influence of the medial prefrontal cortex on the processing of information in the visual system and on whether this phenomenon can be clinically exploited to treat higher-order visual dysfunctions. We propose that non-invasive and multisensory stimulation strategies such as environmental enrichment and/or attention-related tasks could be of clinical relevance in treating cerebral visual impairment.
Affiliation(s)
- Miguel Skirzewski
- Rodent Cognition Research and Innovation Core, University of Western Ontario, London, ON, Canada
- Stéphane Molotchnikoff
- Département de Sciences Biologiques, Université de Montréal, Montreal, QC, Canada
- Département de Génie Electrique et Génie Informatique, Université de Sherbrooke, Sherbrooke, QC, Canada
- Luis F. Hernandez
- Knoebel Institute for Healthy Aging, University of Denver, Denver, CO, United States
- José Fernando Maya-Vetencourt
- Department of Biology, University of Pisa, Pisa, Italy
- Centre for Synaptic Neuroscience, Istituto Italiano di Tecnologia (IIT), Genova, Italy
- *Correspondence: José Fernando Maya-Vetencourt
13
Abstract
Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli. However, it is not well understood how these interactions are mediated or at what level of the processing hierarchy they occur. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices in mice. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to experience-dependent suppression of visual responses in primary visual cortex (V1). Auditory cortex axons carry a mixture of auditory and retinotopically matched visual input to V1, and optogenetic stimulation of these axons selectively suppresses V1 neurons that are responsive to the associated visual stimulus after, but not before, learning. Our results suggest that cross-modal associations can be communicated by long-range cortical connections and that, with learning, these cross-modal connections function to suppress responses to predictable input.
14
Abstract
Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual-auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function. The auditory cortex and inferior colliculus are two key points of entry where visual signals reach the auditory pathway, and both contain visual- and/or eye movement-related signals in humans and other animals. The visual signals observed in these auditory structures reflect a mixture of visual modulation of auditory-evoked activity and visually driven responses that are selective for stimulus location or features. These key response attributes also appear in the classic visual pathway but may play a different role in the auditory pathway: to modify auditory rather than visual perception. Finally, while this review focuses on two particular areas of the auditory pathway where this question has been studied, robust descending as well as ascending connections within this pathway suggest that undiscovered visual signals may be present at other stages as well. Expected final online publication date for the Annual Review of Vision Science, Volume 7 is September 2021. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
Affiliation(s)
- Meredith N Schmehl
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA
- Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA
- Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
- Jennifer M Groh
- Department of Neurobiology, Duke University, Durham, North Carolina 27708, USA
- Department of Psychology & Neuroscience, Duke University, Durham, North Carolina 27708, USA
- Department of Computer Science, Duke University, Durham, North Carolina 27708, USA
- Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708, USA
- Center for Cognitive Neuroscience, Duke University, Durham, North Carolina 27708, USA
- Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
15
Audio-visual experience strengthens multisensory assemblies in adult mouse visual cortex. Nat Commun 2019; 10:5684. [PMID: 31831751] [PMCID: PMC6908602] [DOI: 10.1038/s41467-019-13607-2]
Abstract
We experience the world through multiple senses simultaneously. To better understand mechanisms of multisensory processing, we ask whether inputs from two senses (auditory and visual) can interact and drive plasticity in neural circuits of the primary visual cortex (V1). Using genetically encoded voltage and calcium indicators, we find that coincident audio-visual experience modifies both the supra- and subthreshold response properties of neurons in L2/3 of mouse V1. Specifically, we find that after audio-visual pairing, a subset of multimodal neurons develops enhanced auditory responses to the paired auditory stimulus. This cross-modal plasticity persists over days and is reflected in the strengthening of small functional networks of L2/3 neurons. We find that V1 processes coincident auditory and visual events by strengthening functional associations between feature-specific assemblies of multimodal neurons during bouts of sensory-driven co-activity, leaving a trace of multisensory experience in the cortical network.
16
Li Q, Xi Y, Zhang M, Liu L, Tang X. Distinct Mechanism of Audiovisual Integration With Informative and Uninformative Sound in a Visual Detection Task: A DCM Study. Front Comput Neurosci 2019; 13:59. [PMID: 31555115] [PMCID: PMC6727739] [DOI: 10.3389/fncom.2019.00059]
Abstract
Previous studies have shown that task-irrelevant auditory information can provide temporal cues for the detection of visual targets and improve visual perception; such sounds are called informative sounds. The neural mechanism of the integration of informative sound and visual stimulus has been investigated extensively, using behavioral measurement or neuroimaging methods such as functional magnetic resonance imaging (fMRI) and event-related potentials (ERP), but the dynamic processes of audiovisual integration cannot be characterized formally in terms of directed neuronal coupling. The present study adopts dynamic causal modeling (DCM) of fMRI data to identify changes in effective connectivity in the hierarchical brain networks that underwrite audiovisual integration and memory. This allows us to characterize context-sensitive changes in neuronal coupling and show how visual processing is contextualized by the processing of informative and uninformative sounds. Our results show that audiovisual integration conforms to a different optimal model in each of the two conditions, indicating distinct neural mechanisms of audiovisual integration. The findings also reveal that integration of an uninformative sound reflects low-level automatic audiovisual processing, whereas integration of an informative sound involves high-level cognitive processes.
Collapse
Affiliation(s)
- Qi Li
- School of Computer Science and Technology, Changchun University of Science and Technology, Changchun, China
- Yang Xi
- School of Computer Science and Technology, Changchun University of Science and Technology, Changchun, China
- School of Computer Science, Northeast Electric Power University, Jilin, China
- Mengchao Zhang
- Department of Radiology, China-Japan Union Hospital of Jilin University, Changchun, China
- Lin Liu
- Department of Radiology, China-Japan Union Hospital of Jilin University, Changchun, China
- Xiaoyu Tang
- School of Psychology, Liaoning Normal University, Dalian, China
17
Störmer VS. Orienting spatial attention to sounds enhances visual processing. Curr Opin Psychol 2019; 29:193-198. [PMID: 31022562] [DOI: 10.1016/j.copsyc.2019.03.010]
Abstract
Attention, the mechanism by which information is selected for further processing, has mostly been studied within the visual system. While this research has been exceptionally successful, it is important to understand how attention operates across the sensory modalities. This review focuses on recent studies showing that orienting to a peripheral, salient sound affects visual processing: it enhances visual perception, boosts visual-cortical responses, and modulates visual cortex activity before the appearance of a visual object. Critically, all of these effects are spatially selective, indicating that spatial attention facilitates perceptual processing at an attended location across sensory modalities. The neural changes in visual cortex triggered by the sounds not only resemble some of the neural modulations reported in uni-modal visual attention studies, but also reveal some important differences.
Affiliation(s)
- Viola S Störmer
- Department of Psychology, University of California, San Diego, United States.
18
Stronger responses in the visual cortex of sighted compared to blind individuals during auditory space representation. Sci Rep 2019; 9:1935. [PMID: 30760758] [PMCID: PMC6374481] [DOI: 10.1038/s41598-018-37821-y]
Abstract
It has been previously shown that the interaction between vision and audition involves early sensory cortices. However, the functional role of these interactions and their modulation due to sensory impairment are not yet understood. To shed light on the impact of vision on auditory spatial processing, we recorded ERPs and collected psychophysical responses during space and time bisection tasks in sighted and blind participants. They listened to three consecutive sounds and judged whether the second sound was spatially or temporally further from the first or the third sound. We demonstrate that spatial metric representation of sounds elicits an early response of the visual cortex (P70) that differs between sighted and visually deprived individuals. Indeed, the P70 is strongly selective for the spatial position of sounds only in sighted, and not in blind, people, mimicking many aspects of the visual-evoked C1. These results suggest that early auditory processing associated with the construction of spatial maps is mediated by visual experience. The lack of vision might impair the projection of multi-sensory maps onto the retinotopic maps used by the visual cortex.
19
How Senses Work Together: Cross-Modal Interactions between Primary Sensory Cortices. Neural Plast 2018; 2018:5380921. [PMID: 30647732] [PMCID: PMC6311735] [DOI: 10.1155/2018/5380921]
Abstract
On our way through a town, the things we see can make us change the way we go. The things that we hear can make us stop or walk on, or the things we feel can cause us to wear a warm jacket or just a t-shirt. All these behaviors are mediated by highly complex processing mechanisms in our brain and reflect responses to many important sensory inputs. The mammalian cerebral cortex, which processes the sensory information, consists of largely specialized sensory areas mainly receiving information from their corresponding sensory modalities. The first cortical regions receiving the input from the outer world are the so called primary sensory cortices. Strikingly, there is convincing evidence that primary sensory cortices do not work in isolation but are substantially affected by other sensory modalities. Here, we will review previous and current literature on this cross-modal interplay.
20
Effect of acceleration of auditory inputs on the primary somatosensory cortex in humans. Sci Rep 2018; 8:12883. [PMID: 30150686] [PMCID: PMC6110726] [DOI: 10.1038/s41598-018-31319-3]
Abstract
Cross-modal interaction occurs during the early stages of processing in the sensory cortex; however, its effect on the speed of neuronal activity remains unclear. We used magnetoencephalography to investigate whether auditory stimulation influences the initial cortical activity in the primary somatosensory cortex. A 25-ms pure tone was randomly presented to the left or right side of healthy volunteers at 1000 ms while electrical pulses were applied to the left or right median nerve at 20 Hz for 1500 ms; a pulse train was used because we did not observe any cross-modal effect elicited by a single pulse. The latency of N20m originating from Brodmann's area 3b was measured for each pulse. The auditory stimulation significantly shortened the N20m latency at 1050 and 1100 ms. This reduction in N20m latency was identical for the ipsilateral and contralateral sounds at both latency points. Therefore, somatosensory-auditory interaction, such as input to area 3b from the thalamus, occurred during the early stages of synaptic transmission. Auditory information that converged on the somatosensory system was considered to have arisen from the early stages of the feedforward pathway. Acceleration of information processing through cross-modal interaction seemed to be partly due to faster processing in the sensory cortex.
21
Spatial localization of sound elicits early responses from occipital visual cortex in humans. Sci Rep 2017; 7:10415. [PMID: 28874681] [PMCID: PMC5585168] [DOI: 10.1038/s41598-017-09142-z]
Abstract
Much evidence points to an interaction between vision and audition at early cortical sites. However, the functional role of these interactions is not yet understood. Here we show an early response of the occipital cortex to sound that is strongly linked to the spatial localization task performed by the observer. The early occipital response to a sound, usually absent, increased more than 10-fold when the sound was presented during a space localization task, but not during a time localization task. The response amplification was not only specific to the task but, surprisingly, also to the position of the stimulus in the two hemifields. We suggest that early occipital processing of sound is linked to the construction of an audio spatial map that may utilize the visual map of the occipital cortex.
22
Ursino M, Cuppini C, Magosso E. Multisensory Bayesian Inference Depends on Synapse Maturation during Training: Theoretical Analysis and Neural Modeling Implementation. Neural Comput 2017; 29:735-782. [DOI: 10.1162/neco_a_00935]
Abstract
Recent theoretical and experimental studies suggest that in multisensory conditions, the brain performs a near-optimal Bayesian estimate of external events, giving more weight to the more reliable stimuli. However, the neural mechanisms responsible for this behavior, and its progressive maturation in a multisensory environment, are still insufficiently understood. The aim of this letter is to analyze this problem with a neural network model of audiovisual integration, based on probabilistic population coding—the idea that a population of neurons can encode probability functions to perform Bayesian inference. The model consists of two chains of unisensory neurons (auditory and visual) topologically organized. They receive the corresponding input through a plastic receptive field and reciprocally exchange plastic cross-modal synapses, which encode the spatial co-occurrence of visual-auditory inputs. A third chain of multisensory neurons performs a simple sum of auditory and visual excitations. The work includes a theoretical part and a computer simulation study. We show how a simple rule for synapse learning (consisting of Hebbian reinforcement and a decay term) can be used during training to shrink the receptive fields and encode the unisensory likelihood functions. Hence, after training, each unisensory area realizes a maximum likelihood estimate of stimulus position (auditory or visual). In cross-modal conditions, the same learning rule can encode information on prior probability into the cross-modal synapses. Computer simulations confirm the theoretical results and show that the proposed network can realize a maximum likelihood estimate of auditory (or visual) positions in unimodal conditions and a Bayesian estimate, with moderate deviations from optimality, in cross-modal conditions. 
Furthermore, the model explains the ventriloquism illusion and, looking at the activity in the multimodal neurons, explains the automatic reweighting of auditory and visual inputs on a trial-by-trial basis, according to the reliability of the individual cues.
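The learning rule described in this abstract (Hebbian reinforcement plus a decay term, strengthening synapses for co-occurring audio-visual positions) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the network size, learning rate, and decay constant are assumptions chosen for demonstration:

```python
import numpy as np

def hebbian_decay_update(w, pre, post, lr=0.01, decay=0.001):
    """One training step: Hebbian reinforcement minus passive decay.

    w    : (n_post, n_pre) cross-modal synaptic weight matrix
    pre  : (n_pre,)  presynaptic (e.g. auditory) activity
    post : (n_post,) postsynaptic (e.g. visual) activity
    """
    w = w + lr * np.outer(post, pre) - decay * w
    return np.clip(w, 0.0, None)  # weights stay non-negative

# Toy training loop: audio-visual stimuli repeatedly co-occur at the
# same spatial position, as in the model's cross-modal training phase.
rng = np.random.default_rng(0)
n = 5
w = rng.uniform(0.0, 0.1, size=(n, n))  # initial diffuse cross-modal synapses
for _ in range(500):
    stim = rng.integers(n)               # shared audio-visual position
    pre = np.zeros(n);  pre[stim] = 1.0
    post = np.zeros(n); post[stim] = 1.0
    w = hebbian_decay_update(w, pre, post)

# After training, synapses linking co-activated positions (the diagonal)
# are reinforced, while the rest decay toward zero: the weight matrix
# comes to encode the spatial co-occurrence statistics of the inputs.
```

The same decay-against-reinforcement balance is what lets the receptive fields shrink during training in the full model: connections that are never co-activated are gradually pruned by the decay term.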
Affiliation(s)
- Mauro Ursino
- Department of Electrical, Electronic and Information Engineering, University of Bologna, I-40136 Bologna, Italy
- Cristiano Cuppini
- Department of Electrical, Electronic and Information Engineering, University of Bologna, I-40136 Bologna, Italy
- Elisa Magosso
- Department of Electrical, Electronic and Information Engineering, University of Bologna, I-40136 Bologna, Italy
23
Roy A. The Theory of Localist Representation and of a Purely Abstract Cognitive System: The Evidence from Cortical Columns, Category Cells, and Multisensory Neurons. Front Psychol 2017; 8:186. [PMID: 28261127] [PMCID: PMC5311062] [DOI: 10.3389/fpsyg.2017.00186]
Abstract
The debate about representation in the brain and the nature of the cognitive system has been going on for decades now. This paper examines the neurophysiological evidence, primarily from single cell recordings, to get a better perspective on both the issues. After an initial review of some basic concepts, the paper reviews the data from single cell recordings - in cortical columns and of category-selective and multisensory neurons. In neuroscience, columns in the neocortex (cortical columns) are understood to be a basic functional/computational unit. The paper reviews the fundamental discoveries about the columnar organization and finds that it reveals a massively parallel search mechanism. This columnar organization could be the most extensive neurophysiological evidence for the widespread use of localist representation in the brain. The paper also reviews studies of category-selective cells. The evidence for category-selective cells reveals that localist representation is also used to encode complex abstract concepts at the highest levels of processing in the brain. A third major issue is the nature of the cognitive system in the brain and whether there is a form that is purely abstract and encoded by single cells. To provide evidence for a single-cell based purely abstract cognitive system, the paper reviews some of the findings related to multisensory cells. It appears that there is widespread usage of multisensory cells in the brain in the same areas where sensory processing takes place. Plus there is evidence for abstract modality invariant cells at higher levels of cortical processing. Overall, that reveals the existence of a purely abstract cognitive system in the brain. The paper also argues that since there is no evidence for dense distributed representation and since sparse representation is actually used to encode memories, there is actually no evidence for distributed representation in the brain. 
Overall, it appears that, at an abstract level, the brain is a massively parallel, distributed computing system that is symbolic. The paper also explains how grounded cognition and other theories of the brain are fully compatible with localist representation and a purely abstract cognitive system.
Affiliation(s)
- Asim Roy
- Department of Information Systems, Arizona State University, Tempe, AZ, USA
24
Petro LS, Paton AT, Muckli L. Contextual modulation of primary visual cortex by auditory signals. Philos Trans R Soc Lond B Biol Sci 2017; 372:20160104. [PMID: 28044015] [PMCID: PMC5206272] [DOI: 10.1098/rstb.2016.0104]
Abstract
Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195–201 (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256–1262 (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints of auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1. We speculate which cellular mechanisms allow V1 to be contextually modulated by auditory input to facilitate perception, cognition and behaviour. Beyond cortical feedback that facilitates perception, we argue that there is also feedback serving counterfactual processing during imagery, dreaming and mind wandering, which is not relevant for immediate perception but for behaviour and cognition over a longer time frame. This article is part of the themed issue ‘Auditory and visual scene analysis’.
Affiliation(s)
- L S Petro
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, 58 Hillhead Street, Glasgow G12 8QB, UK
- A T Paton
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, 58 Hillhead Street, Glasgow G12 8QB, UK
- L Muckli
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, 58 Hillhead Street, Glasgow G12 8QB, UK
25
Pantev C, Paraskevopoulos E, Kuchenbuch A, Lu Y, Herholz SC. Musical expertise is related to neuroplastic changes of multisensory nature within the auditory cortex. Eur J Neurosci 2015; 41:709-17. [PMID: 25728187] [DOI: 10.1111/ejn.12788]
Abstract
Recent neuroscientific evidence indicates that multisensory integration does not only occur in higher level association areas of the cortex as the hierarchical models of sensory perception assumed, but also in regions traditionally thought of as unisensory, such as the auditory cortex. Nevertheless, it is not known whether expertise-induced neuroplasticity can alter the multisensory processing that occurs in these low-level regions. The present study used magnetoencephalography to investigate whether musical training may induce neuroplastic changes of multisensory processing within the human auditory cortex. Magnetoencephalography data of four different experiments were used to demonstrate the effect of long-term and short-term musical training on the integration of auditory, somatosensory and visual stimuli in the auditory cortex. The cross-sectional design of three of the experiments allowed us to infer that long-term musical training is related to a significantly different way of processing multisensory information within the auditory cortex, whereas the short-term training design of the fourth experiment allowed us to causally infer that multisensory music reading training affects the multimodal processing within the auditory cortex.
Affiliation(s)
- Christo Pantev
- Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, D-48149 Münster, Germany
- Evangelos Paraskevopoulos
- Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, D-48149 Münster, Germany
- Faculty of Health Sciences, School of Medicine, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Anja Kuchenbuch
- Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, D-48149 Münster, Germany
- Yao Lu
- Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, D-48149 Münster, Germany
26
Reading in the dark: neural correlates and cross-modal plasticity for learning to read entire words without visual experience. Neuropsychologia 2015; 83:149-160. [PMID: 26577136] [DOI: 10.1016/j.neuropsychologia.2015.11.009]
Abstract
Cognitive neuroscience has long attempted to determine the ways in which cortical selectivity develops, and the impact of nature vs. nurture on it. Congenital blindness (CB) offers a unique opportunity to test this question as the brains of blind individuals develop without visual experience. Here we approach this question through the reading network. Several areas in the visual cortex have been implicated as part of the reading network, and one of the main ones among them is the VWFA, which is selective to the form of letters and words. But what happens in the CB brain? On the one hand, it has been shown that cross-modal plasticity leads to the recruitment of occipital areas, including the VWFA, for linguistic tasks. On the other hand, we have recently demonstrated VWFA activity for letters in contrast to other visual categories when the information is provided via other senses such as touch or audition. Which of these tasks is more dominant? By which mechanism does the CB brain process reading? Using fMRI and visual-to-auditory sensory substitution which transfers the topographical features of the letters we compare reading with semantic and scrambled conditions in a group of CB. We found activation in early auditory and visual cortices during the early processing phase (letter), while the later phase (word) showed VWFA and bilateral dorsal-intraparietal activations for words. This further supports the notion that many visual regions in general, even early visual areas, also maintain a predilection for task processing even when the modality is variable and in spite of putative lifelong linguistic cross-modal plasticity. Furthermore, we find that the VWFA is recruited preferentially for letter and word form, while it was not recruited, and even exhibited deactivation, for an immediately subsequent semantic task suggesting that despite only short sensory substitution experience orthographic task processing can dominate semantic processing in the VWFA. 
On a wider scope, this implies that at least in some cases cross-modal plasticity which enables the recruitment of areas for new tasks may be dominated by sensory independent task specific activation.
27
Wang X, Peelen MV, Han Z, He C, Caramazza A, Bi Y. How Visual Is the Visual Cortex? Comparing Connectional and Functional Fingerprints between Congenitally Blind and Sighted Individuals. J Neurosci 2015; 35:12545-59. [PMID: 26354920] [PMCID: PMC6605405] [DOI: 10.1523/jneurosci.3914-14.2015]
Abstract
Classical animal visual deprivation studies and human neuroimaging studies have shown that visual experience plays a critical role in shaping the functionality and connectivity of the visual cortex. Interestingly, recent studies have additionally reported circumscribed regions in the visual cortex in which functional selectivity was remarkably similar in individuals with and without visual experience. Here, by directly comparing resting-state and task-based fMRI data in congenitally blind and sighted human subjects, we obtained large-scale continuous maps of the degree to which connectional and functional "fingerprints" of ventral visual cortex depend on visual experience. We found a close agreement between connectional and functional maps, pointing to a strong interdependence of connectivity and function. Visual experience (or the absence thereof) had a pronounced effect on the resting-state connectivity and functional response profile of occipital cortex and the posterior lateral fusiform gyrus. By contrast, connectional and functional fingerprints in the anterior medial and posterior lateral parts of the ventral visual cortex were statistically indistinguishable between blind and sighted individuals. These results provide a large-scale mapping of the influence of visual experience on the development of both functional and connectivity properties of visual cortex, which serves as a basis for the formulation of new hypotheses regarding the functionality and plasticity of specific subregions. Significance statement: How is the functionality and connectivity of the visual cortex shaped by visual experience? By directly comparing resting-state and task-based fMRI data in congenitally blind and sighted subjects, we obtained large-scale continuous maps of the degree to which connectional and functional "fingerprints" of ventral visual cortex depend on visual experience. 
In addition to revealing regions that are strongly dependent on visual experience (early visual cortex and posterior fusiform gyrus), our results showed regions in which connectional and functional patterns are highly similar in blind and sighted individuals (anterior medial and posterior lateral ventral occipital temporal cortex). These results serve as a basis for the formulation of new hypotheses regarding the functionality and plasticity of specific subregions of the visual cortex.
Affiliation(s)
- Xiaoying Wang
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, China
- Marius V Peelen
- Center for Mind/Brain Sciences, University of Trento, 38068 Rovereto, Italy
- Zaizhu Han
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, China
- Chenxi He
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, China
- Alfonso Caramazza
- Center for Mind/Brain Sciences, University of Trento, 38068 Rovereto, Italy, and Department of Psychology, Harvard University, Cambridge, Massachusetts 02138
- Yanchao Bi
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, China

28
Brang D, Towle VL, Suzuki S, Hillyard SA, Di Tusa S, Dai Z, Tao J, Wu S, Grabowecky M. Peripheral sounds rapidly activate visual cortex: evidence from electrocorticography. J Neurophysiol 2015; 114:3023-8. [PMID: 26334017 DOI: 10.1152/jn.00728.2015] [Citation(s) in RCA: 28] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/21/2015] [Accepted: 08/28/2015] [Indexed: 11/22/2022] Open
Abstract
Neurophysiological studies with animals suggest that sounds modulate activity in primary visual cortex in the presence of concurrent visual stimulation. Noninvasive neuroimaging studies in humans have similarly shown that sounds modulate activity in visual areas even in the absence of visual stimuli or visual task demands. However, the spatial and temporal limitations of these noninvasive methods prevent the determination of how rapidly sounds activate early visual cortex and what information about the sounds is relayed there. Using spatially and temporally precise measures of local synaptic activity acquired from depth electrodes in humans, we demonstrate that peripherally presented sounds evoke activity in the anterior portion of the contralateral, but not ipsilateral, calcarine sulcus within 28 ms of sound onset. These results suggest that auditory stimuli rapidly evoke spatially specific activity in visual cortex even in the absence of concurrent visual stimulation or visual task demands. This rapid auditory-evoked activation of primary visual cortex is likely to be mediated by subcortical pathways or direct cortical projections from auditory to visual areas.
Affiliation(s)
- David Brang
- Department of Psychology, Northwestern University, Evanston, Illinois; Interdepartmental Neuroscience Program, Northwestern University, Evanston, Illinois; Department of Neurology, University of Chicago, Chicago, Illinois
- Vernon L Towle
- Department of Neurology, University of Chicago, Chicago, Illinois
- Satoru Suzuki
- Department of Psychology, Northwestern University, Evanston, Illinois; Interdepartmental Neuroscience Program, Northwestern University, Evanston, Illinois
- Steven A Hillyard
- Department of Neurosciences, University of California, San Diego, La Jolla, California
- Senneca Di Tusa
- Department of Psychology, Northwestern University, Evanston, Illinois
- Zhongtian Dai
- Department of Neurology, University of Chicago, Chicago, Illinois
- James Tao
- Department of Neurology, University of Chicago, Chicago, Illinois
- Shasha Wu
- Department of Neurology, University of Chicago, Chicago, Illinois
- Marcia Grabowecky
- Department of Psychology, Northwestern University, Evanston, Illinois; Interdepartmental Neuroscience Program, Northwestern University, Evanston, Illinois

29
Murray MM, Thelen A, Thut G, Romei V, Martuzzi R, Matusz PJ. The multisensory function of the human primary visual cortex. Neuropsychologia 2015; 83:161-169. [PMID: 26275965 DOI: 10.1016/j.neuropsychologia.2015.08.011] [Citation(s) in RCA: 107] [Impact Index Per Article: 11.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2015] [Revised: 08/08/2015] [Accepted: 08/10/2015] [Indexed: 01/20/2023]
Abstract
It has been nearly 10 years since Ghazanfar and Schroeder (2006) proposed that the neocortex is essentially multisensory in nature. However, it is only recently that sufficient hard evidence to support this proposal has accrued. We review evidence that activity within the human primary visual cortex plays an active role in multisensory processes and directly impacts behavioural outcome. This evidence emerges from a full palette of human brain imaging and brain mapping methods with which multisensory processes are quantitatively assessed by taking advantage of the particular strengths of each technique as well as advances in signal analyses. Several general conclusions about multisensory processes in the primary visual cortex of humans are supported relatively solidly. First, haemodynamic methods (fMRI/PET) show that there is both convergence and integration occurring within primary visual cortex. Second, primary visual cortex is involved in multisensory processes during early post-stimulus stages (as revealed by EEG/ERP/ERFs as well as TMS). Third, multisensory effects in primary visual cortex directly impact behaviour and perception, as revealed by correlational (EEG/ERPs/ERFs) as well as more causal measures (TMS/tACS). While the provocative claim of Ghazanfar and Schroeder (2006) that the whole of neocortex is multisensory in function has yet to be demonstrated, this can now be considered established in the case of the human primary visual cortex.
Affiliation(s)
- Micah M Murray
- The Laboratory for Investigative Neurophysiology (The LINE), Neuropsychology and Neurorehabilitation Service and Department of Radiology, University Hospital Center and University of Lausanne, Lausanne, Switzerland; EEG Brain Mapping Core, Center for Biomedical Imaging (CIBM) of Lausanne and Geneva, Lausanne, Switzerland; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA
- Antonia Thelen
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA
- Gregor Thut
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow G12 8QB, United Kingdom
- Vincenzo Romei
- Centre for Brain Science, Department of Psychology, University of Essex, Colchester, United Kingdom
- Roberto Martuzzi
- Laboratory of Cognitive Neuroscience, Brain-Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Switzerland
- Pawel J Matusz
- The Laboratory for Investigative Neurophysiology (The LINE), Neuropsychology and Neurorehabilitation Service and Department of Radiology, University Hospital Center and University of Lausanne, Lausanne, Switzerland; Attention, Brain, and Cognitive Development Group, Department of Experimental Psychology, University of Oxford, United Kingdom

30
Ursino M, Cuppini C, Magosso E. Neurocomputational approaches to modelling multisensory integration in the brain: A review. Neural Netw 2014; 60:141-65. [DOI: 10.1016/j.neunet.2014.08.003] [Citation(s) in RCA: 35] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2014] [Revised: 08/05/2014] [Accepted: 08/07/2014] [Indexed: 10/24/2022]
31
Mun S, Kim ES, Park MC. Effect of mental fatigue caused by mobile 3D viewing on selective attention: an ERP study. Int J Psychophysiol 2014; 94:373-81. [PMID: 25194505 DOI: 10.1016/j.ijpsycho.2014.08.1389] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/22/2014] [Revised: 08/21/2014] [Accepted: 08/26/2014] [Indexed: 12/01/2022]
Abstract
This study investigated behavioral responses to and auditory event-related potential (ERP) correlates of mental fatigue caused by mobile three-dimensional (3D) viewing. Twenty-six participants (14 women) performed a selective attention task in which they were asked to respond to the sounds presented at the attended side while ignoring sounds at the ignored side before and after mobile 3D viewing. Considering different individual susceptibilities to 3D, participants' subjective fatigue data were used to categorize them into two groups: fatigued and unfatigued. The amplitudes of d-ERP components were defined as differences in amplitudes between time-locked brain oscillations of the attended and ignored sounds, and these values were used to calculate the degree to which spatial selective attention was impaired by 3D mental fatigue. The fatigued group showed significantly longer response times after mobile 3D viewing compared to before the viewing. However, response accuracy did not significantly change between the two conditions, implying that the participants used a behavioral strategy to cope with their performance accuracy decrement by increasing their response times. No significant differences were observed for the unfatigued group. Analysis of covariance revealed group differences with significant and trends toward significant decreases in the d-P200 and d-late positive potential (LPP) amplitudes at the occipital electrodes of the fatigued and unfatigued groups. Our findings indicate that mentally fatigued participants did not effectively block out distractors in their information processing mechanism, providing support for the hypothesis that 3D mental fatigue impairs spatial selective attention and is characterized by changes in d-P200 and d-LPP amplitudes.
Affiliation(s)
- Sungchul Mun
- Department of Human Computer Interaction and Robotics, Korea University of Science and Technology, Hwarangno 14-gil 5, Seongbuk-gu, Seoul 136-791, South Korea; Sensor System Research Center, Korea Institute of Science and Technology, Hwarangno 14-gil 5, Seongbuk-gu, Seoul 136-791, South Korea
- Eun-Soo Kim
- HoloDigilog Human Media Research Center, Kwangwoon University, Gwangun-ro 20, Nowon-gu, Seoul 139-701, South Korea
- Min-Chul Park
- Department of Human Computer Interaction and Robotics, Korea University of Science and Technology, Hwarangno 14-gil 5, Seongbuk-gu, Seoul 136-791, South Korea; Sensor System Research Center, Korea Institute of Science and Technology, Hwarangno 14-gil 5, Seongbuk-gu, Seoul 136-791, South Korea

32
Mainardi M, Di Garbo A, Caleo M, Berardi N, Sale A, Maffei L. Environmental enrichment strengthens corticocortical interactions and reduces amyloid-β oligomers in aged mice. Front Aging Neurosci 2014; 6:1. [PMID: 24478697 PMCID: PMC3899529 DOI: 10.3389/fnagi.2014.00001] [Citation(s) in RCA: 41] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2013] [Accepted: 01/03/2014] [Indexed: 11/13/2022] Open
Abstract
Brain aging is characterized by global changes that are thought to underlie age-related cognitive decline. These include variations in brain activity and the progressive increase in the concentration of soluble amyloid-β (Aβ) oligomers, directly impairing synaptic function and plasticity even in the absence of any neurodegenerative disorder. Considering the high social impact of the decline in brain performance associated with aging, there is an urgent need to better understand how it can be prevented or counteracted. Lifestyle components, such as social interaction, motor exercise and cognitive activity, are thought to modulate brain physiology and its susceptibility to age-related pathologies. However, the precise functional and molecular factors that respond to environmental stimuli and might mediate their protective action against pathological aging still need to be clearly identified. To address this issue, we exploited environmental enrichment (EE), a reliable model for studying the effect of experience on the brain based on the enhancement of cognitive, social and motor experience, in aged wild-type mice. We analyzed the functional consequences of EE on aged brain physiology by performing in vivo local field potential (LFP) recordings with chronic implants. We also investigated changes induced by EE on molecular markers of neural plasticity and on the levels of soluble Aβ oligomers. We report that EE induced profound changes in the activity of the primary visual and auditory cortices and in their functional interaction. At the molecular level, EE enhanced plasticity by an upward shift of the cortical excitation/inhibition balance. In addition, EE reduced brain Aβ oligomers and increased synthesis of the Aβ-degrading enzyme neprilysin. Our findings strengthen the potential of EE procedures as a non-invasive paradigm for counteracting brain aging processes.
Affiliation(s)
- Marco Mainardi
- Neuroscience Institute of the National Research Council, Pisa, Italy
- Angelo Di Garbo
- Biophysics Institute of the National Research Council, Pisa, Italy
- Matteo Caleo
- Neuroscience Institute of the National Research Council, Pisa, Italy
- Nicoletta Berardi
- Neuroscience Institute of the National Research Council, Pisa, Italy; Department of Neuroscience, Psychology, Drug Research and Child Health (NEUROFARBA), University of Florence, Florence, Italy
- Alessandro Sale
- Neuroscience Institute of the National Research Council, Pisa, Italy
- Lamberto Maffei
- Neuroscience Institute of the National Research Council, Pisa, Italy; Accademia dei Lincei, Roma, Italy

33
Henschke JU, Noesselt T, Scheich H, Budinger E. Possible anatomical pathways for short-latency multisensory integration processes in primary sensory cortices. Brain Struct Funct 2014; 220:955-77. [DOI: 10.1007/s00429-013-0694-4] [Citation(s) in RCA: 61] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/19/2013] [Accepted: 12/17/2013] [Indexed: 01/25/2023]
34
Sarko DK, Ghose D, Wallace MT. Convergent approaches toward the study of multisensory perception. Front Syst Neurosci 2013; 7:81. [PMID: 24265607 PMCID: PMC3820972 DOI: 10.3389/fnsys.2013.00081] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/10/2013] [Accepted: 10/20/2013] [Indexed: 11/13/2022] Open
Abstract
Classical analytical approaches for examining multisensory processing in individual neurons have relied heavily on changes in mean firing rate to assess the presence and magnitude of multisensory interaction. However, neurophysiological studies within individual sensory systems have illustrated that important sensory and perceptual information is encoded in forms that go beyond these traditional spike-based measures. Here we review analytical tools as they are used within individual sensory systems (auditory, somatosensory, and visual) to advance our understanding of how sensory cues are effectively integrated across modalities (e.g., audiovisual cues facilitating speech processing). Specifically, we discuss how methods used to assess response variability (Fano factor, or FF), local field potentials (LFPs), current source density (CSD), oscillatory coherence, spike synchrony, and receiver operating characteristics (ROC) represent particularly promising tools for understanding the neural encoding of multisensory stimulus features. The utility of each approach and how it might optimally be applied toward understanding multisensory processing is placed within the context of exciting new data that are just beginning to be generated. Finally, we address how underlying encoding mechanisms might shape, and be tested alongside, the known behavioral and perceptual benefits that accompany multisensory processing.
Affiliation(s)
- Diana K. Sarko
- Department of Anatomy, Cell Biology and Physiology, Edward Via College of Osteopathic Medicine, Spartanburg, SC, USA
- Dipanwita Ghose
- Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, TN, USA
- Mark T. Wallace
- Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA

35
Neural pathways conveying novisual information to the visual cortex. Neural Plast 2013; 2013:864920. [PMID: 23840972 PMCID: PMC3690246 DOI: 10.1155/2013/864920] [Citation(s) in RCA: 22] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2013] [Accepted: 05/22/2013] [Indexed: 11/18/2022] Open
Abstract
The visual cortex has traditionally been considered a stimulus-driven, unimodal system with a hierarchical organization. However, recent animal and human studies have shown that the visual cortex responds to non-visual stimuli, especially in congenitally visually deprived individuals, indicating the supramodal nature of functional representation in the visual cortex. To understand the neural substrates of the cross-modal processing of non-visual signals in the visual cortex, we first establish the supramodal nature of the visual cortex. We then review how non-visual signals reach the visual cortex, and discuss whether these non-visual pathways are reshaped by early visual deprivation. Finally, we discuss the open question of the nature (stimulus-driven or top-down) of non-visual signals.
36
Spence C. Just how important is spatial coincidence to multisensory integration? Evaluating the spatial rule. Ann N Y Acad Sci 2013; 1296:31-49. [DOI: 10.1111/nyas.12121] [Citation(s) in RCA: 115] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Affiliation(s)
- Charles Spence
- Department of Experimental Psychology, Oxford University

37
Van Barneveld DCPBM, Van Wanrooij MM. The influence of static eye and head position on the ventriloquist effect. Eur J Neurosci 2013; 37:1501-10. [PMID: 23463919 DOI: 10.1111/ejn.12176] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2012] [Revised: 12/20/2012] [Accepted: 01/30/2013] [Indexed: 11/28/2022]
Abstract
Orienting responses to audiovisual events have shorter reaction times and better accuracy and precision when images and sounds in the environment are aligned in space and time. How the brain constructs an integrated audiovisual percept is a computational puzzle because the auditory and visual senses are represented in different reference frames: the retina encodes visual locations with respect to the eyes; whereas the sound localisation cues are referenced to the head. In the well-known ventriloquist effect, the auditory spatial percept of the ventriloquist's voice is attracted toward the synchronous visual image of the dummy, but does this visual bias on sound localisation operate in a common reference frame by correctly taking into account eye and head position? Here we studied this question by independently varying initial eye and head orientations, and the amount of audiovisual spatial mismatch. Human subjects pointed head and/or gaze to auditory targets in elevation, and were instructed to ignore co-occurring visual distracters. Results demonstrate that different initial head and eye orientations are accurately and appropriately incorporated into an audiovisual response. Effectively, sounds and images are perceptually fused according to their physical locations in space independent of an observer's point of view. Implications for neurophysiological findings and modelling efforts that aim to reconcile sensory and motor signals for goal-directed behaviour are discussed.
Affiliation(s)
- Denise C P B M Van Barneveld
- Department of Biophysics, Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, P.O. Box 9010, 6500 GL, Nijmegen, The Netherlands

38
Sound Improves the Discrimination of Low-Intensity Light in the Visual Cortex of Rabbits. Neurosci Behav Physiol 2013. [DOI: 10.1007/s11055-013-9709-0] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
39
Pérez-Bellido A, Soto-Faraco S, López-Moliner J. Sound-driven enhancement of vision: disentangling detection-level from decision-level contributions. J Neurophysiol 2012; 109:1065-77. [PMID: 23221404 DOI: 10.1152/jn.00226.2012] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Cross-modal enhancement can be mediated both by higher-order effects due to attention and decision making and by detection-level stimulus-driven interactions. However, the contribution of each of these sources to behavioral improvements has not been conclusively determined and quantified separately. Here, we apply psychophysical analysis based on Piéron functions in order to separate stimulus-dependent changes from those accounted for by decision-level contributions. Participants performed a simple visual speeded detection task on Gabor patches of different spatial frequencies and contrast values, presented with and without accompanying sounds. On one hand, we identified an additive cross-modal improvement in mean reaction times across all types of visual stimuli that would be well explained by interactions not strictly based on stimulus-driven modulations (e.g., due to reduction of temporal uncertainty and motor times). On the other hand, we singled out an audio-visual benefit that strongly depended on stimulus features such as frequency and contrast. This particular enhancement was selective to low spatial frequency visual stimuli, optimized for magnocellular sensitivity. We therefore conclude that interactions at detection stages and at decisional processes in response selection that contribute to audio-visual enhancement can be separated online and are expressed in partly different aspects of visual processing.
40
Abstract
Playing a musical instrument requires a complex skill set that depends on the brain's ability to quickly integrate information from multiple senses. It has been well documented that intensive musical training alters brain structure and function within and across multisensory brain regions, supporting the experience-dependent plasticity model. Here, we argue that this experience-dependent plasticity occurs because of the multisensory nature of the brain and may be an important contributing factor to musical learning. This review highlights key multisensory regions within the brain and discusses their role in the context of music learning and rehabilitation.
Affiliation(s)
- Emily Zimmerman
- Department of Newborn Medicine, Brigham and Women's Hospital, Boston, Massachusetts, USA

41
Cross-modal recruitment of primary visual cortex by auditory stimuli in the nonhuman primate brain: a molecular mapping study. Neural Plast 2012; 2012:197264. [PMID: 22792489 PMCID: PMC3388421 DOI: 10.1155/2012/197264] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2012] [Revised: 04/17/2012] [Accepted: 05/07/2012] [Indexed: 11/26/2022] Open
Abstract
Recent studies suggest that exposure to only one component of audiovisual events can lead to cross-modal cortical activation. However, it is not certain whether such cross-modal recruitment can occur in the absence of explicit conditioning, semantic factors, or long-term associations. A recent study demonstrated that cross-modal cortical recruitment can occur even after a brief exposure to bimodal stimuli without semantic association. In addition, the authors showed that the primary visual cortex is under such cross-modal influence. In the present study, we used molecular activity mapping of the immediate early gene zif268. We found that animals that had previously been exposed to a combination of auditory and visual stimuli showed an increased number of active neurons in the primary visual cortex when presented with sounds alone. As previously implied, this cross-modal activation appears to be the result of implicit associations of the two stimuli, likely driven by their spatiotemporal characteristics; it was observed after a relatively short period of exposure (~45 min) and lasted for a relatively long period after the initial exposure (~1 day). These results suggest that the previously reported findings may be directly rooted in the increased activity of the neurons occupying the primary visual cortex.
42
Iurilli G, Ghezzi D, Olcese U, Lassi G, Nazzaro C, Tonini R, Tucci V, Benfenati F, Medini P. Sound-driven synaptic inhibition in primary visual cortex. Neuron 2012; 73:814-28. [PMID: 22365553 PMCID: PMC3315003 DOI: 10.1016/j.neuron.2011.12.026] [Citation(s) in RCA: 241] [Impact Index Per Article: 20.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 12/09/2011] [Indexed: 11/12/2022]
Abstract
Multimodal objects and events activate many sensory cortical areas simultaneously. This is possibly reflected in reciprocal modulations of neuronal activity, even at the level of primary cortical areas. However, the synaptic character of these interareal interactions and their impact on synaptic and behavioral sensory responses are unclear. Here, we found that activation of auditory cortex by a noise burst drove local GABAergic inhibition on supragranular pyramids of the mouse primary visual cortex, via cortico-cortical connections. This inhibition was generated by sound-driven excitation of a limited number of infragranular visual cortical neurons. Consequently, visually driven synaptic and spike responses were reduced upon bimodal stimulation. Also, acoustic stimulation suppressed conditioned behavioral responses to a dim flash, an effect that was prevented by acute blockade of GABAergic transmission in visual cortex. Thus, auditory cortex activation by salient stimuli degrades potentially distracting sensory processing in visual cortex by recruiting local, translaminar, inhibitory circuits.
Affiliation(s)
- Giuliano Iurilli
- Department of Neuroscience and Brain Technologies, Istituto Italiano di Tecnologia, Via Morego 30, 16163 Genova, Italy

43
Visuotactile interactions in the congenitally acallosal brain: Evidence for early cerebral plasticity. Neuropsychologia 2011; 49:3908-16. [DOI: 10.1016/j.neuropsychologia.2011.10.008] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2011] [Revised: 09/21/2011] [Accepted: 10/07/2011] [Indexed: 11/20/2022]
44
Clemo H, Keniston L, Meredith M. Structural Basis of Multisensory Processing. Front Neurosci 2011. [DOI: 10.1201/b11092-3] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022] Open
45

46

47
Luo H, Liu Z, Poeppel D. Auditory cortex tracks both auditory and visual stimulus dynamics using low-frequency neuronal phase modulation. PLoS Biol 2010; 8:e1000445. [PMID: 20711473 PMCID: PMC2919416 DOI: 10.1371/journal.pbio.1000445] [Citation(s) in RCA: 155] [Impact Index Per Article: 11.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2010] [Accepted: 06/24/2010] [Indexed: 11/23/2022] Open
Abstract
How is naturalistic multisensory information combined in the human brain? Based on MEG data we show that phase modulation of visual and auditory signals captures the dynamics of complex scenes. Integrating information across sensory domains to construct a unified representation of multi-sensory signals is a fundamental characteristic of perception in ecological contexts. One provocative hypothesis deriving from neurophysiology suggests that there exists early and direct cross-modal phase modulation. We provide evidence, based on magnetoencephalography (MEG) recordings from participants viewing audiovisual movies, that low-frequency neuronal information lies at the basis of the synergistic coordination of information across auditory and visual streams. In particular, the phase of the 2–7 Hz delta and theta band responses carries robust (in single trials) and usable information (for parsing the temporal structure) about stimulus dynamics in both sensory modalities concurrently. These experiments are the first to show in humans that a particular cortical mechanism, delta-theta phase modulation across early sensory areas, plays an important “active” role in continuously tracking naturalistic audio-visual streams, carrying dynamic multi-sensory information, and reflecting cross-sensory interaction in real time. When faced with ecologically relevant stimuli in natural scenes, our brains need to coordinate information from multiple sensory systems in order to create accurate internal representations of the outside world. Unfortunately, we currently have little information about the neuronal mechanisms for this cross-modal processing during online sensory perception under natural conditions. Neurophysiological and human imaging studies are increasingly exploring the response properties elicited by natural scenes. In this study, we recorded magnetoencephalography (MEG) data from participants viewing audiovisual movie clips. 
We developed a phase coherence analysis technique that captures—in single trials of watching a movie—how the phase of cortical responses is tightly coupled to key aspects of stimulus dynamics. Remarkably, auditory cortex not only tracks auditory stimulus dynamics but also reflects dynamic aspects of the visual signal. Similarly, visual cortex mainly follows the visual properties of a stimulus, but also shows sensitivity to the auditory aspects of a scene. The critical finding is that cross-modal phase modulation appears to lie at the basis of this integrative processing. Continuous cross-modal phase modulation may permit the internal construction of behaviorally relevant stimuli. Our work therefore contributes to the understanding of how multi-sensory information is analyzed and represented in the human brain.
Affiliation(s)
- Huan Luo
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, China
- Zuxiang Liu
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, China
- David Poeppel
- Department of Psychology, New York University, New York, New York, United States of America
48
Blakemore C, Papaioannou J. Does the vestibular apparatus play a role in the development of the visual system? J Physiol 1974; 236:373-85. [PMID: 16992440 PMCID: PMC1350807 DOI: 10.1113/jphysiol.1974.sp010440] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022] Open
Abstract
1. The receptive field properties of visual cortical neurones were investigated in kittens that had been subjected to either unilateral or bilateral labyrinthectomy shortly after birth.
2. Two kittens were reared in a normal visual environment. Another two were reared in the dark with recurrent exposures to vertically oriented black and white stripes, which in normal kittens is known to bias the distribution of receptive field orientations.
3. For both normally reared and stripe-reared labyrinthectomized kittens, no differences were detected in cell types, preferred orientations, binocularity, columnar organization, or any other neuronal properties, compared with similarly reared intact kittens.
4. The failure to detect deficits in visual development after labyrinthectomy is discussed in relation to other reports of vestibular influences on the visual system of the adult cat.
49
Cappe C, Rouiller EM, Barone P. Multisensory anatomical pathways. Hear Res 2009; 258:28-36. [PMID: 19410641 DOI: 10.1016/j.heares.2009.04.017] [Citation(s) in RCA: 138] [Impact Index Per Article: 9.2] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/06/2009] [Revised: 04/21/2009] [Accepted: 04/21/2009] [Indexed: 11/16/2022]
Affiliation(s)
- C Cappe
- The Functional Electrical Neuroimaging Laboratory, Neuropsychology and Neurorehabilitation Service and Radiology Service, Centre Hospitalier Universitaire Vaudois and University of Lausanne, rue du Bugnon 46, 1011 Lausanne, Switzerland.
50
Zangenehpour S, Zatorre RJ. Crossmodal recruitment of primary visual cortex following brief exposure to bimodal audiovisual stimuli. Neuropsychologia 2009; 48:591-600. [PMID: 19883668 DOI: 10.1016/j.neuropsychologia.2009.10.022] [Citation(s) in RCA: 41] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/06/2009] [Revised: 10/14/2009] [Accepted: 10/22/2009] [Indexed: 10/20/2022]
Abstract
Several lines of evidence suggest that exposure to only one component of typically audiovisual events can lead to crossmodal cortical activation. These effects are likely explained by long-term associations formed between the auditory and visual components of such events. It is not certain whether such crossmodal recruitment can occur in the absence of explicit conditioning, semantic factors, or long-term association; nor is it clear whether primary sensory cortices can be recruited in such paradigms. In the present study we tested the hypothesis that crossmodal cortical recruitment would occur even after a brief exposure to bimodal stimuli without semantic association. We used positron emission tomography, and an apparatus allowing presentation of spatially and temporally congruous audiovisual stimuli (noise bursts and light flashes). When presented with only the auditory or visual components of the bimodal stimuli, naïve subjects showed only modality-specific cortical activation, as expected. However, subjects who had previously been exposed to the audiovisual stimuli showed increased cerebral blood flow in the primary visual cortex when presented with sounds alone. Functional connectivity analysis suggested that the auditory cortex was the source of visual cortex activity. This crossmodal activation appears to be the result of implicit associations of the two stimuli, likely driven by their spatiotemporal characteristics; it was observed after a relatively short period of exposure (approximately 45 min), and lasted for a relatively long period after the initial exposure (approximately 1 day). The findings indicate that auditory and visual cortices interact with one another to a larger degree than typically assumed.