1
Vastano R, Costantini M, Widerstrom-Noga E. ERPs evidence of multisensory integration deficits in spinal cord injury. Neuroscience 2025; 576:263-276. [PMID: 40320235] [DOI: 10.1016/j.neuroscience.2025.04.048]
Abstract
Spinal cord injury (SCI) is associated with deficits in multisensory integration, the ability to synthesize cross-modal information. This study explores the neural mechanisms underlying these deficits using EEG and a detection task incorporating unisensory and multisensory stimuli: audio-visual, visuo-tactile, and audio-tactile. Behaviorally, participants with SCI showed reduced multisensory integration across all modalities, consistent with prior findings. Neurally, ERPs were analyzed in three conditions: audio-tactile (N100, P200), visuo-tactile (P170), and audio-visual (P100, N200). Higher ERP amplitudes for multisensory versus unisensory stimuli were observed only in the control group, whereas the SCI group showed similar amplitudes across both. In the SCI group, multisensory ERPs were significantly lower for the audio-tactile P200, visuo-tactile P170, and audio-visual P100, indicating a deficit in multisensory processing. Auditory ERPs were preserved in SCI participants, while visual and tactile responses were reduced, suggesting auditory dominance post-SCI. Cluster-based analysis of residual effects showed that the control group exhibited greater multisensory gain than SCI participants, with significant centro-parietal clusters observed for audio-tactile (50-100 ms, 120-180 ms, 300-500 ms), visuo-tactile (80-120 ms, 120-180 ms), and audio-visual (280-480 ms) residual effects. Overall, these results highlight that SCI has detrimental effects not only on the motor system but also on the ability to process multisensory information. This study advances our understanding of multisensory integration mechanisms following sensorimotor deficits and highlights the need for targeted interventions to address multisensory impairments in this population.
Affiliation(s)
- Roberta Vastano
- University of Miami, Department of Neurological Surgery, The Miami Project to Cure Paralysis, Miami, FL, USA.
- Marcello Costantini
- Department of Psychology, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy; Institute for Advanced Biomedical Technologies, ITAB, "G. d'Annunzio" University of Chieti-Pescara, Chieti, Italy
- Eva Widerstrom-Noga
- University of Miami, Department of Neurological Surgery, The Miami Project to Cure Paralysis, Miami, FL, USA
2
Isenstein EL, Freedman EG, Rico GA, Brown Z, Tadin D, Foxe JJ. Adults on the autism spectrum differ from neurotypical peers when self-generating but not passively-experiencing somatosensation: a high-density electrophysiological (EEG) mapping and virtual reality study. Neuroimage 2025; 311:121215. [PMID: 40228683] [DOI: 10.1016/j.neuroimage.2025.121215]
Abstract
Little is known about how different features of tactile inputs affect somatosensory perception in autism. In this study we combined high-density electroencephalography (EEG) and virtual reality (VR) to assess how the volition and pattern consistency of somatosensory stimulation influenced electrophysiological responses in neurotypical (n = 30) and autistic (n = 30) adults. Specifically, we compared N1 and P300 amplitudes when vibrotactile stimulation was actively triggered by self-motion (Active) versus passively triggered by target-motion (Passive). We also measured the mismatch negativity (MMN) to assess how deviations in the pattern of stimulus duration affected the electrophysiological responses. We observed comparable responses between groups regardless of pattern deviation in the MMN time window, but different patterns of amplitude in this time frame depending on whether the stimulation was Active or Passive. In the autism group we observed smaller N1 amplitudes in response to Passive, but not Active, vibrations compared to the control group. Conversely, P300 amplitudes were overall larger in magnitude in the autism group, but Passive-to-Active attenuation was comparable between groups. Overall, the autism cohort differed from the neurotypical cohort with respect to stimulus volition, but the groups responded comparably to pattern deviation. These findings suggest subtle differences in how adults with and without autism handle self-generated and externally generated somatosensory sensations.
Affiliation(s)
- Emily L Isenstein
- The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA; Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA; Center for Visual Science, University of Rochester, Rochester, NY, USA
- Edward G Freedman
- The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA; The Ernest J. Del Monte Institute for Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Grace A Rico
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Zakilya Brown
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
- Duje Tadin
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA; Center for Visual Science, University of Rochester, Rochester, NY, USA; The Ernest J. Del Monte Institute for Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- John J Foxe
- The Frederick J. and Marion A. Schindler Cognitive Neurophysiology Laboratory, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA; Center for Visual Science, University of Rochester, Rochester, NY, USA; The Ernest J. Del Monte Institute for Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
3
Kent RD. The Feel of Speech: Multisystem and Polymodal Somatosensation in Speech Production. J Speech Lang Hear Res 2024; 67:1424-1460. [PMID: 38593006] [DOI: 10.1044/2024_jslhr-23-00575]
Abstract
PURPOSE: The oral structures, such as the tongue and lips, have remarkable somatosensory capacities, but understanding the roles of somatosensation in speech production requires a more comprehensive knowledge of somatosensation in the speech production system in its entirety, including the respiratory, laryngeal, and supralaryngeal subsystems. This review was conducted to summarize the system-wide somatosensory information available for speech production. METHOD: The search was conducted with PubMed/Medline and Google Scholar for articles published through November 2023, using numerous search terms covering psychophysics, basic and clinical behavioral research, neuroanatomy, and neuroscience. RESULTS AND CONCLUSIONS: The current understanding of speech somatosensation rests primarily on the two pillars of psychophysics and neuroscience. The confluence of polymodal afferent streams supports the development, maintenance, and refinement of speech production. Receptors are both canonical and noncanonical, with the latter occurring especially in the muscles innervated by the facial nerve. Somatosensory representation in the cortex is disproportionately large and provides for sensory interactions. Speech somatosensory function is robust over the lifespan, with possible declines in advanced aging. The understanding of somatosensation in speech disorders is largely disconnected from research and theory on speech production. A speech somatoscape is proposed as the generalized, system-wide sensation of speech production, with implications for speech development, speech motor control, and speech disorders.
4
Bingham MA, Cummins ML, Tong A, Purcell P, Sangari A, Sood A, Schlesinger JJ. Effects of altering harmonic structure on the recognition of simulated auditory arterial pressure alarms. Br J Anaesth 2023; 131:e178-e180. [PMID: 37758624] [DOI: 10.1016/j.bja.2023.08.037]
Affiliation(s)
- Molly A Bingham
- Department of Biomedical Engineering, Vanderbilt University, Nashville, TN, USA
- Mabel L Cummins
- Department of Neuroscience, Vanderbilt University, Nashville, TN, USA
- Anqy Tong
- Department of Neuroscience, Vanderbilt University, Nashville, TN, USA
- Ayush Sangari
- Renaissance School of Medicine, Stony Brook University, Stony Brook, NY, USA
- Aditya Sood
- Emory University School of Medicine, Atlanta, GA, USA
- Joseph J Schlesinger
- Division of Critical Care Medicine, Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, TN, USA
5
Sangari A, Bingham MA, Cummins M, Sood A, Tong A, Purcell P, Schlesinger JJ. A Spatiotemporal and Multisensory Approach to Designing Wearable Clinical ICU Alarms. J Med Syst 2023; 47:105. [PMID: 37847469] [DOI: 10.1007/s10916-023-01997-2]
Abstract
In health care, auditory alarms are an important aspect of an informatics system that monitors patients and alerts clinicians attending to multiple concurrent tasks. However, the volume, design, and pervasiveness of existing Intensive Care Unit (ICU) alarms can make it difficult to quickly distinguish their meaning and importance. In this study, we evaluated the effectiveness of two design approaches not yet explored in a smartwatch-based alarm system designed for ICU use: (1) using audiovisual spatial colocalization and (2) adding haptic (i.e., touch) information. We compared the performance of 30 study participants using ICU smartwatch alarms containing auditory icons in two implementations of the audio modality: colocalized with the visual cue on the smartwatch's low-quality speaker versus delivered from a higher quality speaker located two feet away from participants (like a stationary alarm bay situated near patients in the ICU). Additionally, we compared participant performance using alarms with two sensory modalities (visual and audio) against alarms with three sensory modalities (adding haptic cues). Participants were 10.1% (0.24s) faster at responding to alarms when auditory information was delivered from the smartwatch instead of the higher quality external speaker. Meanwhile, adding haptic information to alarms improved response times to alarms by 12.2% (0.23s) and response times on their primary task by 10.3% (0.08s). Participants rated learnability and ease of use higher for alarms with haptic information. These small but statistically significant improvements demonstrate that audiovisual colocalization and multisensory alarm design can improve user response times.
Affiliation(s)
- Ayush Sangari
- Renaissance School of Medicine, Stony Brook University, 100 Nicolls Rd, Stony Brook, NY, 11790, USA.
- Molly A Bingham
- Department of Biomedical Engineering, Vanderbilt University, Nashville, TN, USA
- Mabel Cummins
- Department of Neuroscience, Vanderbilt University, Nashville, TN, USA
- Aditya Sood
- Long Island Jewish Medical Center, New Hyde Park, New York, USA
- Anqy Tong
- Department of Neuroscience, Vanderbilt University, Nashville, TN, USA
- Joseph J Schlesinger
- Division of Critical Care Medicine, Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, TN, USA
6
Parra S, Díaz H, Zainos A, Alvarez M, Zizumbo J, Rivera-Yoshida N, Pujalte S, Bayones L, Romo R, Rossi-Pool R. Hierarchical unimodal processing within the primary somatosensory cortex during a bimodal detection task. Proc Natl Acad Sci U S A 2022; 119:e2213847119. [PMID: 36534792] [PMCID: PMC9907144] [DOI: 10.1073/pnas.2213847119]
Abstract
Do sensory cortices process more than one sensory modality? To answer this question, scientists have generated a wide variety of studies at distinct space-time scales in different animal models, often reaching contradictory conclusions. Some conclude that this processing occurs in early sensory cortices; others, that it occurs in areas central to the sensory cortices. Here, we sought to determine whether sensory neurons process and encode physical stimulus properties of different modalities (tactile and acoustic). For this, we designed a bimodal detection task in which the senses of touch and hearing compete from trial to trial. Two rhesus monkeys performed this novel task while neural activity was recorded in areas 3b and 1 of the primary somatosensory cortex (S1). We analyzed the neurons' coding properties and variability, organizing them by the position of their receptive fields relative to the stimulation zone. Our results indicate that neurons of areas 3b and 1 are unimodal, encoding only the tactile modality in both firing rate and variability. Moreover, we found that neurons in area 3b carried more information about the periodic stimulus structure than those in area 1, possessed lower response and coding latencies, and had a lower intrinsic time scale. Together, these differences reveal a hidden processing-based hierarchy. Finally, using a powerful nonlinear dimensionality-reduction algorithm, we show that the activity from areas 3b and 1 can be separated, establishing a clear division in the functionality of these two subareas of S1.
Affiliation(s)
- Sergio Parra
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Héctor Díaz
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Antonio Zainos
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Manuel Alvarez
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Jerónimo Zizumbo
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Natsuko Rivera-Yoshida
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Sebastián Pujalte
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Lucas Bayones
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Ranulfo Romo
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Centro de Ciencias de la Complejidad, Universidad Nacional Autónoma de México, México City 04510, Mexico
- El Colegio Nacional, Mexico City 06020, Mexico
- Román Rossi-Pool
- Instituto de Fisiología Celular, Departamento de Neurociencia Cognitiva, Universidad Nacional Autónoma de México, 04510 México City, Mexico
- Centro de Ciencias de la Complejidad, Universidad Nacional Autónoma de México, México City 04510, Mexico
7
Fossataro C, Galigani M, Rossi Sebastiano A, Bruno V, Ronga I, Garbarini F. Spatial proximity to others induces plastic changes in the neural representation of the peripersonal space. iScience 2022; 26:105879. [PMID: 36654859] [PMCID: PMC9840938] [DOI: 10.1016/j.isci.2022.105879]
Abstract
Peripersonal space (PPS) is a highly plastic "invisible bubble" surrounding the body whose boundaries are mapped through multisensory integration. Yet, it is unclear how the spatial proximity to others alters PPS boundaries. Across five experiments (N = 80), by recording behavioral and electrophysiological responses to visuo-tactile stimuli, we demonstrate that the proximity to others induces plastic changes in the neural PPS representation. The spatial proximity to someone else's hand shrinks the portion of space within which multisensory responses occur, thus reducing the PPS boundaries. This suggests that PPS representation, built from bodily and multisensory signals, plastically adapts to the presence of conspecifics to define the self-other boundaries, so that what is usually coded as "my space" is recoded as "your space". When the space is shared with conspecifics, it seems adaptive to move the other-space away from the self-space to discriminate whether external events pertain to the self-body or to other-bodies.
Affiliation(s)
- Carlotta Fossataro
- MANIBUS Lab, Psychology Department, University of Turin, Turin 10123, Italy
- Mattia Galigani
- MANIBUS Lab, Psychology Department, University of Turin, Turin 10123, Italy
- Valentina Bruno
- MANIBUS Lab, Psychology Department, University of Turin, Turin 10123, Italy
- Irene Ronga
- MANIBUS Lab, Psychology Department, University of Turin, Turin 10123, Italy
- Francesca Garbarini
- MANIBUS Lab, Psychology Department, University of Turin, Turin 10123, Italy; Neuroscience Institute of Turin (NIT), Turin 10123, Italy (corresponding author)
8
Multisensory-driven facilitation within the peripersonal space is modulated by the expectations about stimulus location on the body. Sci Rep 2022; 12:20061. [PMID: 36414633] [PMCID: PMC9681840] [DOI: 10.1038/s41598-022-21469-w]
Abstract
Compelling evidence from human and non-human studies suggests that responses to multisensory events are speeded when stimuli occur within the space surrounding the bodily self (i.e., peripersonal space; PPS). However, some human studies did not find such an effect. We propose that these dissonant voices might actually uncover a specific mechanism that modulates PPS boundaries according to sensory regularities. We exploited a visuo-tactile paradigm wherein participants provided speeded responses to tactile stimuli and rated their perceived intensity while ignoring simultaneous visual stimuli appearing near the stimulated hand (VTNear) or far from it (VTFar; near the non-stimulated hand). Tactile stimuli could be delivered to one hand only (unilateral task) or to both hands randomly (bilateral task). Results revealed that a space-dependent multisensory enhancement (i.e., faster responses and higher perceived intensity in VTNear than in VTFar) was present when highly predictable tactile stimulation induced PPS to be circumscribed around the stimulated hand (unilateral task). Conversely, when stimulus location was unpredictable (bilateral task), participants showed comparable multisensory enhancement in both bimodal conditions, suggesting a widening of PPS to include both hands. We propose that the detection of environmental regularities actively shapes PPS boundaries, thus optimizing the detection of and reaction to incoming sensory stimuli.
9
Crosse MJ, Foxe JJ, Tarrit K, Freedman EG, Molholm S. Resolution of impaired multisensory processing in autism and the cost of switching sensory modality. Commun Biol 2022; 5:601. [PMID: 35773473] [PMCID: PMC9246932] [DOI: 10.1038/s42003-022-03519-1]
Abstract
Children with autism spectrum disorders (ASD) exhibit alterations in multisensory processing, which may contribute to the prevalence of social and communicative deficits in this population. Resolution of multisensory deficits has been observed in teenagers with ASD for complex, social speech stimuli; however, whether this resolution extends to more basic multisensory processing deficits remains unclear. Here, in a cohort of 364 participants we show using simple, non-social audiovisual stimuli that deficits in multisensory processing observed in high-functioning children and teenagers with ASD are not evident in adults with the disorder. Computational modelling indicated that multisensory processing transitions from a default state of competition to one of facilitation, and that this transition is delayed in ASD. Further analysis revealed group differences in how sensory channels are weighted, and how this is impacted by preceding cross-sensory inputs. Our findings indicate that there is a complex and dynamic interplay among the sensory systems that differs considerably in individuals with ASD.
Affiliation(s)
- Michael J Crosse
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA; The Dominick P. Purpura Department of Neuroscience, Rose F. Kennedy Intellectual and Developmental Disabilities Research Center, Albert Einstein College of Medicine, Bronx, NY, USA; Trinity Centre for Biomedical Engineering, Department of Mechanical, Manufacturing & Biomedical Engineering, Trinity College Dublin, Dublin, Ireland
- John J Foxe
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA; The Dominick P. Purpura Department of Neuroscience, Rose F. Kennedy Intellectual and Developmental Disabilities Research Center, Albert Einstein College of Medicine, Bronx, NY, USA; The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Katy Tarrit
- The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Edward G Freedman
- The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Sophie Molholm
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA; The Dominick P. Purpura Department of Neuroscience, Rose F. Kennedy Intellectual and Developmental Disabilities Research Center, Albert Einstein College of Medicine, Bronx, NY, USA; The Cognitive Neurophysiology Laboratory, Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
10
Fernandez DC, Komal R, Langel J, Ma J, Duy PQ, Penzo MA, Zhao H, Hattar S. Retinal innervation tunes circuits that drive nonphotic entrainment to food. Nature 2020; 581:194-198. [PMID: 32404998] [PMCID: PMC7291822] [DOI: 10.1038/s41586-020-2204-1]
Abstract
Daily changes in light and food availability are major time cues that influence circadian timing [1]. However, little is known about the circuits that integrate these time cues to drive a coherent circadian output [1-3]. Here we investigate whether retinal inputs modulate entrainment to nonphotic cues such as time-restricted feeding. Photic information is relayed to the suprachiasmatic nucleus (SCN), the central circadian pacemaker, and the intergeniculate leaflet (IGL) through intrinsically photosensitive retinal ganglion cells (ipRGCs) [4]. We show that adult mice that lack ipRGCs from the early postnatal stages have impaired entrainment to time-restricted feeding, whereas ablation of ipRGCs at later stages had no effect. Innervation of ipRGCs at early postnatal stages influences IGL neurons that express neuropeptide Y (NPY) (hereafter, IGL-NPY neurons), guiding the assembly of a functional IGL-NPY-SCN circuit. Moreover, silencing IGL-NPY neurons in adult mice mimicked the deficits that were induced by ablation of ipRGCs in the early postnatal stages, and acute inhibition of IGL-NPY terminals in the SCN decreased food-anticipatory activity. Thus, innervation of ipRGCs in the early postnatal period tunes the IGL-NPY-SCN circuit to allow entrainment to time-restricted feeding.
Affiliation(s)
- Diego Carlos Fernandez
- National Institute of Mental Health (NIMH), National Institutes of Health (NIH), Bethesda, MD, USA.
- Ruchi Komal
- National Institute of Mental Health (NIMH), National Institutes of Health (NIH), Bethesda, MD, USA
- Jennifer Langel
- National Institute of Mental Health (NIMH), National Institutes of Health (NIH), Bethesda, MD, USA
- Jun Ma
- National Institute of Mental Health (NIMH), National Institutes of Health (NIH), Bethesda, MD, USA
- Phan Q Duy
- National Institute of Mental Health (NIMH), National Institutes of Health (NIH), Bethesda, MD, USA; MSTP, Yale University, New Haven, CT, USA
- Mario A Penzo
- National Institute of Mental Health (NIMH), National Institutes of Health (NIH), Bethesda, MD, USA
- Haiqing Zhao
- Department of Biology, Johns Hopkins University, Baltimore, MD, USA
- Samer Hattar
- National Institute of Mental Health (NIMH), National Institutes of Health (NIH), Bethesda, MD, USA
11
Shaw LH, Freedman EG, Crosse MJ, Nicholas E, Chen AM, Braiman MS, Molholm S, Foxe JJ. Operating in a Multisensory Context: Assessing the Interplay Between Multisensory Reaction Time Facilitation and Inter-sensory Task-switching Effects. Neuroscience 2020; 436:122-135. [PMID: 32325100] [DOI: 10.1016/j.neuroscience.2020.04.013]
Abstract
Individuals respond faster to presentations of bisensory stimuli (e.g. audio-visual targets) than to presentations of either unisensory constituent in isolation (i.e. to the auditory-alone or visual-alone components of an audio-visual stimulus). This well-established multisensory speeding effect, termed the redundant signals effect (RSE), is not predicted by simple linear summation of the unisensory response time probability distributions. Rather, the speeding is typically faster than this prediction, leading researchers to ascribe the RSE to a so-called co-activation account. According to this account, multisensory neural processing occurs whereby the unisensory inputs are integrated to produce more effective sensory-motor activation. However, the typical paradigm used to test for the RSE involves random sequencing of unisensory and bisensory inputs in a mixed design, raising the possibility of an alternate attention-switching account. This intermixed design requires participants to switch between sensory modalities on many task trials (e.g. from responding to a visual stimulus to an auditory stimulus). Here we show that much, if not all, of the RSE under this paradigm can be attributed to slowing of reaction times to unisensory stimuli resulting from modality switching, and is not in fact due to speeding of responses to audio-visual (AV) stimuli. As such, the present data do not support a co-activation account, but rather suggest that switching and mixing costs akin to those observed during classic task-switching paradigms account for the observed RSE.
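The prediction against which such RSE data are conventionally checked is Miller's race-model (probability-summation) inequality: without co-activation, the multisensory response-time distribution should not exceed the sum of the unisensory distributions at any time point. The sketch below illustrates that test only; the reaction-time samples (rt_a, rt_v, rt_av) are synthetic assumptions, not data from this study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic reaction times in ms (illustrative assumptions, not study data).
rt_a = rng.normal(320, 40, 1000)    # auditory-alone trials
rt_v = rng.normal(340, 45, 1000)    # visual-alone trials
rt_av = rng.normal(280, 35, 1000)   # redundant audio-visual trials

def ecdf(samples, t):
    """Empirical cumulative probability P(RT <= t)."""
    return float(np.mean(np.asarray(samples) <= t))

def race_bound(t):
    """Race-model upper bound: without co-activation,
    P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t), capped at 1."""
    return min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))

# A violation of the bound at some t is the classic evidence for co-activation.
for t in (250, 300, 350):
    print(f"t={t} ms  P(AV<=t)={ecdf(rt_av, t):.3f}  "
          f"race bound={race_bound(t):.3f}  "
          f"violated={ecdf(rt_av, t) > race_bound(t)}")
```

The abstract's point is that violations computed this way can be inflated when the unisensory trials are slowed by modality switching in a mixed design, so the bound should be evaluated with that confound in mind.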
Affiliation(s)
- Luke H Shaw
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Edward G Freedman
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Michael J Crosse
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics & Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, NY 10461, USA
| | - Eric Nicholas
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Allen M Chen
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Matthew S Braiman
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
| | - Sophie Molholm
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA; The Cognitive Neurophysiology Laboratory, Department of Pediatrics & Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, NY 10461, USA
- John J Foxe
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA; The Cognitive Neurophysiology Laboratory, Department of Pediatrics & Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, NY 10461, USA.
12
Revealing the body in the brain: An ERP method to examine sensorimotor activity during visual perception of body-related information. Cortex 2020; 125:332-344. [DOI: 10.1016/j.cortex.2020.01.017] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/06/2019] [Revised: 01/27/2020] [Accepted: 01/27/2020] [Indexed: 01/13/2023]
|
13
|
Foxe JJ, Del Bene VA, Ross LA, Ridgway EM, Francisco AA, Molholm S. Multisensory Audiovisual Processing in Children With a Sensory Processing Disorder (II): Speech Integration Under Noisy Environmental Conditions. Front Integr Neurosci 2020; 14:39. [PMID: 32765229 PMCID: PMC7381232 DOI: 10.3389/fnint.2020.00039] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2020] [Accepted: 06/16/2020] [Indexed: 12/02/2022] Open
Abstract
Background: There exists a cohort of children and adults who exhibit an inordinately high degree of discomfort when experiencing what would be considered moderate and manageable levels of sensory input. That is, they show over-responsivity in the face of entirely typical sound, light, touch, taste, or smell inputs, and this occurs to such an extent that it interferes with their daily functioning and reaches clinical levels of dysfunction. What marks these individuals apart is that this sensory processing disorder (SPD) is observed in the absence of other symptom clusters that would result in a diagnosis of Autism, ADHD, or other neurodevelopmental disorders more typically associated with sensory processing difficulties. One major theory forwarded to account for these SPDs posits a deficit in multisensory integration, such that the various sensory inputs are not appropriately integrated into the central nervous system, leading to an overwhelming sensory-perceptual environment, and in turn to the sensory-defensive phenotype observed in these individuals. Methods: We tested whether children (6-16 years) with an over-responsive SPD phenotype (N = 12) integrated multisensory speech differently from age-matched typically-developing controls (TD: N = 12). Participants identified monosyllabic words while background noise level and sensory modality (auditory-alone, visual-alone, audiovisual) were varied in pseudorandom order. Improved word identification when speech was both seen and heard compared to when it was simply heard served to index multisensory speech integration. Results: School-aged children with an SPD show a deficit in the ability to benefit from the combination of both seen and heard speech inputs under noisy environmental conditions, suggesting that these children do not benefit from multisensory integrative processing to the same extent as their typically developing peers. 
In contrast, auditory-alone performance did not differ between the groups, signifying that this multisensory deficit is not simply due to impaired processing of auditory speech. Conclusions: Children with an over-responsive SPD show a substantial reduction in their ability to benefit from complementary audiovisual speech to enhance speech perception in a noisy environment. This has clear implications for performance in the classroom and other learning environments. Impaired multisensory integration may contribute to the sensory over-reactivity that is definitional of SPD.
Affiliation(s)
- John J Foxe
- The Cognitive Neurophysiology Laboratory, Department of Neuroscience, The Ernest J. Del Monte Institute for Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, United States.,The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine and Montefiore Medical Center, Bronx, NY, United States.,The Dominic P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States
- Victor A Del Bene
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine and Montefiore Medical Center, Bronx, NY, United States
- Lars A Ross
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine and Montefiore Medical Center, Bronx, NY, United States
- Elizabeth M Ridgway
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine and Montefiore Medical Center, Bronx, NY, United States
- Ana A Francisco
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine and Montefiore Medical Center, Bronx, NY, United States
- Sophie Molholm
- The Cognitive Neurophysiology Laboratory, Department of Neuroscience, The Ernest J. Del Monte Institute for Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY, United States.,The Cognitive Neurophysiology Laboratory, Department of Pediatrics, Albert Einstein College of Medicine and Montefiore Medical Center, Bronx, NY, United States.,The Dominic P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States
14
Forsberg A, O'Dowd A, Gherri E. Tool use modulates early stages of visuo-tactile integration in far space: Evidence from event-related potentials. Biol Psychol 2019; 145:42-54. [PMID: 30970269 DOI: 10.1016/j.biopsycho.2019.03.020] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/20/2018] [Revised: 01/15/2019] [Accepted: 03/28/2019] [Indexed: 10/27/2022]
Abstract
The neural representation of multisensory space near the body is modulated by the active use of long tools in non-human primates. Here, we investigated whether the electrophysiological correlates of visuo-tactile integration in near and far space were modulated by active tool use in healthy humans. Participants responded to a tactile target delivered to one hand while an irrelevant visual stimulus was presented ipsilaterally in near or far space. This crossmodal task was performed after the use of either short or long tools. Crucially, the P100 component elicited by visuo-tactile stimuli was enhanced on far as compared to near space trials after the use of long tools, while no such difference was present after short tool use. Thus, we found increased neural responses in brain areas encoding tactile stimuli to the body when visual stimuli were presented close to the tip of the tool after long tool use. This increased visuo-tactile integration on far space trials following the use of long tools might indicate a transient remapping of multisensory space. We speculate that performing voluntary actions with long tools strengthens the representation of sensory information arising within portions of space (i.e. the hand and the tip of the tool) that are most functionally relevant to one's behavioural goals.
Affiliation(s)
- Alicia Forsberg
- Human Cognitive Neuroscience, Psychology, University of Edinburgh, UK
- Alan O'Dowd
- Human Cognitive Neuroscience, Psychology, University of Edinburgh, UK
- Elena Gherri
- Human Cognitive Neuroscience, Psychology, University of Edinburgh, UK.
15
Krueger Fister J, Stevenson RA, Nidiffer AR, Barnett ZP, Wallace MT. Stimulus intensity modulates multisensory temporal processing. Neuropsychologia 2016; 88:92-100. [PMID: 26920937 DOI: 10.1016/j.neuropsychologia.2016.02.016] [Citation(s) in RCA: 41] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/25/2015] [Revised: 01/20/2016] [Accepted: 02/22/2016] [Indexed: 12/18/2022]
Abstract
One of the more challenging feats that multisensory systems must perform is to determine which sensory signals originate from the same external event, and thus should be integrated or "bound" into a singular perceptual object or event, and which signals should be segregated. Two important stimulus properties impacting this process are the timing and effectiveness of the paired stimuli. It has been well established that the more temporally aligned two stimuli are, the greater the degree to which they influence one another's processing. In addition, the less effective the individual unisensory stimuli are in eliciting a response, the greater the benefit when they are combined. However, the interaction between stimulus timing and stimulus effectiveness in driving multisensory-mediated behaviors had not previously been explored, and exploring it was the purpose of the current study. Participants were presented with either high- or low-intensity audiovisual stimuli in which stimulus onset asynchronies (SOAs) were parametrically varied, and were asked to report on the perceived synchrony/asynchrony of the paired stimuli. Our results revealed an interaction between the temporal relationship (SOA) and intensity of the stimuli. Specifically, individuals were more tolerant of larger temporal offsets (i.e., more likely to call them synchronous) when the paired stimuli were less effective. This interaction was also seen in response time (RT) distributions. Behavioral gains in RTs were seen with synchronous relative to asynchronous presentations, but this effect was more pronounced with high-intensity stimuli. These data suggest that stimulus effectiveness plays an underappreciated role in the perception of the timing of multisensory events, and reinforces the interdependency of the principles of multisensory integration in determining behavior and shaping perception.
Affiliation(s)
- Juliane Krueger Fister
- Neuroscience Graduate Program, Vanderbilt University Medical Center, United States; Vanderbilt Brain Institute, United States.
- Ryan A Stevenson
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, United States; Vanderbilt Brain Institute, United States; Vanderbilt University Kennedy Center, United States; Department of Psychology, University of Toronto, Canada
- Aaron R Nidiffer
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, United States
- Zachary P Barnett
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, United States
- Mark T Wallace
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, United States; Vanderbilt Brain Institute, United States; Vanderbilt University Kennedy Center, United States; Department of Psychology, Vanderbilt University, United States; Department of Psychiatry, Vanderbilt University, United States
16
Abstract
Ventriloquism is a well-studied multisensory illusion of audiovisual spatial perception in which the perceived location of an auditory stimulus is shifted in the direction of a synchronous, but spatially discrepant visual stimulus. This effect arises from vision's superior acuity in the spatial dimension, but has also been shown to be influenced by the perceived unity of the two signals. We sought to investigate whether a similar phenomenon may occur between vision and somatosensation along the surface of the body, as vision is known to possess superior spatial acuity to somatosensation. We report the first demonstration of the visuotactile ventriloquist illusion: individuals were instructed to localize visual stimuli (small white disks) or tactile stimuli (brief localized vibrations) that were presented concurrently or individually along the surface of the forearm, where bimodal presentations included spatially congruent and incongruent stimuli. Participants showed strong visual-tactile interactions. Tactile localization was strongly biased in the direction of the visual stimulus, and the magnitude of this bias decreased as the spatial disparity between the two stimuli increased. The Bayesian causal inference model that has previously been shown to account for auditory-visual spatial localization and the ventriloquism effect also accounted well for the present data. Therefore, crossmodal interactions involving spatial representation along the surface of the body follow the same rules as crossmodal interactions involving representations of external space (auditory-visual).
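The Bayesian causal inference model this abstract invokes infers, from the discrepancy between two cues, whether they arose from one source or two, weighting integration accordingly. A minimal sketch of the core computation, with a hypothetical function name and illustrative (not fitted) noise and prior parameters, not the authors' implementation:

```python
import numpy as np

def posterior_common_cause(x_v, x_t, sigma_v=1.0, sigma_t=3.0,
                           sigma_p=10.0, p_common=0.5):
    """Posterior probability that visual and tactile cues share one cause,
    under a standard causal-inference model: Gaussian cue likelihoods and a
    zero-mean Gaussian spatial prior. All parameter values are illustrative."""
    s = np.linspace(-50, 50, 5001)  # candidate source locations
    ds = s[1] - s[0]
    gauss = lambda x, mu, sd: (np.exp(-(x - mu) ** 2 / (2 * sd ** 2))
                               / (sd * np.sqrt(2 * np.pi)))
    prior = gauss(s, 0.0, sigma_p)
    # C = 1: both cues are generated by a single source s.
    like_c1 = np.sum(gauss(x_v, s, sigma_v) * gauss(x_t, s, sigma_t) * prior) * ds
    # C = 2: independent sources, so the cue likelihoods integrate separately.
    like_c2 = (np.sum(gauss(x_v, s, sigma_v) * prior) * ds
               * np.sum(gauss(x_t, s, sigma_t) * prior) * ds)
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

# Nearby cues favour a common cause; distant cues favour separate causes.
print(posterior_common_cause(x_v=0.0, x_t=1.0))   # high posterior
print(posterior_common_cause(x_v=0.0, x_t=20.0))  # low posterior
```

The disparity-dependent bias reported in the abstract falls out of this scheme: when the common-cause posterior is high the tactile estimate is pulled toward the (more reliable) visual cue, and the pull weakens as spatial disparity grows.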