1. Yang T, Fan X, Hou B, Wang J, Chen X. Linguistic network in early deaf individuals: A neuroimaging meta-analysis. Neuroimage 2024;299:120720. [PMID: 38971484] [DOI: 10.1016/j.neuroimage.2024.120720]
Abstract
This meta-analysis summarizes evidence from 44 neuroimaging experiments and characterizes the general linguistic network in early deaf individuals. Meta-analytic comparisons with hearing individuals found that a specific set of regions (in particular the left inferior frontal gyrus and posterior middle temporal gyrus) participates in supramodal language processing. In addition to previously described modality-specific differences, the present study showed that the left calcarine gyrus and the right caudate were additionally recruited in deaf compared with hearing individuals. Furthermore, this study showed that the bilateral posterior superior temporal gyrus is shaped by cross-modal plasticity, whereas the left frontotemporal areas are shaped by early language experience. Although an overall left-lateralized pattern for language processing was observed in the early deaf individuals, regional lateralization was altered in the inferior frontal gyrus and anterior temporal lobe. These findings indicate that the core language network functions in a modality-independent manner, and provide a foundation for determining the contributions of sensory and linguistic experiences in shaping the neural bases of language processing.
Affiliation(s)
- Tengyu Yang: Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Xinmiao Fan: Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Bo Hou: Department of Radiology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Jian Wang: Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Xiaowei Chen: Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
2. Banaszkiewicz A, Costello B, Marchewka A. Early language experience and modality affect parietal cortex activation in different hemispheres: Insights from hearing bimodal bilinguals. Neuropsychologia 2024;204:108973. [PMID: 39151687] [DOI: 10.1016/j.neuropsychologia.2024.108973]
Abstract
The goal of this study was to investigate the impact of the age of acquisition (AoA) on functional brain representations of sign language in two exceptional groups of hearing bimodal bilinguals: native signers (simultaneous bilinguals since early childhood) and late signers (proficient sequential bilinguals, who learnt a sign language after puberty). We asked whether effects of AoA would be present across languages - signed and audiovisual spoken - and thus observed only in late signers as they acquired each language at different life stages, and whether effects of AoA would be present during sign language processing across groups. Moreover, we aimed to carefully control participants' level of sign language proficiency by implementing a battery of language tests developed for the purpose of the project, which confirmed that participants had high competence in sign language. Between-group analyses revealed a hypothesized modulatory effect of AoA in the right inferior parietal lobule (IPL) in native signers, compared to late signers. With respect to within-group differences across languages, we observed greater involvement of the left IPL in response to sign language in comparison to spoken language in both native and late signers, indicating language modality effects. Overall, our results suggest that the neural underpinnings of language are molded by the linguistic characteristics of the language as well as by when in life the language is learnt.
Affiliation(s)
- A Banaszkiewicz: Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland; Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- B Costello: Basque Center on Cognition, Brain and Language, Donostia-San Sebastián, Spain; Ikerbasque, Basque Foundation for Science, Bilbao, Spain
- A Marchewka: Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
3. Alderson-Day B, Pearson A. What can neurodiversity tell us about inner speech, and vice versa? A theoretical perspective. Cortex 2023;168:193-202. [PMID: 37769592] [DOI: 10.1016/j.cortex.2023.08.008]
Abstract
Inner speech refers to the experience of talking to oneself in one's head. While notoriously challenging to investigate, it has also been central to a range of questions concerning mind, brain, and behaviour. Posited as a key component in executive function and self-regulation, inner speech has been claimed to be crucial in higher cognitive operations, self-knowledge and self-awareness. Such arguments have traditionally been supported with examples of atypical development. But variations in inner speech - and in some cases, significant diversity - in fact pose several key challenges to such claims, and raise many more questions for language, thought, and mental health more generally. In this review, we will summarise evidence on the experience and operation of inner speech in child and adult neurotypical populations, autistic people and other neurodivergent groups, and people with diverse experiences of linguistic and sensory development, including deafness. We will demonstrate that the relationship between inner speech and cognitive operations may be more complex than first assumed when explored through the lens of cognitive and neurological diversity, and consider the implications for understanding the developing brain in all populations. We discuss why and how the experience of inner speech in neurodivergent groups has often been assumed rather than investigated, making it an important opportunity for researchers to develop innovative future work that integrates participatory insights with cognitive methodology. Finally, we will outline why variations in inner speech - in neurotypical and neurodivergent populations alike - nevertheless have a range of important implications for mental health vulnerability and unmet need. In this sense, the example of inner speech offers us both a way of looking back at the logic of developmental psychology and neuropsychology, and a clue to its future in a neurodiverse world.
Affiliation(s)
- Amy Pearson: Department of Psychology, University of Sunderland, UK
4. Pontecorvo E, Higgins M, Mora J, Lieberman AM, Pyers J, Caselli NK. Learning a Sign Language Does Not Hinder Acquisition of a Spoken Language. J Speech Lang Hear Res 2023;66:1291-1308. [PMID: 36972338] [PMCID: PMC10187967] [DOI: 10.1044/2022_jslhr-22-00505]
Abstract
Purpose: The purpose of this study is to determine whether and how learning American Sign Language (ASL) is associated with spoken English skills in a sample of ASL-English bilingual deaf and hard of hearing (DHH) children.
Method: This cross-sectional study of vocabulary size included 56 DHH children between 8 and 60 months of age who were learning both ASL and spoken English and had hearing parents. English and ASL vocabulary were independently assessed via parent report checklists.
Results: ASL vocabulary size positively correlated with spoken English vocabulary size. Spoken English vocabulary sizes in the ASL-English bilingual DHH children in the present sample were comparable to those in previous reports of monolingual DHH children who were learning only English. ASL-English bilingual DHH children had total vocabularies (combining ASL and English) that were equivalent to same-age hearing monolingual children. Children with large ASL vocabularies were more likely to have spoken English vocabularies in the average range based on norms for hearing monolingual children.
Conclusions: Contrary to predictions often cited in the literature, acquisition of sign language does not harm spoken vocabulary acquisition. This retrospective, correlational study cannot determine whether there is a causal relationship between sign language and spoken language vocabulary acquisition, but if a causal relationship exists, the evidence here suggests that the effect would be positive. Bilingual DHH children have age-expected vocabularies when considering the entirety of their language skills. We found no evidence to support recommendations that families with DHH children avoid learning sign language. Rather, our findings show that children with early ASL exposure can develop age-appropriate vocabulary skills in both ASL and spoken English.
5. Šantić IŠ, Bonetti L. Language Intervention Instead of Speech Intervention for Children With Cochlear Implants. J Audiol Otol 2023;27:55-62. [PMID: 37073450] [PMCID: PMC10126584] [DOI: 10.7874/jao.2022.00584]
Abstract
Cochlear implants are a standard rehabilitation option for children with severe hearing loss or deafness, allowing access to speech sounds necessary for the development of spoken language. However, the speech-language outcomes of pediatric cochlear implant users vary widely and are not directly or exclusively linked to technology but to combinations of individual audiological, personal, technical, and habilitational factors. These combinations may not favor spoken language development, which may further be linked to the issue of prior insistence on spoken language learning and associated with a high risk of language deprivation. Here, we discuss the outcomes of cochlear implantation from a habilitative perspective and outline the efforts and resources necessary for the development of communication competence after cochlear implantation rather than the achievement of specific hearing, language, or speech skills that have limited socioemotional and educational contributions and do not guarantee an independent or productive life.
Affiliation(s)
- Ivana Šimić Šantić: Department of Speech and Language Pathology, Faculty of Education and Rehabilitation Sciences, University of Zagreb, Zagreb, Croatia
- Luka Bonetti: Department of Hearing Impairments, Faculty of Education and Rehabilitation Sciences, University of Zagreb, Zagreb, Croatia
6. Cui W, Wang S, Chen B, Fan G. White matter structural network alterations in congenital bilateral profound sensorineural hearing loss children: A graph theory analysis. Hear Res 2022;422:108521. [PMID: 35660126] [DOI: 10.1016/j.heares.2022.108521]
Abstract
Functional magnetic resonance imaging (fMRI) studies have revealed a functional reorganization in patients with sensorineural hearing loss (SNHL). The structural basis of these functional changes has also been investigated recently. Graph theory analysis brings a new understanding of the structural connectome and topological features in central neural system diseases. However, little is known about the structural network connectome changes in SNHL patients, especially in children. We explored the differences in topologic organization, rich-club organization, and structural connection between children with congenital bilateral profound SNHL and children with normal hearing under the age of three using graph theory analysis and probabilistic tractography. Compared with the normal-hearing (NH) group, the SNHL group showed no difference in global and nodal topological parameters. Increased structural connection strength was found in the right cortico-striatal-thalamus-cortical circuity. Decreased cross-hemisphere connections were found between the right precuneus and the left auditory cortex as well as the left subcortical regions. Rich-club organization analysis found increased local connectivity in the SNHL group. These results reveal structural reorganization after hearing deprivation in children with congenital bilateral profound SNHL.
Affiliation(s)
- Wenzhuo Cui: Department of Radiology, The First Affiliated Hospital of China Medical University, Shenyang, LN, China
- Shanshan Wang: Department of Radiology, The First Affiliated Hospital of China Medical University, Shenyang, LN, China
- Boyu Chen: Department of Radiology, The First Affiliated Hospital of China Medical University, Shenyang, LN, China
- Guoguang Fan: Department of Radiology, The First Affiliated Hospital of China Medical University, Shenyang, LN, China
7. Holmer E, Schönström K, Andin J. Associations Between Sign Language Skills and Resting-State Functional Connectivity in Deaf Early Signers. Front Psychol 2022;13:738866. [PMID: 35369269] [PMCID: PMC8975249] [DOI: 10.3389/fpsyg.2022.738866]
Abstract
The processing of a language involves a neural language network including temporal, parietal, and frontal cortical regions. This applies to spoken as well as signed languages. Previous research suggests that spoken language proficiency is associated with resting-state functional connectivity (rsFC) between language regions and other regions of the brain. Given the similarities in neural activation for spoken and signed languages, rsFC-behavior associations should also exist for sign language tasks. In this study, we explored the associations between rsFC and two types of linguistic skills in sign language: phonological processing skill and accuracy in elicited sentence production. Fifteen adult, deaf early signers were enrolled in a resting-state functional magnetic resonance imaging (fMRI) study. In addition to fMRI data, behavioral tests of sign language phonological processing and sentence reproduction were administered. Using seed-to-voxel connectivity analysis, we investigated associations between behavioral proficiency and rsFC from language-relevant nodes: bilateral inferior frontal gyrus (IFG) and posterior superior temporal gyrus (STG). Results showed that worse sentence processing skill was associated with stronger positive rsFC between the left IFG and left sensorimotor regions. Further, sign language phonological processing skill was associated with positive rsFC from right IFG to middle frontal gyrus/frontal pole although this association could possibly be explained by domain-general cognitive functions. Our findings suggest a possible connection between rsFC and developmental language outcomes in deaf individuals.
Affiliation(s)
- Emil Holmer: Linnaeus Centre HEAD, Swedish Institute for Disability Research, Department of Behavioural Sciences and Learning, Linköping University, Linköping, Sweden; Center for Medical Image Science and Visualization, Linköping, Sweden
- Josefine Andin: Linnaeus Centre HEAD, Swedish Institute for Disability Research, Department of Behavioural Sciences and Learning, Linköping University, Linköping, Sweden
8.
Abstract
The first 40 years of research on the neurobiology of sign languages (1960-2000) established that the same key left hemisphere brain regions support both signed and spoken languages, based primarily on evidence from signers with brain injury and, at the end of the 20th century, on evidence from emerging functional neuroimaging technologies (positron emission tomography and fMRI). Building on this earlier work, this review focuses on what we have learned about the neurobiology of sign languages in the last 15-20 years, what controversies remain unresolved, and directions for future research. Production and comprehension processes are addressed separately in order to capture whether and how output and input differences between sign and speech impact the neural substrates supporting language. In addition, the review includes aspects of language that are unique to sign languages, such as pervasive lexical iconicity, fingerspelling, linguistic facial expressions, and depictive classifier constructions. Summary sketches of the neural networks supporting sign language production and comprehension are provided with the hope that these will inspire future research as we begin to develop a more complete neurobiological model of sign language processing.
9. Kumar U, Keshri A, Mishra M. Alteration of brain resting-state networks and functional connectivity in prelingual deafness. J Neuroimaging 2021;31:1135-1145. [PMID: 34189809] [DOI: 10.1111/jon.12904]
Abstract
Background and Purpose: Early hearing loss causes several changes in the brain structure and function at multiple levels; these changes can be observed through neuroimaging. These changes are directly associated with sensory loss (hearing) and the acquisition of alternative communication strategies. Such plasticity changes in the brain might establish a different connectivity pattern with resting-state networks (RSNs) and other brain regions. We performed resting-state functional magnetic resonance imaging (rsfMRI) to evaluate these intrinsic modifications.
Methods: We used two methods to characterize the functional connectivity (FC) of RSN components in 20 prelingual deaf adults and 20 demographic-matched hearing adults. rsfMRI data were analyzed using independent component analysis (ICA) and region-of-interest seed-to-voxel correlation analysis.
Results: In ICA, we identified altered FC of RSNs in the deaf group. RSNs with altered FC were observed in higher visual, auditory, default mode, salience, and sensorimotor networks. The findings of seed-to-voxel correlation analysis suggested increased temporal coherence with other neural networks in the deaf group compared with the hearing control group.
Conclusion: These findings suggest a highly diverse resting-state connectivity pattern in prelingual deaf adults resulting from compensatory cross-modal plasticity that includes both auditory and nonauditory regions.
Affiliation(s)
- Uttam Kumar: Centre of Bio-Medical Research, Sanjay Gandhi Postgraduate Institute of Medical Sciences Campus, Lucknow, India
- Amit Keshri: Department of Neuro-otology, Sanjay Gandhi Postgraduate Institute of Medical Sciences Campus, Lucknow, India
- Mrutyunjaya Mishra: Department of Special Education (Hearing Impairments), Dr. Shakuntala Misra National Rehabilitation University, Lucknow, India
10. Quantitative EEG measures in profoundly deaf and normal hearing individuals while performing a vibrotactile temporal discrimination task. Int J Psychophysiol 2021;166:71-82. [PMID: 34023377] [DOI: 10.1016/j.ijpsycho.2021.05.007]
Abstract
Challenges in early oral language acquisition in profoundly deaf individuals have an impact on cognitive neurodevelopment. This has led to the exploration of alternative sound perception methods involving training of vibrotactile discrimination of sounds within the language spectrum. In particular, stimulus duration plays an important role in linguistic categorical perception. We comparatively evaluated vibrotactile temporal discrimination of sound and how specific training can modify the underlying electrical brain activity. Fifteen profoundly deaf (PD) and 15 normal-hearing (NH) subjects performed a vibrotactile oddball task with simultaneous EEG recording, before and after a short training period (5 one-hour sessions; in 2.5-3 weeks). The stimuli consisted of 700 Hz pure tones of different durations (target: long 500 ms; non-target: short 250 ms). The sound-wave stimuli were delivered by a small device worn on the right index finger. A similar behavioral training effect was observed in both groups, showing significant improvement in sound-duration discrimination. However, quantitative EEG measurements revealed distinct neurophysiological patterns characterized by higher and more diffuse delta band magnitudes in the PD group, together with a generalized decrement in absolute power in both groups that might reflect a facilitating process associated with learning. Furthermore, training-related changes were found in the beta-band in NH. These findings suggest that PD individuals have different cognitive adaptive mechanisms, which are not a mere amplification effect due to greater cortical excitability.
11.
Abstract
Natural sign languages of deaf communities are acquired on the same time scale as that of spoken languages if children have access to fluent signers providing input from birth. Infants are sensitive to linguistic information provided visually, and early milestones show many parallels. The modality may affect various areas of language acquisition; such effects include the form of signs (sign phonology), the potential advantage presented by visual iconicity, and the use of spatial locations to represent referents, locations, and movement events. Unfortunately, the vast majority of deaf children do not receive accessible linguistic input in infancy, and these children experience language deprivation. Negative effects on language are observed when first-language acquisition is delayed. For those who eventually begin to learn a sign language, earlier input is associated with better language and academic outcomes. Further research is especially needed with a broader diversity of participants.
Affiliation(s)
- Diane Lillo-Martin: Department of Linguistics, University of Connecticut, Storrs, Connecticut 06269-1145, USA; Haskins Laboratories, New Haven, Connecticut 06511, USA
- Jonathan Henner: Department of Specialized Education Services, University of North Carolina, Greensboro, North Carolina 27412, USA
12. Banaszkiewicz A, Bola Ł, Matuszewski J, Szczepanik M, Kossowski B, Mostowski P, Rutkowski P, Śliwińska M, Jednoróg K, Emmorey K, Marchewka A. The role of the superior parietal lobule in lexical processing of sign language: Insights from fMRI and TMS. Cortex 2020;135:240-254. [PMID: 33401098] [DOI: 10.1016/j.cortex.2020.10.025]
Abstract
There is strong evidence that neuronal bases for language processing are remarkably similar for sign and spoken languages. However, as meanings and linguistic structures of sign languages are coded in movement and space and decoded through vision, differences are also present, predominantly in occipitotemporal and parietal areas, such as superior parietal lobule (SPL). Whether the involvement of SPL reflects domain-general visuospatial attention or processes specific to sign language comprehension remains an open question. Here we conducted two experiments to investigate the role of SPL and the laterality of its engagement in sign language lexical processing. First, using unique longitudinal and between-group designs we mapped brain responses to sign language in hearing late learners and deaf signers. Second, using transcranial magnetic stimulation (TMS) in both groups we tested the behavioural relevance of SPL's engagement and its lateralisation during sign language comprehension. SPL activation in hearing participants was observed in the right hemisphere before and bilaterally after the sign language course. Additionally, after the course hearing learners exhibited greater activation in the occipital cortex and left SPL than deaf signers. TMS applied to the right SPL decreased accuracy in both hearing learners and deaf signers. Stimulation of the left SPL decreased accuracy only in hearing learners. Our results suggest that right SPL might be involved in visuospatial attention while left SPL might support phonological decoding of signs in non-proficient signers.
Affiliation(s)
- A Banaszkiewicz: Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- Ł Bola: Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- J Matuszewski: Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- M Szczepanik: Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- B Kossowski: Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- P Mostowski: Section for Sign Linguistics, Faculty of Polish Studies, University of Warsaw, Warsaw, Poland
- P Rutkowski: Section for Sign Linguistics, Faculty of Polish Studies, University of Warsaw, Warsaw, Poland
- M Śliwińska: Department of Psychology, University of York, Heslington, UK
- K Jednoróg: Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- K Emmorey: Laboratory for Language and Cognitive Neuroscience, San Diego State University, San Diego, USA
- A Marchewka: Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
13. Cheng Q, Silvano E, Bedny M. Sensitive periods in cortical specialization for language: insights from studies with Deaf and blind individuals. Curr Opin Behav Sci 2020;36:169-176. [PMID: 33718533] [PMCID: PMC7945734] [DOI: 10.1016/j.cobeha.2020.10.011]
Abstract
Studies with Deaf and blind individuals demonstrate that linguistic and sensory experiences during sensitive periods have potent effects on the neurocognitive basis of language. Native users of sign and spoken languages recruit similar fronto-temporal systems during language processing. By contrast, delays in sign language access impact proficiency and the neural basis of language. Analogously, early- but not late-onset blindness modifies the neural basis of language. People born blind recruit 'visual' areas during language processing, show reduced left-lateralization of language, and exhibit enhanced performance on some language tasks. Sensitive period plasticity in and outside fronto-temporal language systems shapes the neural basis of language.
Affiliation(s)
- Qi Cheng: University of California San Diego; University of Washington
- Emily Silvano: Federal University of Rio de Janeiro; Johns Hopkins University