1. Cardin V, Kremneva E, Komarova A, Vinogradova V, Davidenko T, Zmeykina E, Kopnin PN, Iriskhanova K, Woll B. Resting-state functional connectivity in deaf and hearing individuals and its link to executive processing. Neuropsychologia 2023; 185:108583. PMID: 37142052. DOI: 10.1016/j.neuropsychologia.2023.108583.
Abstract
Sensory experience shapes brain structure and function, and it is likely to influence the organisation of functional networks of the brain, including those involved in cognitive processing. Here we investigated the influence of early deafness on the organisation of resting-state networks of the brain and its relation to executive processing. We compared resting-state connectivity between deaf and hearing individuals across 18 functional networks and 400 ROIs. Our results showed significant group differences in connectivity between seeds of the auditory network and most large-scale networks of the brain, in particular the somatomotor and salience/ventral attention networks. When we investigated group differences in resting-state fMRI and their link to behavioural performance in executive function tasks (working memory, inhibition and switching), differences between groups were found in the connectivity of association networks of the brain, such as the salience/ventral attention and default-mode networks. These findings indicate that sensory experience not only influences the organisation of sensory networks, but also has a measurable impact on the organisation of association networks supporting cognitive processing. Overall, our findings suggest that different developmental pathways and functional organisation can support executive processing in the adult brain.
Affiliations
- Velia Cardin: Deafness, Cognition and Language Research Centre, UCL, London, UK
- Elena Kremneva: Department of Radiology, Research Center of Neurology, Moscow, Russia
- Anna Komarova: Galina Zaitseva Centre for Deaf Studies and Sign Language, Moscow, Russia; Language Department, Moscow State Linguistics University, Moscow, Russia
- Valeria Vinogradova: Deafness, Cognition and Language Research Centre, UCL, London, UK; Galina Zaitseva Centre for Deaf Studies and Sign Language, Moscow, Russia; School of Psychology, University of East Anglia, Norwich, UK
- Tatiana Davidenko: Galina Zaitseva Centre for Deaf Studies and Sign Language, Moscow, Russia
- Elina Zmeykina: Department of Radiology, Research Center of Neurology, Moscow, Russia; Department of Neurology, University Medical Center Göttingen, Germany
- Petr N Kopnin: Department of Neurorehabilitation and Physiotherapy, Research Center of Neurology, Moscow, Russia
- Kira Iriskhanova: Language Department, Moscow State Linguistics University, Moscow, Russia
- Bencie Woll: Deafness, Cognition and Language Research Centre, UCL, London, UK
2. Gremp MA, Deocampo JA, Walk AM, Conway CM. Visual sequential processing and language ability in children who are deaf or hard of hearing. Journal of Child Language 2019; 46:785-799. PMID: 30803455. PMCID: PMC6633907. DOI: 10.1017/s0305000918000569.
Abstract
This study investigated the role of sequential processing in spoken language outcomes for children who are deaf or hard of hearing (DHH), ages 5;3-11;4, by comparing them to children with typical hearing (TH), ages 6;3-9;7, on sequential learning and memory tasks involving easily nameable and difficult-to-name visual stimuli. Children who are DHH performed more poorly on the sequencing tasks with easily nameable stimuli, and performance on those tasks positively predicted receptive vocabulary scores. These results suggest that sequential learning and memory may underlie the delayed language skills of many children who are DHH. Implications for language development in children who are DHH are discussed.
3. Miozzo M, Petrova A, Fischer-Baum S, Peressotti F. Serial position encoding of signs. Cognition 2016; 154:69-80. DOI: 10.1016/j.cognition.2016.05.008.
4. Marschark M, Sarchet T, Trani A. Effects of Hearing Status and Sign Language Use on Working Memory. Journal of Deaf Studies and Deaf Education 2016; 21:148-155. PMID: 26755684. PMCID: PMC4886321. DOI: 10.1093/deafed/env070.
Abstract
Deaf individuals have been found to score lower than hearing individuals across a variety of memory tasks involving both verbal and nonverbal stimuli, particularly those requiring retention of serial order. Deaf individuals who are native signers, meanwhile, have been found to score higher on visual-spatial memory tasks than on verbal-sequential tasks and higher on some visual-spatial tasks than hearing nonsigners. However, hearing status and preferred language modality (signed or spoken) frequently are confounded in such studies. That situation is resolved in the present study by including deaf students who use spoken language and sign language interpreting students (hearing signers) as well as deaf signers and hearing nonsigners. Three complex memory span tasks revealed overall advantages for hearing signers and nonsigners over both deaf signers and deaf nonsigners on 2 tasks involving memory for verbal stimuli (letters). There were no differences among the groups on the task involving visual-spatial stimuli. The results are consistent with and extend recent findings concerning the effects of hearing status and language on memory and are discussed in terms of language modality, hearing status, and cognitive abilities among deaf and hearing individuals.
Affiliations
- Marc Marschark: National Technical Institute for the Deaf, Rochester Institute of Technology
- Thomastine Sarchet: National Technical Institute for the Deaf, Rochester Institute of Technology
5. Preexisting semantic representation improves working memory performance in the visuospatial domain. Mem Cognit 2016; 44:608-620. DOI: 10.3758/s13421-016-0585-z.
6. Marschark M, Spencer LJ, Durkin A, Borgna G, Convertino C, Machmer E, Kronenberger WG, Trani A. Understanding Language, Hearing Status, and Visual-Spatial Skills. Journal of Deaf Studies and Deaf Education 2015; 20:310-330. PMID: 26141071. PMCID: PMC4836709. DOI: 10.1093/deafed/env025.
Abstract
It is frequently assumed that deaf individuals have superior visual-spatial abilities relative to hearing peers and thus, in educational settings, they are often considered visual learners. There is some empirical evidence to support the former assumption, although it is inconsistent, and apparently none to support the latter. Three experiments examined visual-spatial and related cognitive abilities among deaf individuals who varied in their preferred language modality and use of cochlear implants (CIs) and hearing individuals who varied in their sign language skills. Sign language and spoken language assessments accompanied tasks involving visual-spatial processing, working memory, nonverbal logical reasoning, and executive function. Results were consistent with other recent studies indicating no generalized visual-spatial advantage for deaf individuals and suggested that their performance in that domain may be linked to the strength of their preferred language skills regardless of modality. Hearing individuals performed more strongly than deaf individuals on several visual-spatial and self-reported executive functioning measures, regardless of sign language skills or use of CIs. Findings are inconsistent with assumptions that deaf individuals are visual learners or are superior to hearing individuals across a broad range of visual-spatial tasks. Further, performance of deaf and hearing individuals on the same visual-spatial tasks was associated with differing cognitive abilities, suggesting that different cognitive processes may be involved in visual-spatial processing in these groups.
7. García-Orza J, Carratalá P. Sign recall by hearing signers: Evidences of dual coding. Journal of Cognitive Psychology 2012. DOI: 10.1080/20445911.2012.682054.
8. Hirshorn EA, Fernandez NM, Bavelier D. Routes to short-term memory indexing: lessons from deaf native users of American Sign Language. Cogn Neuropsychol 2012; 29:85-103. PMID: 22871205. DOI: 10.1080/02643294.2012.704354.
Abstract
Models of working memory (WM) have been instrumental in understanding foundational cognitive processes and sources of individual differences. However, current models cannot conclusively explain the consistent group differences between deaf signers and hearing speakers on a number of short-term memory (STM) tasks. Here we take the perspective that these results are not due to a temporal order-processing deficit in deaf individuals, but rather reflect different biases in how various types of memory cues are used to perform a given task. We further argue that the main driver of these shifts in bias is language modality (sign vs. speech) and the processing it affords, not deafness per se.
9. López-Crespo G, Daza MT, Méndez-López M. Visual working memory in deaf children with diverse communication modes: improvement by differential outcomes. Research in Developmental Disabilities 2012; 33:362-368. PMID: 22119682. DOI: 10.1016/j.ridd.2011.10.022.
Abstract
Although visual functions have been proposed to be enhanced in deaf individuals, empirical studies have not yet established clear evidence on this issue. The present study aimed to determine whether deaf children with diverse communication modes had superior visual memory and whether their performance was improved by the use of differential outcomes. Severely or profoundly deaf children who employed spoken Spanish, Spanish Sign Language (SSL), or both spoken Spanish and SSL as modes of communication were tested in a delayed matching-to-sample task for visual working memory assessment. Hearing controls were used to compare performance. Participants were tested under two conditions: differential outcomes and non-differential outcomes. The deaf groups with either oral or SSL modes of communication completed the task with less accuracy than the bilingual and hearing control children. In addition, the performance of all groups improved with the use of differential outcomes.
Affiliations
- Ginesa López-Crespo: Departamento de Psicología y Sociología, Facultad de Ciencias Sociales y Humanas, Universidad de Zaragoza, 44003 Teruel, Spain
10. Hall ML, Bavelier D. Short-term memory stages in sign vs. speech: the source of the serial span discrepancy. Cognition 2011; 120:54-66. PMID: 21450284. DOI: 10.1016/j.cognition.2011.02.014.
Abstract
Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contributions of the three main processing stages of short-term memory (perception, encoding, and recall) to this effect. The study factorially manipulates whether American Sign Language (ASL) or English is used for perception, memory encoding, and recall in hearing ASL-English bilinguals. Results indicate that using ASL during both perception and encoding contributes to the serial span discrepancy. Interestingly, performing recall in ASL slightly increased span, ruling out the view that signing is in general a poor choice for short-term memory. These results suggest that, despite the general equivalence of sign and speech in other memory domains, speech-based representations are better suited to perceiving and encoding a series of unrelated verbal items in serial order through the phonological loop. This work suggests that interpretation of performance on serial recall tasks in English may not translate straightforwardly to serial tasks in sign language.
Affiliations
- Matthew L Hall: Department of Psychology, UC San Diego, 9500 Gilman Dr., La Jolla, CA 92039-0109, USA
11. Rudner M, Andin J, Rönnberg J. Working memory, deafness and sign language. Scand J Psychol 2009; 50:495-505. DOI: 10.1111/j.1467-9450.2009.00744.x.
12. MacSweeney M, Brammer MJ, Waters D, Goswami U. Enhanced activation of the left inferior frontal gyrus in deaf and dyslexic adults during rhyming. Brain 2009; 132:1928-1940. PMID: 19467990. PMCID: PMC2702837. DOI: 10.1093/brain/awp129.
Abstract
Hearing developmental dyslexics and profoundly deaf individuals both have difficulties processing the internal structure of words (phonological processing) and learning to read. In hearing non-impaired readers, the development of phonological representations depends on audition. In hearing dyslexics, many argue, auditory processes may be impaired. In congenitally profoundly deaf individuals, auditory speech processing is essentially absent. Two separate literatures have previously reported enhanced activation in the left inferior frontal gyrus in both deaf and dyslexic adults when contrasted with hearing non-dyslexics during reading or phonological tasks. Here, we used a rhyme judgement task to compare adults from these two special populations to a hearing non-dyslexic control group. All groups were matched on non-verbal intelligence quotient, reading age and rhyme performance. Picture stimuli were used since this requires participants to generate their own phonological representations, rather than have them partially provided via text. By testing well-matched groups of participants on the same task, we aimed to establish whether previous literatures reporting differences between individuals with and without phonological processing difficulties have identified the same regions of differential activation in these two distinct populations. The data indicate greater activation in the deaf and dyslexic groups than in the hearing non-dyslexic group across a large portion of the left inferior frontal gyrus. This includes the pars triangularis, extending superiorly into the middle frontal gyrus and posteriorly to include the pars opercularis, and the junction with the ventral precentral gyrus. Within the left inferior frontal gyrus, there was variability between the two groups with phonological processing difficulties. The superior posterior tip of the left pars opercularis, extending into the precentral gyrus, was activated to a greater extent by deaf than dyslexic participants, whereas the superior posterior portion of the pars triangularis, extending into the ventral pars opercularis, was activated to a greater extent by dyslexic than deaf participants. Whether these regions play differing roles in compensating for poor phonological processing is not clear. However, we argue that our main finding of greater inferior frontal gyrus activation in both groups with phonological processing difficulties, in contrast to controls, suggests greater reliance on the articulatory component of speech during phonological processing when auditory processes are absent (deaf group) or impaired (dyslexic group). Thus, the brain appears to develop a similar solution to a processing problem that has different antecedents in these two populations.
Affiliations
- Mairéad MacSweeney: Institute of Cognitive Neuroscience, University College London, London WC1N 3AR, UK
13. Archbold S, Harris M, O'Donoghue G, Nikolopoulos T, White A, Richmond HL. Reading abilities after cochlear implantation: the effect of age at implantation on outcomes at 5 and 7 years after implantation. Int J Pediatr Otorhinolaryngol 2008; 72:1471-1478. PMID: 18703236. DOI: 10.1016/j.ijporl.2008.06.016.
Abstract
Objectives: The reading skills of deaf children have typically been delayed, and this delay has been found to increase with age. This study explored the reading ability of a large group of children who had received cochlear implants 7 years earlier and investigated the relationship between reading ability and age at implantation.
Methods: The reading ages of 105 children, all implanted before the age of 7 years and with onset of deafness before the age of 3, were assessed 5 and 7 years after implantation using the Edinburgh reading test. Net reading age was calculated as the difference between chronological age and reading age. Non-verbal intelligence was measured for a subset of 71 children using Raven's coloured progressive matrices; for this subset, the associations among non-verbal intelligence, age at implantation and reading ability were examined.
Results: There was a strong negative correlation, at both 5 and 7 years after implantation, between net reading score and age at implantation. In the subset of 71 children with IQ scores within the normal range, those implanted at or before 42 months had age-appropriate reading both 5 and 7 years post-implant; this was not the case for children implanted after 42 months. Reading progress at the two post-implant assessment intervals was highly related.
Conclusions: Age at implantation was a significant factor in the development of reading skills in this group. In children implanted below the age of 42 months, reading progress was in line with chronological age, which has not previously been the case for profoundly deaf children. With earlier implantation more common in present cohorts and improved technology, there is every reason to be optimistic about the influence of cochlear implantation on the development of reading skills in deaf children.
Affiliations
- Sue Archbold: The Ear Foundation, 83 Sherwin Road, Nottingham NG7 2FB, UK
14. Bavelier D, Newman AJ, Mukherjee M, Hauser P, Kemeny S, Braun A, Boutla M. Encoding, rehearsal, and recall in signers and speakers: shared network but differential engagement. Cerebral Cortex 2008; 18:2263-2274. PMID: 18245041. DOI: 10.1093/cercor/bhm248.
Abstract
Short-term memory (STM), or the ability to hold verbal information in mind for a few seconds, is known to rely on the integrity of a frontoparietal network of areas. Here, we used functional magnetic resonance imaging to ask whether a similar network is engaged when verbal information is conveyed through a visuospatial language, American Sign Language, rather than speech. Deaf native signers and hearing native English speakers performed a verbal recall task, where they had to first encode a list of letters in memory, maintain it for a few seconds, and finally recall it in the order presented. The frontoparietal network previously described as mediating STM in speakers was also observed in signers, and its recruitment appeared independent of language modality. This finding supports the view that signed and spoken STM rely on similar mechanisms. However, deaf signers and hearing speakers differentially engaged key structures of the frontoparietal network as the stages of STM unfolded. In particular, deaf signers relied to a greater extent than hearing speakers on passive memory storage areas during encoding and maintenance, but on executive process areas during recall. This work opens new avenues for understanding similarities and differences in STM performance in signers and speakers.
Affiliations
- D Bavelier: Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627-0268, USA
15. Marschark M. Intellectual functioning of deaf adults and children: Answers and questions. European Journal of Cognitive Psychology 2006. DOI: 10.1080/09541440500216028.
16. Boutla M, Supalla T, Newport EL, Bavelier D. Short-term memory span: insights from sign language. Nat Neurosci 2004; 7:997-1002. PMID: 15311279. PMCID: PMC2945821. DOI: 10.1038/nn1298.
Abstract
Short-term memory (STM), or the ability to hold information in mind for a few seconds, is thought to be limited in its capacity to about 7 ± 2 items. Notably, the average STM capacity when using American Sign Language (ASL) rather than English is only 5 ± 1 items. Here we show that, contrary to previous interpretations, this difference cannot be attributed to phonological factors, item duration or reduced memory abilities in deaf people. We also show that, despite this difference in STM span, hearing speakers and deaf ASL users have comparable working memory resources during language use, indicating similar abilities to maintain and manipulate linguistic information. The shorter STM span in ASL users therefore confirms the view that the spoken span of 7 ± 2 is an exception, probably owing to the reliance of speakers on auditory-based rather than visually based representations in linguistic STM, and calls for adjustments in the norms used with deaf individuals.
Affiliations
- Mrim Boutla: Department of Brain and Cognitive Sciences, University of Rochester, Rochester, New York 14627, USA
17. Marschark M. Memory for language in deaf adults and children. Scandinavian Audiology Supplementum 1999; 49:87-92. PMID: 10209782. DOI: 10.1080/010503998420702.
Abstract
Cognitive psychologists have known for a long time that language and memory are intimately related in people who can hear. Why should the situation be any different for deaf children and deaf adults? This article considers the results of previous studies and some new findings in examining the possible impact of spoken language and sign language fluencies/preferences on the structure and process of memory in deaf individuals. Current evidence suggests that there are some differences in the organization of long-term memory in deaf as compared to hearing people, but no one has yet demonstrated such differences to be so large that they qualitatively or quantitatively affect learning in any real sense. In contrast, there is now abundant evidence to suggest that variation in spoken language abilities has a direct impact on memory span, and perhaps on working memory more generally. These findings are discussed in terms of their implications for the education of students who are deaf or hard of hearing.
Affiliations
- M Marschark: National Technical Institute for the Deaf, Rochester Institute of Technology, New York, USA