1. Fingerspelling and Its Role in Translanguaging. Languages (Basel, Switzerland) 2022; 7:278. PMID: 37920277; PMCID: PMC10622114; DOI: 10.3390/languages7040278.
Abstract
Fingerspelling is a critical component of many sign languages. This manual representation of orthographic code is one key way in which signers engage in translanguaging, drawing from all of their linguistic and semiotic resources to support communication. Translanguaging in bimodal bilinguals is unique because it involves drawing from languages in different modalities, namely a signed language like American Sign Language and a spoken language like English (or its written form). Fingerspelling can be seen as a unique product of the unified linguistic system that translanguaging theories posit, as it blends features of both sign and print. The goals of this paper are twofold: to integrate existing research on fingerspelling in order to characterize it as a cognitive-linguistic phenomenon and to discuss the role of fingerspelling in translanguaging and communication. We will first review and synthesize research from linguistics and cognitive neuroscience to summarize our current understanding of fingerspelling, its production, comprehension, and acquisition. We will then discuss how fingerspelling relates to translanguaging theories and how it can be incorporated into translanguaging practices to support literacy and other communication goals.

2. Contribution of Lexical Quality and Sign Language Variables to Reading Comprehension. Journal of Deaf Studies and Deaf Education 2022; 27:355-372. PMID: 35775152; DOI: 10.1093/deafed/enac018.
Abstract
The lexical quality hypothesis proposes that the quality of phonological, orthographic, and semantic representations impacts reading comprehension. In Study 1, we evaluated the contributions of lexical quality to reading comprehension in 97 deaf and 98 hearing adults matched for reading ability. While phonological awareness was a strong predictor for hearing readers, for deaf readers, orthographic precision and semantic knowledge, not phonology, predicted reading comprehension (assessed by two different tests). For deaf readers, the architecture of the reading system adapts by shifting reliance from (coarse-grained) phonological representations to high-quality orthographic and semantic representations. In Study 2, we examined the contribution of American Sign Language (ASL) variables to reading comprehension in 83 deaf adults. Fingerspelling (FS) and ASL comprehension skills predicted reading comprehension. We suggest that FS might reinforce orthographic-to-semantic mappings and that sign language comprehension may serve as a linguistic basis for the development of skilled reading in deaf signers.

3. Cross-modal and cross-language activation in bilinguals reveals lexical competition even when words or signs are unheard or unseen. Proc Natl Acad Sci U S A 2022; 119:e2203906119. PMID: 36037359; PMCID: PMC9457174; DOI: 10.1073/pnas.2203906119.
Abstract
We exploit the phenomenon of cross-modal, cross-language activation to examine the dynamics of language processing. Previous within-language work showed that seeing a sign coactivates phonologically related signs, just as hearing a spoken word coactivates phonologically related words. In this study, we conducted a series of eye-tracking experiments using the visual world paradigm to investigate the time course of cross-language coactivation in hearing bimodal bilinguals (Spanish-Spanish Sign Language) and unimodal bilinguals (Spanish-Basque). The aim was to gauge whether (and how) seeing a sign could coactivate words and, conversely, how hearing a word could coactivate signs, and how such cross-language coactivation patterns differ from within-language coactivation. The results revealed cross-language, cross-modal activation in both directions. Furthermore, comparison with previous findings of within-language lexical coactivation for spoken and signed language showed how the impact of temporal structure changes in different modalities. Spoken word activation follows the temporal structure of that word only when the word itself is heard; for signs, the temporal structure of the sign does not govern the time course of lexical access (location coactivation precedes handshape coactivation), even when the sign is seen. We provide evidence that, instead, this pattern of activation is motivated by how common the signs' sublexical units are in the lexicon. These results reveal the interaction between the perceptual properties of the explicit signal and structural linguistic properties. Examining languages across modalities illustrates how this interaction impacts language processing.

4. Bi-directional cross-language activation in Chinese Sign Language (CSL)-Chinese bimodal bilinguals. Acta Psychol (Amst) 2022; 229:103693. PMID: 35933798; DOI: 10.1016/j.actpsy.2022.103693.
Abstract
In bilingual word recognition, cross-language activation has been found in unimodal bilinguals (e.g., Chinese-English bilinguals) and bimodal bilinguals (e.g., American Sign Language-English bilinguals). However, it remains unclear how signs' phonological parameters, spoken words' orthographic and phonological representations, and language proficiency affect cross-language activation in bimodal bilinguals. To address these issues, we recruited deaf Chinese Sign Language (CSL)-Chinese bimodal bilinguals as participants. We conducted two experiments with the implicit priming paradigm and the semantic relatedness decision task. Experiment 1 first showed cross-language activation from Chinese to CSL, and the CSL words' phonological parameters affected the cross-language activation. Experiment 2 further revealed the inverse: cross-language activation from CSL to Chinese. The Chinese words' orthographic and phonological representations played a similar role in the cross-language activation. Moreover, a comparison between Experiments 1 and 2 indicated that language proficiency influenced cross-language activation. The findings are discussed in relation to the Bilingual Interactive Activation Plus (BIA+) model, the deaf BIA+ model, and the Bilingual Language Interaction Network for Comprehension of Speech (BLINCS) model.

5. Ongoing Sign Processing Facilitates Written Word Recognition in Deaf Native Signing Children. Front Psychol 2022; 13:917700. PMID: 35992405; PMCID: PMC9390089; DOI: 10.3389/fpsyg.2022.917700.
Abstract
Signed and written languages are intimately related in proficient signing readers. Here, we tested whether deaf native signing beginning readers are able to make rapid use of ongoing sign language to facilitate recognition of written words. Deaf native signing children (mean age 10 years, 7 months) received prime-target pairs with sign onsets as primes and written words as targets. In a control group of hearing children (matched in reading ability to the deaf children; mean age 8 years, 8 months), spoken word onsets were instead used as primes. Targets (written German words) were completions either of the German signs or of the spoken word onsets. The participants' task was to decide whether the target word was a possible German word. Sign onsets facilitated processing of written targets in deaf children similarly to how spoken word onsets facilitated processing of written targets in hearing children. In both groups, priming elicited similar effects in the simultaneously recorded event-related potentials (ERPs), starting as early as 200 ms after the onset of the written target. These results suggest that beginning readers can use ongoing lexical processing in their native language, be it signed or spoken, to facilitate written word recognition. We conclude that intimate interactions between sign and written language might in turn facilitate reading acquisition in deaf beginning readers.

6. Predictive Processing in Sign Languages: A Systematic Review. Front Psychol 2022; 13:805792. PMID: 35496220; PMCID: PMC9047358; DOI: 10.3389/fpsyg.2022.805792.
Abstract
The objective of this article was to review existing research to assess the evidence for predictive processing (PP) in sign language, the conditions under which it occurs, and the effects of language mastery (sign language as a first language, sign language as a second language, bimodal bilingualism) on the neural bases of PP. This review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework. We searched peer-reviewed electronic databases (SCOPUS, Web of Science, PubMed, ScienceDirect, and EBSCOhost) and gray literature (dissertations in ProQuest). We also searched the reference lists of records selected for the review and forward citations to identify all relevant publications. We searched for records based on five criteria (original work, peer-reviewed, published in English, research topic related to PP or neural entrainment, and human sign language processing). To reduce the risk of bias, two of the authors, with expertise in sign language processing and a variety of research methods, reviewed the results. Disagreements were resolved through extensive discussion. In the final review, 7 records were included, of which 5 were published articles and 2 were dissertations. The reviewed records provide evidence for PP in signing populations, although the underlying mechanism in the visual modality is not clear. The reviewed studies addressed the motor simulation proposals, the neural basis of PP, and the development of PP. All studies used dynamic sign stimuli. Most of the studies focused on semantic prediction. The question of the mechanism for the interaction between one's sign language competence (L1 vs. L2 vs. bimodal bilingual) and PP in the manual-visual modality remains unclear, primarily due to the scarcity of participants with varying degrees of language dominance. There is a paucity of evidence for PP in sign languages, especially for frequency-based, phonetic (articulatory), and syntactic prediction. However, studies published to date indicate that Deaf native/native-like L1 signers predict linguistic information during sign language processing, suggesting that PP is an amodal property of language processing.

7. Lexical Competition Without Phonology: Masked Orthographic Neighbor Priming With Deaf Readers. Journal of Deaf Studies and Deaf Education 2022; 27:151-165. PMID: 34877600; DOI: 10.1093/deafed/enab040.
Abstract
Skilled reading is thought to rely on well-specified lexical representations that compete during visual word recognition. The establishment of these lexical representations is assumed to be driven by phonology. To test the role of phonology, we examined the prime lexicality effect (PLE), an index of lexical competition, in signing deaf (N = 28) and hearing (N = 28) adult readers of Hungarian matched in age and education. We found no PLE for deaf readers even when reading skills were controlled for. Surprisingly, the hearing controls also showed a reduced PLE; however, the effect was modulated by reading skill. More skilled hearing readers showed a PLE, while more skilled deaf readers did not. These results suggest that phonology contributes to lexical competition; however, high-quality lexical representations are not necessarily built through phonology in deaf readers.

8. Language control in bimodal bilinguals: Evidence from ERPs. Neuropsychologia 2021; 161:108019. PMID: 34487737; DOI: 10.1016/j.neuropsychologia.2021.108019.
Abstract
It is currently unclear to what degree language control, which minimizes non-target language interference and increases the probability of selecting target-language words, is similar for sign-speech (bimodal) bilinguals and spoken language (unimodal) bilinguals. To further investigate the nature of language control processes in bimodal bilinguals, we conducted the first event-related potential (ERP) language switching study with hearing American Sign Language (ASL)-English bilinguals. The results showed a pattern that has not been observed in any unimodal language switching study: a switch-related positivity over anterior sites and a switch-related negativity over posterior sites during ASL production in both early and late time windows. No such pattern was found during English production. We interpret these results as evidence that bimodal bilinguals uniquely engage language control at the level of output modalities.

9. Motor Cortex Causally Contributes to Vocabulary Translation following Sensorimotor-Enriched Training. J Neurosci 2021; 41:8618-8631. PMID: 34429380; DOI: 10.1523/jneurosci.2249-20.2021.
Abstract
The role of the motor cortex in perceptual and cognitive functions is highly controversial. Here, we investigated the hypothesis that the motor cortex can be instrumental for translating foreign language vocabulary. Human participants of both sexes were trained on foreign language (L2) words and their native language translations over 4 consecutive days. L2 words were accompanied by complementary gestures (sensorimotor enrichment) or pictures (sensory enrichment). Following training, participants translated the auditorily presented L2 words that they had learned. During translation, repetitive transcranial magnetic stimulation was applied bilaterally to a site within the primary motor cortex (Brodmann area 4) located in the vicinity of the arm functional compartment. Responses within the stimulated motor region have previously been found to correlate with behavioral benefits of sensorimotor-enriched L2 vocabulary learning. Compared to sham stimulation, effective perturbation by repetitive transcranial magnetic stimulation slowed down the translation of sensorimotor-enriched L2 words, but not sensory-enriched L2 words. This finding suggests that sensorimotor-enriched training induced changes in L2 representations within the motor cortex, which in turn facilitated the translation of L2 words. The motor cortex may play a causal role in precipitating sensorimotor-based learning benefits, and may directly aid in remembering the native language translations of foreign language words following sensorimotor-enriched training. These findings support multisensory theories of learning while challenging reactivation-based theories.
SIGNIFICANCE STATEMENT: Despite the potential for sensorimotor enrichment to serve as a powerful tool for learning in many domains, its underlying brain mechanisms remain largely unexplored. Using transcranial magnetic stimulation and a foreign language (L2) learning paradigm, we found that sensorimotor-enriched training can induce changes in L2 representations within the motor cortex, which in turn causally facilitate the translation of L2 words. The translation of recently acquired L2 words may therefore rely not only on auditory information stored in memory or on modality-independent L2 representations, but also on the sensorimotor context in which the words have been experienced.

10. The organization of the American Sign Language lexicon: Comparing one- and two-parameter ERP phonological priming effects across tasks. Brain and Language 2021; 218:104960. PMID: 33940343; PMCID: PMC8543839; DOI: 10.1016/j.bandl.2021.104960.
Abstract
We used phonological priming and ERPs to investigate the organization of the lexicon in American Sign Language. Across go/no-go repetition detection and semantic categorization tasks, targets in related pairs that shared handshape and location elicited smaller N400s than targets in unrelated pairs, indicative of facilitated processing. Handshape-related targets also elicited smaller N400s than unrelated targets, but only in the repetition task. The location priming effect reversed direction across tasks, with slightly larger amplitude N400s for targets in related versus unrelated pairs in the semantic task, indicative of interference. These patterns imply that handshape and location play different roles during sign recognition and that there is a hierarchical organization for the sign lexicon. Similar to interactive-activation models of word recognition, we argue for differentiation between sublexical facilitation and lexical competition. Lexical competition is primarily driven by the location parameter and is more engaged when identification of single lexico-semantic entries is required.

11. Automaticity of lexical access in deaf and hearing bilinguals: Cross-linguistic evidence from the color Stroop task across five languages. Cognition 2021; 212:104659. PMID: 33798950; DOI: 10.1016/j.cognition.2021.104659.
Abstract
The well-known Stroop interference effect has been instrumental in revealing the highly automated nature of lexical processing as well as providing new insights to the underlying lexical organization of first and second languages within proficient bilinguals. The present cross-linguistic study had two goals: 1) to examine Stroop interference for dynamic signs and printed words in deaf ASL-English bilinguals who report no reliance on speech or audiological aids; 2) to compare Stroop interference effects in several groups of bilinguals whose two languages range from very distinct to very similar in their shared orthographic patterns: ASL-English bilinguals (very distinct), Chinese-English bilinguals (low similarity), Korean-English bilinguals (moderate similarity), and Spanish-English bilinguals (high similarity). Reaction time and accuracy were measured for the Stroop color naming and word reading tasks, for congruent and incongruent color font conditions. Results confirmed strong Stroop interference for both dynamic ASL stimuli and English printed words in deaf bilinguals, with stronger Stroop interference effects in ASL for deaf bilinguals who scored higher in a direct assessment of ASL proficiency. Comparison of the four groups of bilinguals revealed that the same-script bilinguals (Spanish-English bilinguals) exhibited significantly greater Stroop interference effects for color naming than the other three bilingual groups. The results support three conclusions. First, Stroop interference effects are found for both signed and spoken languages. Second, contrary to some claims in the literature about deaf signers who do not use speech being poor readers, deaf bilinguals' lexical processing of both signs and written words is highly automated. Third, cross-language similarity is a critical factor shaping bilinguals' experience of Stroop interference in their two languages. This study represents the first comparison of both deaf and hearing bilinguals on the Stroop task, offering a critical test of theories about bilingual lexical access and cognitive control.

12. Language development in deaf bilinguals: Deaf middle school students co-activate written English and American Sign Language during lexical processing. Cognition 2021; 211:104642. PMID: 33752155; DOI: 10.1016/j.cognition.2021.104642.
Abstract
Bilinguals, both hearing and deaf, activate multiple languages simultaneously even in contexts that require only one language. To date, the point in development at which bilingual signers experience cross-language activation of a signed and a spoken language remains unknown. We investigated the processing of written words by ASL-English bilingual deaf middle school students. Deaf bilinguals were faster to respond to English word pairs with phonologically related translations in ASL than to English word pairs with unrelated translations, but no difference was found for hearing controls with no knowledge of ASL. The results indicate that co-activation of signs and written words is not the outcome of years of bilingual experience, but instead characterizes bilingual language development.

13. The neurocognitive basis of skilled reading in prelingually and profoundly deaf adults. Language and Linguistics Compass 2021; 15:e12407. PMID: 34306178; PMCID: PMC8302003; DOI: 10.1111/lnc3.12407.
Abstract
Deaf individuals have unique sensory and linguistic experiences that influence how they read and become skilled readers. This review presents our current understanding of the neurocognitive underpinnings of reading skill in deaf adults. Key behavioural and neuroimaging studies are integrated to build a profile of skilled adult deaf readers and to examine how changes in visual attention and reduced access to auditory input and phonology shape how they read both words and sentences. Crucially, the behaviours, processes, and neural circuitry of deaf readers are compared to those of hearing readers with similar reading ability to help identify alternative pathways to reading success. Overall, sensitivity to orthographic and semantic information is comparable for skilled deaf and hearing readers, but deaf readers rely less on phonology and show greater engagement of the right hemisphere in visual word processing. During sentence reading, deaf readers process visual word forms more efficiently and may have a greater reliance on and altered connectivity to semantic information compared to their hearing peers. These findings highlight the plasticity of the reading system and point to alternative pathways to reading success.

14. Cross-linguistic metaphor priming in ASL-English bilinguals: Effects of the Double Mapping Constraint. Sign Language and Linguistics 2020; 23:96-111. PMID: 33994844; PMCID: PMC8115326; DOI: 10.1075/sll.00045.sch.
Abstract
Meir's (2010) Double Mapping Constraint (DMC) states that the use of iconic signs in metaphors is restricted to signs that preserve the structural correspondence between the articulators and the concrete source domain and between the concrete and metaphorical domains. We investigated ASL signers' comprehension of English metaphors whose translations complied with the DMC (Communication collapsed during the meeting) or violated the DMC (The acid ate the metal). Metaphors were preceded by the ASL translation of the English verb, an unrelated sign, or a still video. Participants made sensibility judgments. Response times (RTs) were faster for DMC-compliant sentences with verb primes compared to unrelated primes or the still baseline. RTs for DMC-violation sentences were longer when preceded by verb primes. We propose that the structured iconicity of the ASL verbs primed the semantic features involved in the iconic mapping, and that these primed semantic features facilitated comprehension of DMC-compliant metaphors and slowed comprehension of DMC-violation metaphors.

15. Lexical selection in bimodal bilinguals: ERP evidence from picture-word interference. Language, Cognition and Neuroscience 2020; 36:840-853. PMID: 34485589; PMCID: PMC8411899; DOI: 10.1080/23273798.2020.1821905.
Abstract
The picture-word interference (PWI) paradigm and ERPs were used to investigate whether lexical selection in deaf and hearing ASL-English bilinguals occurs via lexical competition or whether the response exclusion hypothesis (REH) for PWI effects is supported. The REH predicts that semantic interference should not occur for bimodal bilinguals because sign and word responses do not compete within an output buffer. Bimodal bilinguals named pictures in ASL, preceded by either a translation-equivalent, semantically related, or unrelated English written word. In both the translation and semantically related conditions, bimodal bilinguals showed facilitation effects: reduced RTs and N400 amplitudes for related compared to unrelated prime conditions. We also observed an unexpected focal left anterior positivity that was stronger in the translation condition, which we speculate may be due to articulatory priming. Overall, the results support the REH and models of bilingual language production that assume lexical selection occurs without competition between languages.

16. Picture-naming in American Sign Language: an electrophysiological study of the effects of iconicity and structured alignment. Language, Cognition and Neuroscience 2020; 36:199-210. PMID: 33732747; PMCID: PMC7959108; DOI: 10.1080/23273798.2020.1804601.
Abstract
A picture-naming task and ERPs were used to investigate effects of iconicity and visual alignment between signs and pictures in American Sign Language (ASL). For iconic signs, half the pictures visually overlapped with phonological features of the sign (e.g., the fingers of CAT align with a picture of a cat with prominent whiskers), while half did not (whiskers are not shown). Iconic signs were produced numerically faster than non-iconic signs and were associated with larger N400 amplitudes, akin to concreteness effects. Pictures aligned with iconic signs were named faster than non-aligned pictures, and there was a reduction in N400 amplitude. No behavioral effects were observed for the control group (English speakers). We conclude that sensory-motoric semantic features are represented more robustly for iconic than non-iconic signs (eliciting a concreteness-like N400 effect) and visual overlap between pictures and the phonological form of iconic signs facilitates lexical retrieval (eliciting a reduced N400).

17. How Bilingualism Contributes to Healthy Development in Deaf Children: A Public Health Perspective. Matern Child Health J 2020; 24:1330-1338. PMID: 32632844; DOI: 10.1007/s10995-020-02976-6.
Abstract
The aim of this article is to increase awareness of language practices in the deaf community that affect communication needs and health outcomes, focusing particularly on the prevalence of bilingualism among deaf adults. Language deprivation and poor health outcomes in the deaf population are risks that cannot be addressed solely by hearing intervention. We propose that bilingualism acts as a protective measure to minimize the health risks faced by deaf individuals. Provision of culturally and linguistically appropriate services to deaf stakeholders, and particularly hearing families of deaf children, requires familiarity with the developmental and social ramifications of bilingualism.

18. Sign phonological parameters modulate parafoveal preview effects in deaf readers. Cognition 2020; 201:104286. PMID: 32521285; DOI: 10.1016/j.cognition.2020.104286.
Abstract
Research has found that deaf readers unconsciously activate sign translations of written words while reading. However, the ways in which different sign phonological parameters associated with these sign translations tie into reading processes have received little attention in the literature. In this study on Chinese reading, we used a parafoveal preview paradigm to investigate how four different types of sign phonologically related preview affect reading processes in adult deaf signers of Hong Kong Sign Language (HKSL). The four types of sign phonologically related preview-target pair were: (1) pairs with HKSL translations that overlapped in three parameters (handshape, location, and movement); (2) pairs that overlapped only in handshape and location; (3) pairs that overlapped only in handshape and movement; and (4) pairs that overlapped only in location and movement. Results showed that the handshape parameter was of particular importance, as only sign translation pairs that had handshape among their overlapping sign phonological parameters led to early sign activation. Furthermore, we found that, compared to control previews, deaf readers took longer to read targets when the sign translation previews overlapped with targets in either handshape and movement or handshape, movement, and location. In contrast, fixation times on targets were shorter when previews and targets overlapped in location and any single additional parameter (either handshape or movement). These results indicate that the phonological parameters of handshape, location, and movement are activated via orthography during Chinese reading and can have different effects on parafoveal processing in deaf signers of HKSL.
|
19
|
Abstract
The study of deaf users of signed languages, who often experience delays in primary language (L1) acquisition, offers a unique opportunity to examine the effects of aging on the processing of an L1 acquired under delayed or protracted development. A cohort of 107 congenitally deaf adult signers aged 45-85 years who were exposed to American Sign Language (ASL) in infancy, early childhood, or late childhood was tested using an ASL sentence repetition test. Participants repeated 20 sentences that gradually increased in length and complexity. Logistic mixed-effects regression with the variables of chronological age (CA) and age of acquisition (AoA) was used to assess sentence repetition accuracy. Results showed that CA was a significant predictor, with increased age associated with a decreased likelihood of reproducing a sentence correctly (odds ratio [OR] = 0.56, p = .010). In addition, effects of AoA were observed. Relative to native deaf signers, those who acquired ASL in early childhood were less likely to reproduce a sentence successfully (OR = 0.42, p = .003), as were those who learned ASL in late childhood (OR = 0.27, p < .001). These data show that aging affects verbatim recall in deaf users of ASL and that the age of sign language acquisition has a significant and lasting effect on repetition ability, even after decades of sign language use, providing evidence for life-span continuity of early-life effects. (PsycInfo Database Record (c) 2020 APA, all rights reserved).
|
20
|
Abstract
Seeing an object is a natural source of learning about the object's configuration. Here we show that language can also shape our knowledge of visual objects. We investigated sign language, which enables deaf individuals to communicate through hand movements with as much expressive power as any other natural language. A few signs represent objects in a specific orientation. Sign language users (signers) recognized visual objects faster when they were oriented as in the sign, and this match in orientation elicited specific brain responses in signers, as measured by event-related potentials (ERPs). Further analyses suggested that signers' responsiveness to object orientation derived from changes in visual object representations induced by the signs. Our results also show that language facilitates discrimination between objects of the same kind (e.g., different cars), an effect never before reported for spoken languages. By focusing on sign language, we could better characterize the impact of language, a uniquely human ability, on visual object processing.
|
21
|
Second language acquisition of American Sign Language influences co-speech gesture production. BILINGUALISM (CAMBRIDGE, ENGLAND) 2020; 23:473-482. [PMID: 32733161 PMCID: PMC7392225] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
Previous work indicates that 1) adults with native sign language experience produce more manual co-speech gestures than monolingual non-signers, and 2) one year of ASL instruction increases gesture production in adults, but not enough to differentiate them from non-signers. To elucidate these effects, we asked early ASL-English bilinguals, fluent late second language (L2) signers (≥ 10 years of experience signing), and monolingual non-signers to retell a story depicted in cartoon clips to a monolingual partner. Early and L2 signers produced manual gestures at higher rates compared to non-signers, particularly iconic gestures, and used a greater variety of handshapes. These results indicate susceptibility of the co-speech gesture system to modification by extensive sign language experience, regardless of the age of acquisition. L2 signers produced more ASL signs and more handshape varieties than early signers, suggesting less separation between the ASL lexicon and the co-speech gesture system for L2 signers.
|
22
|
Studying bilingual learners and users of spoken and signed languages: A neuro-cognitive approach. PSYCHOLOGY OF LEARNING AND MOTIVATION 2020. [DOI: 10.1016/bs.plm.2020.03.002] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/24/2022]
|
23
|
Neurophysiological Correlates of Frequency, Concreteness, and Iconicity in American Sign Language. NEUROBIOLOGY OF LANGUAGE (CAMBRIDGE, MASS.) 2020; 1:249-267. [PMID: 33043298 PMCID: PMC7544239 DOI: 10.1162/nol_a_00012] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 09/07/2019] [Accepted: 04/16/2020] [Indexed: 05/21/2023]
Abstract
To investigate possible universal and modality-specific factors that influence the neurophysiological response during lexical processing, we recorded event-related potentials while a large group of deaf adults (n = 40) viewed 404 signs in American Sign Language (ASL) that varied in ASL frequency, concreteness, and iconicity. Participants performed a go/no-go semantic categorization task (does the sign refer to people?) on video clips of ASL signs (clips began with the signer's hands at rest). Linear mixed-effects regression models were fit with per-participant, per-trial, and per-electrode data, allowing us to identify unique effects of each lexical variable. We observed an early effect of frequency (greater negativity for less frequent signs) beginning at 400 ms post-video onset at anterior sites, which we interpreted as reflecting form-based lexical processing. This effect was followed by a more widely distributed posterior response that we interpreted as reflecting lexical-semantic processing. Paralleling spoken language, more concrete signs elicited greater negativities, beginning 600 ms post-video onset with a wide scalp distribution. Finally, there were no effects of iconicity (except for a weak effect in the latest epochs, 1,000-1,200 ms), suggesting that iconicity does not modulate the neural response during sign recognition. Despite the perceptual and sensorimotor differences between signed and spoken languages, the overall results indicate that very similar neurophysiological processes underlie lexical access for both signs and words.
|
24
|
What is the Source of Bilingual Cross-Language Activation in Deaf Bilinguals? JOURNAL OF DEAF STUDIES AND DEAF EDUCATION 2019; 24:356-365. [PMID: 31398721 DOI: 10.1093/deafed/enz024] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/09/2019] [Revised: 04/16/2019] [Accepted: 05/12/2019] [Indexed: 06/10/2023]
Abstract
When deaf bilinguals are asked to make semantic similarity judgments of two written words, their responses are influenced by the sublexical relationship of the signed language translations of the target words. This study investigated whether the observed effects of American Sign Language (ASL) activation on English print depend on (a) an overlap in the syllabic structure of the signed translations or (b) initialization, an effect of contact between ASL and English that has resulted in a direct representation of English orthographic features in ASL sublexical form. Results demonstrate that neither condition is required for, or enhances, effects of cross-language activation. The experimental outcomes indicate that deaf bilinguals discover the optimal mapping between their two languages in a manner that is not constrained by privileged sublexical associations.
|
25
|
Back to the future? How Chinese-English bilinguals switch between front and back orientation for time. Neuroimage 2019; 203:116180. [PMID: 31520745 DOI: 10.1016/j.neuroimage.2019.116180] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/04/2019] [Revised: 09/05/2019] [Accepted: 09/08/2019] [Indexed: 10/26/2022] Open
Abstract
The ability to conceive of time is a cornerstone of human cognition. It is unknown, however, whether time conceptualisation in bilinguals differs depending on the language of operation. Whilst both Chinese and English cultures associate the future with the front space, some temporal expressions in Chinese involve a configuration reversal for historical reasons. For instance, Chinese refers to the day after tomorrow using the spatiotemporal metaphor hou-tian ('back-day') and to the day before yesterday using qian-tian ('front-day'). Here, we show that native metaphors interfere with time conceptualisation when bilinguals operate in their second language. We asked Chinese-English bilinguals to indicate whether an auditory stimulus depicted a day of the week either one or two days away from the present day, irrespective of whether it referred to the past or the future, and ignoring whether it was presented through loudspeakers situated in the back or the front space. Stimulus configurations incongruent with the spatiotemporal metaphors of Chinese (e.g., "Friday" presented in front of the participant during a session held on a Wednesday) were conceptually more challenging than congruent configurations (e.g., the same stimulus presented behind them), as indexed by N400 modulations of event-related brain potentials. The same pattern was obtained with days or years as stimuli but, surprisingly, only when participants operated in English, not in Chinese. We contend that the task was easier and less prone to induce cross-language activation when conducted in the native language. We thus show that, when operating in their second language, bilinguals unconsciously retrieve irrelevant native language representations that shape time conceptualisation in real time.
|
26
|
ERP Evidence for Co-Activation of English Words during Recognition of American Sign Language Signs. Brain Sci 2019; 9:E148. [PMID: 31234356 PMCID: PMC6627215 DOI: 10.3390/brainsci9060148] [Citation(s) in RCA: 16] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2019] [Revised: 06/18/2019] [Accepted: 06/20/2019] [Indexed: 11/17/2022] Open
Abstract
Event-related potentials (ERPs) were used to investigate co-activation of English words during recognition of American Sign Language (ASL) signs. Deaf and hearing signers viewed pairs of ASL signs and judged their semantic relatedness. Half of the semantically unrelated signs had English translations that shared an orthographic and phonological rime (e.g., BAR-STAR) and half did not (e.g., NURSE-STAR). Classic N400 and behavioral semantic priming effects were observed in both groups. For hearing signers, targets in sign pairs with English rime translations elicited a smaller N400 compared to targets in pairs with unrelated English translations. In contrast, a reversed N400 effect was observed for deaf signers: target signs in English rime translation pairs elicited a larger N400 compared to targets in pairs with unrelated English translations. This reversed effect was overtaken by a later, more typical ERP priming effect for deaf signers who were aware of the manipulation. These findings provide evidence that implicit language co-activation in bimodal bilinguals is bidirectional. However, the distinct pattern of effects in deaf and hearing signers suggests that it may be modulated by differences in language proficiency and dominance as well as by asymmetric reliance on orthographic versus phonological representations.
|
27
|
Bilingualism and language similarity modify the neural mechanisms of selective attention. Sci Rep 2019; 9:8204. [PMID: 31160645 PMCID: PMC6547874 DOI: 10.1038/s41598-019-44782-3] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2019] [Accepted: 05/21/2019] [Indexed: 11/09/2022] Open
Abstract
Learning and using multiple languages places major demands on our neurocognitive system, which can impact the way the brain processes information. Here we investigated how early bilingualism influences the neural mechanisms of auditory selective attention, and whether this is further affected by the typological similarity between languages. We tested the neural encoding of continuous attended speech in early balanced bilinguals of typologically similar (Dutch-English) and dissimilar languages (Spanish-English) and compared them to results from English monolinguals we reported earlier. In a dichotic listening paradigm, participants attended to a narrative in their native language while ignoring different types of interference in the other ear. The results revealed that bilingualism modulates the neural mechanisms of selective attention even in the absence of consistent behavioural differences between monolinguals and bilinguals. They also suggested that typological similarity between languages helps fine-tune this modulation, reflecting life-long experiences with resolving competition between more or less similar candidates. The effects were consistent over the time-course of the narrative and suggest that learning a second language at an early age triggers neuroplastic adaptation of the attentional processing system.
|
28
|
Testing for Nonselective Bilingual Lexical Access Using L1 Attrited Bilinguals. Brain Sci 2019; 9:brainsci9060126. [PMID: 31159405 PMCID: PMC6628369 DOI: 10.3390/brainsci9060126] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/05/2019] [Revised: 05/12/2019] [Accepted: 05/27/2019] [Indexed: 11/26/2022] Open
Abstract
Research in the past few decades generally supported a nonselective view of bilingual lexical access, where a bilingual’s two languages are both active during monolingual processing. However, recent work by Costa et al. (2017) brought this into question by reinterpreting evidence for nonselectivity in a selective manner. We manipulated the factor of first language (L1) attrition in an event-related potential (ERP) experiment to disentangle Costa and colleagues’ selective processing proposal versus the traditional nonselective processing view of bilingual lexical access. Spanish–English bilinguals demonstrated an N400 effect of L1 attrition during implicit L1 processing in a second language (L2) semantic judgment task, indicating the contribution of variable L1 lexical access during L2 processing. These results are incompatible with Costa and colleagues’ selective model, adding to the literature supporting a nonselective view of bilingual lexical access.
|
29
|
Sensorimotor characteristics of sign translations modulate EEG when deaf signers read English. BRAIN AND LANGUAGE 2018; 187:9-17. [PMID: 30399489 DOI: 10.1016/j.bandl.2018.10.001] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/01/2018] [Revised: 10/19/2018] [Accepted: 10/19/2018] [Indexed: 06/08/2023]
Abstract
Bilingual individuals automatically translate written words from one language to another. While this process is well established in spoken-language bilinguals, less is known about its occurrence in deaf bilinguals who know signed and spoken languages. Since sign language uses motion and space to convey linguistic content, it is possible that action simulation in the brain's sensorimotor system plays a role in this process. We recorded EEG from deaf participants fluent in ASL as they read individual English words and found significant differences in alpha and beta EEG at central electrode sites during the reading of English words whose ASL translations use two hands, compared to English words whose ASL translations use one hand. Hearing non-signers did not show any differences between conditions. These results demonstrate the involvement of the sensorimotor system in cross-linguistic, cross-modal translation and suggest that covert action simulation processes are involved when deaf signers read.
|
30
|
Electrophysiological dynamics of Chinese phonology during visual word recognition in Chinese-English bilinguals. Sci Rep 2018; 8:6869. [PMID: 29720729 PMCID: PMC5931991 DOI: 10.1038/s41598-018-25072-w] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2017] [Accepted: 03/23/2018] [Indexed: 11/24/2022] Open
Abstract
Silent word reading leads to the activation of orthographic (spelling), semantic (meaning), and phonological (sound) information. For bilinguals, native language information can also be activated automatically when they read words in their second language. For example, when Chinese-English bilinguals read words in their second language (English), the phonology of the Chinese translations is automatically activated. Chinese phonology, however, consists of segmental (consonant and vowel) and tonal information. To what extent these two aspects of Chinese phonology are activated remains unclear. Here, we used behavioural measures, event-related potentials, and oscillatory EEG to investigate Chinese segmental and tonal activation during word recognition. Evidence of Chinese segmental activation was found when bilinguals read English words (faster responses, reduced N400, gamma-band power reduction) and when they read Chinese words (increased LPC, gamma-band power reduction). In contrast, evidence for Chinese tonal activation was found only when bilinguals read Chinese words (gamma-band power increase). Together, our converging behavioural and electrophysiological evidence indicates that Chinese segmental information is activated during English word reading, whereas both segmental and tonal information are activated during Chinese word reading. Importantly, gamma-band oscillations are modulated differently by tonal and segmental activation, suggesting independent processing of Chinese tones and segments.
|
31
|
Phonological and semantic priming in American Sign Language: N300 and N400 effects. LANGUAGE, COGNITION AND NEUROSCIENCE 2018; 33:1092-1106. [PMID: 30662923 PMCID: PMC6335044 DOI: 10.1080/23273798.2018.1446543] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/09/2017] [Accepted: 02/20/2018] [Indexed: 05/29/2023]
Abstract
This study investigated the electrophysiological signatures of phonological and semantic priming in American Sign Language (ASL). Deaf signers made semantic relatedness judgments to pairs of ASL signs separated by a 1300 ms prime-target SOA. Phonologically related sign pairs shared two of three phonological parameters (handshape, location, and movement). Target signs preceded by phonologically related and semantically related prime signs elicited smaller negativities within the N300 and N400 windows than those preceded by unrelated primes. N300 effects, typically reported in studies of picture processing, are interpreted to reflect the mapping from the visual features of the signs to more abstract linguistic representations. N400 effects, consistent with rhyme priming effects in the spoken language literature, are taken to index lexico-semantic processes that appear to be largely modality independent. Together, these results highlight both the unique visual-manual nature of sign languages and the linguistic processing characteristics they share with spoken languages.
|