1. Meng Y, Liang C, Chen W, Liu Z, Yang C, Hu J, Gao Z, Gao S. Neural basis of language familiarity effects on voice recognition: An fNIRS study. Cortex 2024; 176:1-10. [PMID: 38723449] [DOI: 10.1016/j.cortex.2024.04.007]
Abstract
Recognizing talkers' identity via speech is an important social skill in interpersonal interaction. Behavioral evidence has shown that listeners identify voices speaking their native language better than voices speaking a non-native language, a phenomenon known as the language familiarity effect (LFE). However, its underlying neural mechanisms remain unclear. This study therefore investigated how the LFE arises at the neural level using functional near-infrared spectroscopy (fNIRS). Late unbalanced bilinguals first learned to associate strangers' voices with their identities and were then tested on recognizing the talkers' identities from their voices speaking a language that was highly familiar (the native language, Chinese), moderately familiar (the second language, English), or completely unfamiliar (Ewe) to participants. Participants identified talkers most accurately in Chinese and least accurately in Ewe. Talker identification was quicker in Chinese than in English and Ewe, but reaction time did not differ between the two non-native languages. At the neural level, recognizing voices speaking Chinese relative to English or Ewe produced less activity in the inferior frontal gyrus, precentral/postcentral gyrus, supramarginal gyrus, and superior temporal sulcus/gyrus, while no difference was found between English and Ewe, indicating that automatic phonological encoding in the native language facilitates voice identification. These findings shed new light on the interrelations between language ability and voice recognition, revealing that the brain activation pattern of the LFE depends on the automaticity of language processing.
Affiliation(s)
- Yuan Meng: School of Foreign Languages, University of Electronic Science and Technology of China, Chengdu, China
- Chunyan Liang: School of Foreign Languages, University of Electronic Science and Technology of China, Chengdu, China; Zhuojin Branch of Yandaojie Primary School, Chengdu, China
- Wenjing Chen: School of Foreign Languages, University of Electronic Science and Technology of China, Chengdu, China
- Zhaoning Liu: School of Foreign Languages, University of Electronic Science and Technology of China, Chengdu, China
- Chaoqing Yang: School of Foreign Languages, University of Electronic Science and Technology of China, Chengdu, China
- Jiehui Hu: School of Foreign Languages, University of Electronic Science and Technology of China, Chengdu, China; The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for NeuroInformation, University of Electronic Science and Technology of China, Chengdu, China
- Zhao Gao: School of Foreign Languages, University of Electronic Science and Technology of China, Chengdu, China; The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for NeuroInformation, University of Electronic Science and Technology of China, Chengdu, China
- Shan Gao: School of Foreign Languages, University of Electronic Science and Technology of China, Chengdu, China; The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for NeuroInformation, University of Electronic Science and Technology of China, Chengdu, China
2. Landsiedel J, Koldewyn K. Auditory dyadic interactions through the "eye" of the social brain: How visual is the posterior STS interaction region? Imaging Neuroscience 2023; 1:1-20. [PMID: 37719835] [PMCID: PMC10503480] [DOI: 10.1162/imag_a_00003]
Abstract
Human interactions contain potent social cues that meet not only the eye but also the ear. Although research has identified a region in the posterior superior temporal sulcus as being particularly sensitive to visually presented social interactions (SI-pSTS), its response to auditory interactions has not been tested. Here, we used fMRI to explore brain response to auditory interactions, with a focus on temporal regions known to be important in auditory processing and social interaction perception. In Experiment 1, monolingual participants listened to two-speaker conversations (intact or sentence-scrambled) and one-speaker narrations in both a known and an unknown language. Speaker number and conversational coherence were explored in separately localised regions of interest (ROIs). In Experiment 2, bilingual participants were scanned to explore the role of language comprehension. Combining univariate and multivariate analyses, we found initial evidence for a heteromodal response to social interactions in SI-pSTS. Specifically, right SI-pSTS preferred auditory interactions over control stimuli and represented information about both speaker number and interactive coherence. Bilateral temporal voice areas (TVA) showed a similar, but less specific, profile. Exploratory analyses identified another auditory-interaction-sensitive area in anterior STS. Indeed, direct comparison suggests modality-specific tuning, with SI-pSTS preferring visual and aSTS preferring auditory information. Altogether, these results suggest that right SI-pSTS is a heteromodal region that represents information about social interactions in both visual and auditory domains. Future work is needed to clarify the roles of TVA and aSTS in auditory interaction perception and to further probe right SI-pSTS interaction-selectivity using non-semantic prosodic cues.
Affiliation(s)
- Julia Landsiedel: Department of Psychology, School of Human and Behavioural Sciences, Bangor University, Bangor, United Kingdom
- Kami Koldewyn: Department of Psychology, School of Human and Behavioural Sciences, Bangor University, Bangor, United Kingdom
3. Tarchi L, Damiani S, Fantoni T, Pisano T, Castellini G, Politi P, Ricca V. Centrality and interhemispheric coordination are related to different clinical/behavioral factors in attention deficit/hyperactivity disorder: a resting-state fMRI study. Brain Imaging Behav 2022; 16:2526-2542. [PMID: 35859076] [PMCID: PMC9712307] [DOI: 10.1007/s11682-022-00708-8]
Abstract
Eigenvector centrality (EC) has shown promising results in the field of psychiatry, with early findings also pertaining to ADHD. Parallel efforts have described aberrant interhemispheric coordination in ADHD, as measured by voxel-mirrored homotopic connectivity (VMHC), with early evidence of altered resting-state fMRI. A sample was collected from the ADHD200-NYU initiative: 86 neurotypical participants and 89 participants with ADHD, aged 7 to 18 years, were included after quality control for motion. After preprocessing, voxel-wise EC and VMHC values were compared between diagnostic groups, and network-level values were extracted from 15 functional networks. Age, ADHD severity (Conners' Parent Rating Scale), IQ (Wechsler Abbreviated Scale), and right-hand dominance were correlated with EC/VMHC values in the whole sample and within groups, both at the voxel-wise and network level. Motion was controlled by censoring time-points with framewise displacement > 0.5 mm, as well as by controlling for group differences in mean framewise displacement values. EC was significantly higher in ADHD than in neurotypicals in the left inferior frontal lobe, lingual gyri, pericalcarine cortex, superior and middle occipital lobes, right inferior occipital lobe, right middle temporal gyrus, fusiform gyri, bilateral cuneus, right precuneus, and cerebellum (FDR-corrected p = 0.05). No differences were observed between groups in voxel-wise VMHC. EC was positively correlated with ADHD severity scores at the network level (at p < 0.01; Inattentive: cerebellum rho = 0.273; Hyperactive/Impulsive: high-visual network rho = 0.242, cerebellum rho = 0.273; Global Severity Index: high-visual network rho = 0.241, cerebellum rho = 0.293). No differences were observed between groups for motion (p = 0.443). While EC was more related to ADHD psychopathology, VMHC was consistently and negatively correlated with age across all networks.
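The EC measure discussed above can be illustrated with a minimal power-iteration sketch on a toy connectivity matrix. This is not the authors' pipeline; the function name, toy matrix, and parameters are invented for illustration, assuming only the standard definition of eigenvector centrality (each node's score is proportional to the sum of its neighbours' scores, i.e., the leading eigenvector of the connectivity matrix):

```python
import numpy as np

def eigenvector_centrality(conn, n_iter=1000, tol=1e-10):
    """Eigenvector centrality of a symmetric, non-negative
    connectivity matrix via power iteration: repeatedly multiply a
    positive start vector by the matrix and renormalise, converging
    to the leading (Perron) eigenvector."""
    n = conn.shape[0]
    v = np.ones(n) / np.sqrt(n)  # positive start vector
    for _ in range(n_iter):
        v_new = conn @ v
        v_new /= np.linalg.norm(v_new)  # keep unit length
        if np.linalg.norm(v_new - v) < tol:  # converged
            return v_new
        v = v_new
    return v

# Toy 4-node "connectivity" matrix (weights invented): node 0 is the
# most strongly connected hub, so it receives the highest centrality.
conn = np.array([
    [0.0, 0.9, 0.8, 0.7],
    [0.9, 0.0, 0.2, 0.1],
    [0.8, 0.2, 0.0, 0.1],
    [0.7, 0.1, 0.1, 0.0],
])
ec = eigenvector_centrality(conn)
print(ec.argmax())  # node 0 is the most central
```

In voxel-wise EC analyses the same idea is applied to a (much larger) voxel-by-voxel functional connectivity matrix, so a voxel scores highly when it is strongly connected to other highly connected voxels.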
Affiliation(s)
- Livio Tarchi: Psychiatry Unit, Department of Health Sciences, University of Florence, Florence, FI, Italy
- Stefano Damiani: Department of Brain and Behavioral Science, University of Pavia, 27100 Pavia, Italy
- Teresa Fantoni: Pediatric Neurology, Neurogenetics and Neurobiology Unit and Laboratories, Neuroscience Department, Meyer Children's Hospital, University of Florence, Florence, Italy
- Tiziana Pisano: Pediatric Neurology, Neurogenetics and Neurobiology Unit and Laboratories, Neuroscience Department, Meyer Children's Hospital, University of Florence, Florence, Italy
- Giovanni Castellini: Psychiatry Unit, Department of Health Sciences, University of Florence, Florence, FI, Italy
- Pierluigi Politi: Department of Brain and Behavioral Science, University of Pavia, 27100 Pavia, Italy
- Valdo Ricca: Psychiatry Unit, Department of Health Sciences, University of Florence, Florence, FI, Italy
4. Garcia A, Cohen RA, Porges EC, Williamson JB, Woods AJ. Functional connectivity of brain networks during semantic processing in older adults. Front Aging Neurosci 2022; 14:814882. [PMID: 36337702] [PMCID: PMC9627037] [DOI: 10.3389/fnagi.2022.814882]
Abstract
The neural systems underlying semantic processing have been characterized with functional neuroimaging in young adults. Whether the integrity of these systems degrades with advanced age remains unresolved. The current study examined functional connectivity during abstract and concrete word processing. Thirty-eight adults, aged 55–91, engaged in semantic association decision tasks during a mixed event-related block functional magnetic resonance imaging (fMRI) paradigm. During the semantic trials, the task required participants to judge whether word pairs were semantically associated. During the rhyme trials, the task required participants to determine whether non-word pairs rhymed. Seeds were placed in putative semantic hubs of the left anterior middle temporal gyrus (aMTG) and the angular gyrus (AG), and also in the left inferior frontal gyrus (IFG), an area considered important for semantic control. Greater connectivity between aMTG, AG, and IFG and multiple cortical areas occurred during semantic processing. Connectivity from the three seeds differed during semantic processing: the left AG and aMTG were strongly connected with frontal, parietal, and occipital areas bilaterally, whereas the IFG was most strongly connected with other frontal cortical areas and the AG in the ipsilateral left hemisphere. Notably, the strength and extent of connectivity differed for abstract and concrete semantic processing; connectivity from the left aMTG and AG to bilateral cortical areas was greater during abstract processing, whereas IFG connectivity with left cortical areas was greater during concrete processing. With advanced age, greater connectivity occurred only between the left AG and supramarginal gyrus during the processing of concrete word-pairs, but not abstract word-pairs.
Among older adults, robust functional connectivity of the aMTG, AG, and IFG to widely distributed bilateral cortical areas occurs during abstract and concrete semantic processing in a manner consistent with reports from past studies of young adults. There was not a significant degradation of functional connectivity during semantic processing between the ages of 55 and 85 years. As the study focused on semantic functioning in older adults, a comparison group of young adults was not included, limiting generalizability. Future longitudinal neuroimaging studies that compare functional connectivity of young and older adults under different semantic demands will be valuable.
5. Li Z, Hong B, Wang D, Nolte G, Engel AK, Zhang D. Speaker-listener neural coupling reveals a right-lateralized mechanism for non-native speech-in-noise comprehension. Cereb Cortex 2022; 33:3701-3714. [PMID: 35975617] [DOI: 10.1093/cercor/bhac302]
Abstract
While the increasingly globalized world has brought growing demand for non-native language communication, the prevalence of background noise in everyday life poses a great challenge to non-native speech comprehension. The present study employed an interbrain approach based on functional near-infrared spectroscopy (fNIRS) to explore how people adapt to comprehend non-native speech information in noise. A group of Korean participants who acquired Chinese as their non-native language was invited to listen to Chinese narratives at 4 noise levels (no noise, 2 dB, -6 dB, and -9 dB). These narratives were real-life stories spoken by native Chinese speakers. Processing of the non-native speech was associated with significant fNIRS-based listener-speaker neural couplings, mainly over the right hemisphere at both the listener's and the speaker's sides. More importantly, the neural couplings from the listener's right superior temporal gyrus, right middle temporal gyrus, and right postcentral gyrus were found to be positively correlated with individual comprehension performance at the strongest noise level (-9 dB). These results provide interbrain evidence in support of a right-lateralized mechanism for non-native speech processing and suggest that both an auditory-based and a sensorimotor-based mechanism contributed to non-native speech-in-noise comprehension.
Affiliation(s)
- Zhuoran Li: Department of Psychology, School of Social Sciences, Tsinghua University, Beijing 100084, China; Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing 100084, China
- Bo Hong: Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing 100084, China; Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing 100084, China
- Daifa Wang: School of Biological Science and Medical Engineering, Beihang University, Beijing 100083, China
- Guido Nolte: Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany
- Andreas K Engel: Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, 20246 Hamburg, Germany
- Dan Zhang: Department of Psychology, School of Social Sciences, Tsinghua University, Beijing 100084, China; Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing 100084, China