1. Ren H, Li YZ, Bi HY, Yang Y. The shared neurobiological basis of developmental dyslexia and developmental stuttering: A meta-analysis of functional and structural MRI studies. Int J Clin Health Psychol 2024; 24:100519. [PMID: 39582485] [PMCID: PMC11585698] [DOI: 10.1016/j.ijchp.2024.100519]
Abstract
Background: Developmental dyslexia (DD) and persistent developmental stuttering (PDS) are the most representative written and spoken language disorders, respectively, and both significantly hinder life success. Although widespread brain alterations are evident in both DD and PDS, it remains unclear to what extent these two language disorders share common neural substrates.
Methods: A systematic review and meta-analysis of task-based functional magnetic resonance imaging (fMRI) and voxel-based morphometry (VBM) studies of PDS and DD were conducted to explore the shared functional and anatomical alterations across these disorders.
Results: The fMRI studies indicated shared hypoactivation in the left inferior temporal gyrus and inferior parietal gyrus across PDS and DD compared to healthy controls. When children and adults were examined separately, child participants exhibited reduced activation in the left inferior temporal gyrus, inferior parietal gyrus, precentral gyrus, middle temporal gyrus, and inferior frontal gyrus, possibly reflecting universal causes of written and spoken language disorders. In contrast, adult participants exhibited hyperactivation in the right precentral gyrus and left cingulate motor cortex, possibly reflecting common compensatory mechanisms. Anatomically, the analysis of VBM studies revealed decreased gray matter volume in the left inferior frontal gyrus across DD and PDS, observed exclusively in children. Finally, meta-analytic connectivity modeling and brain-behavior correlation analyses were conducted to explore the functional connectivity patterns and related cognitive functions of the brain regions commonly involved in DD and PDS.
Conclusions: This study identified concordant brain abnormalities across DD and PDS, suggesting common neural substrates for written and spoken language disorders and providing new insights into the transdiagnostic neural signatures of language disorders.
Affiliation(s)
- Huan Ren
- Key Laboratory of Behavioral Science, Center for Brain Science and Learning Difficulties, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- Yi zhen Li
- Key Laboratory of Behavioral Science, Center for Brain Science and Learning Difficulties, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- Hong-Yan Bi
- Key Laboratory of Behavioral Science, Center for Brain Science and Learning Difficulties, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- Yang Yang
- Key Laboratory of Behavioral Science, Center for Brain Science and Learning Difficulties, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- Center for Language and Brain, Shenzhen Institute of Neuroscience, Shenzhen 518057, China
2. Lazaro MJ, Lee J, Chun J, Yun MH, Kim S. Multimodal interaction: Input-output modality combinations for identification tasks in augmented reality. Appl Ergon 2022; 105:103842. [PMID: 35868052] [DOI: 10.1016/j.apergo.2022.103842]
Abstract
Multimodal interaction (MMI) is being widely implemented, especially in new technologies such as augmented reality (AR) systems, since it is presumed to support a more natural, efficient, and flexible form of interaction. However, little research has investigated the proper application of MMI in AR. More specifically, the effects of combining different input and output modalities during MMI in AR are still not fully understood. Therefore, this study aims to examine the independent and combined effects of different input and output modalities during a typical AR task. Twenty young adults participated in a controlled experiment in which they were asked to perform a simple identification task using an AR device under different input (speech, gesture, multimodal) and output (VV-VA, VV-NA, NV-VA, NV-NA) conditions. Results showed that input and output modalities differed in their influence on task performance, workload, perceived appropriateness, and user preference. Interaction effects between the input and output conditions on the performance metrics were also evident, suggesting that although multimodal input is generally preferred by users, it should be implemented with caution, since its effectiveness is highly influenced by the processing code of the system output. This study, the first of its kind, reveals several new implications for the application of MMI in AR systems.
Affiliation(s)
- May Jorella Lazaro
- Interdisciplinary Program in Cognitive Science, Seoul National University, Seoul, South Korea
- Jaeyong Lee
- Samsung Electronics Co, Ltd, Seoul, South Korea
- Jaemin Chun
- Samsung Electronics Co, Ltd, Seoul, South Korea
- Myung Hwan Yun
- Department of Industrial Engineering & Institute for Industrial System Innovation, Seoul National University, Seoul, South Korea.
- Sungho Kim
- Department of Systems Engineering, Republic of Korea Air Force Academy, Cheongju, South Korea.
3. Rann JC, Almor A. Effects of verbal tasks on driving simulator performance. Cogn Res Princ Implic 2022; 7:12. [PMID: 35119569] [PMCID: PMC8817015] [DOI: 10.1186/s41235-022-00357-x]
Abstract
We report results from a driving simulator paradigm we developed to test the fine temporal effects of verbal tasks on simultaneous tracking performance. A total of 74 undergraduate students participated in two experiments in which they used the steering wheel to control a cursor tracking a moving target; the dependent measure was overall deviation from the target. Experiment 1 tested tracking performance at slow and fast target speeds under conditions involving no verbal input or output, passive listening to spoken prompts via headphones, or responding to spoken prompts. Experiment 2 was similar, except that participants read written prompts overlaid on the simulator screen instead of listening to spoken prompts. Performance in both experiments was worse at fast speeds and worst overall in the responding conditions. Most significantly, fine-scale time-course analysis revealed deteriorating tracking performance as participants prepared and began speaking, and steadily improving performance while they spoke. Additionally, post-block survey data revealed that conversation recall was best in the responding conditions and that perceived difficulty increased with task complexity. Our study is the first to track temporal changes in interference at high resolution during the first hundreds of milliseconds of verbal production and comprehension. Our results are consistent with load-based theories of multitasking performance and show that language production and, to a lesser extent, language comprehension tap resources also used for tracking. More generally, our paradigm provides a useful tool for measuring dynamic changes in tracking performance during verbal tasks caused by the rapidly changing resource requirements of language production and comprehension.
Affiliation(s)
- Jonathan C Rann
- Department of Psychology, University of South Carolina, 1512 Pendelton Street, Columbia, SC, 29208, USA
- Institute for Mind and Brain, University of South Carolina, Columbia, SC, 29208, USA
- Amit Almor
- Department of Psychology, University of South Carolina, 1512 Pendelton Street, Columbia, SC, 29208, USA
- Institute for Mind and Brain, University of South Carolina, Columbia, SC, 29208, USA
- Linguistics Program, University of South Carolina, Columbia, SC, 29208, USA
4. Schaeffner S, Koch I, Philipp AM. Sensory-motor modality compatibility in multitasking: The influence of processing codes. Acta Psychol (Amst) 2018; 191:210-218. [PMID: 30312892] [DOI: 10.1016/j.actpsy.2018.09.012]
Abstract
Sensory-motor modality compatibility is defined as the similarity between the sensory modality and the modality of response-related effects. Previous dual-task and task-switching studies have shown higher performance costs for coordinating relatively incompatible sensory-motor modality mappings (i.e., auditory-manual and visual-vocal) than for more compatible mappings (i.e., auditory-vocal and visual-manual). Until now, however, little attention has been paid to how the effects of modality compatibility may vary with different processing codes. In the present study, we independently varied the processing codes of input and output (nonverbal-spatial, nonverbal-nominal, verbal-spatial, verbal-nominal) while participants switched between incompatible and compatible sensory-motor modality mappings. Besides higher switch costs for switching between incompatible sensory-motor modality mappings than between compatible mappings, the results revealed stronger effects of modality compatibility on switch costs for verbal than for nonverbal input codes. This suggests that priming mechanisms between sensory input and compatible motor output are modulated by the processing code of the sensory input. As possible explanations, we assume a higher degree of concordance with output processing codes, as well as stronger associations with potential response effects, for verbal than for nonverbal input.
Affiliation(s)
- Iring Koch
- RWTH Aachen University, Institute of Psychology, Aachen, Germany
| | - Andrea M Philipp
- RWTH Aachen University, Institute of Psychology, Aachen, Germany
5. Dual-task automatization: The key role of sensory–motor modality compatibility. Atten Percept Psychophys 2017; 80:752-772. [DOI: 10.3758/s13414-017-1469-4]
6. Fintor E, Stephan DN, Koch I. Emerging features of modality mappings in task switching: modality compatibility requires variability at the level of both stimulus and response modality. Psychol Res 2017; 82:121-133. [PMID: 28578525] [DOI: 10.1007/s00426-017-0875-5]
Abstract
The term modality compatibility refers to the similarity between the stimulus modality and the modality of response-related sensory consequences. Previous research showed evidence of modality-compatibility benefits in task switching when participants switch either between two modality-compatible tasks (auditory-vocal and visual-manual) or between two modality-incompatible tasks (auditory-manual and visual-vocal). However, it remained unclear whether there is also a modality-compatibility benefit when participants switch between a modality-compatible and an incompatible task. To this end, in Experiment 1 we kept the same design as in earlier studies, so participants switched either between modality-compatible or between modality-incompatible spatial discrimination tasks; in Experiment 2A, participants switched at the response level (manual/vocal) while the stimulus modality was kept constant across tasks; and in Experiment 2B, they switched at the stimulus level (visual/auditory) while the response modality was kept constant across tasks. We found increased switch costs for modality-incompatible tasks in Experiment 1, but no such difference between modality-compatible and incompatible tasks in Experiments 2A and 2B. This supports the idea that modality-incompatible tasks increase crosstalk due to response-based priming of the competing task, but that this crosstalk is reduced if the competing task involves either the same stimulus modality or the same response modality. We conclude that a significant impact of modality compatibility in task switching requires variability at the level of both stimulus and response modality.
Affiliation(s)
- Edina Fintor
- Institute of Psychology, RWTH Aachen University, Jägerstraße 17-19, 52066, Aachen, Germany.
- Denise N Stephan
- Institute of Psychology, RWTH Aachen University, Jägerstraße 17-19, 52066, Aachen, Germany
- Iring Koch
- Institute of Psychology, RWTH Aachen University, Jägerstraße 17-19, 52066, Aachen, Germany
7. Schaeffner S, Koch I, Philipp AM. The role of learning in sensory-motor modality switching. Psychol Res 2017; 82:955-969. [PMID: 28540479] [DOI: 10.1007/s00426-017-0872-8]
Abstract
Previous research has indicated that modality switching is considerably affected by modality compatibility: switch costs are higher for switching between relatively incompatible sensory-motor modality mappings (i.e., auditory-manual and visual-vocal) than for switching between compatible mappings (i.e., auditory-vocal and visual-manual). So far, however, it has been unclear whether these findings are influenced by learning processes resulting from very small stimulus sets and large numbers of stimulus repetitions. In the present study, we investigated the role of learning concept-to-category associations (Experiment 1) as well as the influence of learning concept-to-modality mappings (Experiment 2) on sensory-motor modality switching in semantic categorization. The results of both experiments revealed shorter overall reaction times due to learning. Additionally, learning of concept-to-category associations (Experiment 1) led to a significant reduction of modality switch costs. Interestingly, however, modality-compatibility effects were not significantly influenced by either learning of concept-to-category associations or learning of concept-to-modality mappings. Thus, the present study provides the first evidence that learning at the semantic level influences modality switching, although it does not significantly affect modality compatibility.
Affiliation(s)
- Simone Schaeffner
- RWTH Aachen University, Institute of Psychology, Jägerstrasse 17-19, 52056, Aachen, Germany.
- Iring Koch
- RWTH Aachen University, Institute of Psychology, Jägerstrasse 17-19, 52056, Aachen, Germany
- Andrea M Philipp
- RWTH Aachen University, Institute of Psychology, Jägerstrasse 17-19, 52056, Aachen, Germany
8. Földes N, Philipp AM, Badets A, Koch I. Exploring modality compatibility in the response-effect compatibility paradigm. Adv Cogn Psychol 2017; 13:97-104. [PMID: 28450976] [PMCID: PMC5404091] [DOI: 10.5709/acp-0210-1]
Abstract
According to ideomotor theory, action planning is based on anticipatory perceptual representations of action effects. This aspect of action control has been investigated with the response-effect compatibility (REC) paradigm, in which responses are facilitated when their ensuing perceptual effects share codes with the response through dimensional overlap (i.e., REC). Additionally, according to the notion of ideomotor compatibility, certain response-effect (R-E) mappings will be stronger than others because some response features resemble the anticipated sensory response effects more strongly than others (e.g., since vocal responses usually produce auditory effects, an auditory stimulus should be anticipated more strongly following vocal responses than following manual responses). Yet systematic research on this matter is lacking. In the present study, two REC experiments explored the influence of R-E modality mappings. In Experiment 1, vocal number-word responses produced visual effects on the screen (digits vs. number words; i.e., visual-symbolic vs. visual-verbal effect codes). The REC effect was only marginally larger for visual-verbal than for visual-symbolic effects. Using verbal effect codes in Experiment 2, we found that the REC effect was larger with an auditory-verbal R-E mapping than with a visual-verbal R-E mapping. Overall, the findings support the hypothesis that R-E modality mappings play a role in REC effects, providing further evidence for ideomotor accounts as well as for code-specific and modality-specific contributions to effect anticipation.
Affiliation(s)
- Arnaud Badets
- CNRS, Institut de Neurosciences Cognitives et Intégratives d’Aquitaine (UMR 5287), Université de Bordeaux, France
9. Schaeffner S, Koch I, Philipp AM. Semantic effects on sensory-motor modality switching. J Cogn Psychol 2016. [DOI: 10.1080/20445911.2016.1181636]