1
Parekh P, Fan CC, Frei O, Palmer CE, Smith DM, Makowski C, Iversen JR, Pecheva D, Holland D, Loughnan R, Nedelec P, Thompson WK, Hagler DJ, Andreassen OA, Jernigan TL, Nichols TE, Dale AM. FEMA: Fast and efficient mixed-effects algorithm for large sample whole-brain imaging data. Hum Brain Mapp 2024; 45:e26579. PMID: 38339910; PMCID: PMC10823765; DOI: 10.1002/hbm.26579.
Abstract
The linear mixed-effects model (LME) is a versatile approach to account for dependence among observations. Many large-scale neuroimaging datasets with complex designs have increased the need for LME; however, LME has seldom been used in whole-brain imaging analyses due to its heavy computational requirements. In this paper, we introduce a fast and efficient mixed-effects algorithm (FEMA) that makes whole-brain vertex-wise, voxel-wise, and connectome-wide LME analyses in large samples possible. We validate FEMA with extensive simulations, showing that the estimates of the fixed effects are equivalent to standard maximum likelihood estimates but obtained with orders of magnitude improvement in computational speed. We demonstrate the applicability of FEMA by studying the cross-sectional and longitudinal effects of age on region-of-interest level and vertex-wise cortical thickness, as well as connectome-wide functional connectivity values derived from resting-state functional MRI, using longitudinal imaging data from the Adolescent Brain Cognitive Development (ABCD) Study release 4.0. Our analyses reveal distinct spatial patterns for the annualized changes in vertex-wise cortical thickness and connectome-wide connectivity values in early adolescence, highlighting a critical time of brain maturation. The simulations and application to real data show that FEMA enables advanced investigation of the relationships between large numbers of neuroimaging metrics and variables of interest while considering complex study designs, including repeated measures and family structures, in a fast and efficient manner. The source code for FEMA is available via https://github.com/cmig-research-group/cmig_tools/.
Affiliation(s)
- Pravesh Parekh
- NORMENT, Division of Mental Health and Addiction, Oslo University Hospital & Institute of Clinical Medicine, University of Oslo, Oslo, Norway
- Chun Chieh Fan
- Center for Population Neuroscience and Genetics, Laureate Institute for Brain Research, Tulsa, Oklahoma, USA
- Department of Radiology, School of Medicine, University of California San Diego, La Jolla, California, USA
- Oleksandr Frei
- NORMENT, Division of Mental Health and Addiction, Oslo University Hospital & Institute of Clinical Medicine, University of Oslo, Oslo, Norway
- Centre for Bioinformatics, Department of Informatics, University of Oslo, Oslo, Norway
- Clare E. Palmer
- Center for Human Development, University of California San Diego, La Jolla, California, USA
- Diana M. Smith
- Center for Human Development, University of California San Diego, La Jolla, California, USA
- Center for Multimodal Imaging and Genetics, University of California San Diego, La Jolla, California, USA
- Neurosciences Graduate Program, University of California San Diego, La Jolla, California, USA
- Medical Scientist Training Program, University of California San Diego, La Jolla, California, USA
- Carolina Makowski
- Department of Radiology, School of Medicine, University of California San Diego, La Jolla, California, USA
- Center for Multimodal Imaging and Genetics, University of California San Diego, La Jolla, California, USA
- John R. Iversen
- Center for Human Development, University of California San Diego, La Jolla, California, USA
- Institute for Neural Computation, University of California San Diego, La Jolla, California, USA
- The Swartz Center for Computational Neuroscience, University of California San Diego, La Jolla, California, USA
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada
- Diliana Pecheva
- Department of Radiology, School of Medicine, University of California San Diego, La Jolla, California, USA
- Center for Multimodal Imaging and Genetics, University of California San Diego, La Jolla, California, USA
- Dominic Holland
- Department of Radiology, School of Medicine, University of California San Diego, La Jolla, California, USA
- Robert Loughnan
- Population Neuroscience and Genetics Lab, University of California San Diego, La Jolla, California, USA
- Pierre Nedelec
- Department of Radiology and Biomedical Imaging, University of California San Francisco, San Francisco, California, USA
- Wesley K. Thompson
- Center for Population Neuroscience and Genetics, Laureate Institute for Brain Research, Tulsa, Oklahoma, USA
- Donald J. Hagler
- Department of Radiology, School of Medicine, University of California San Diego, La Jolla, California, USA
- Center for Multimodal Imaging and Genetics, University of California San Diego, La Jolla, California, USA
- Ole A. Andreassen
- NORMENT, Division of Mental Health and Addiction, Oslo University Hospital & Institute of Clinical Medicine, University of Oslo, Oslo, Norway
- Terry L. Jernigan
- Department of Radiology, School of Medicine, University of California San Diego, La Jolla, California, USA
- Center for Human Development, University of California San Diego, La Jolla, California, USA
- Department of Cognitive Science, University of California San Diego, La Jolla, California, USA
- Department of Psychiatry, University of California San Diego, La Jolla, California, USA
- Thomas E. Nichols
- Big Data Institute, Li Ka Shing Centre for Health Information and Discovery, Nuffield Department of Population Health, University of Oxford, Oxford, UK
- Wellcome Centre for Integrative Neuroimaging, FMRIB, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Anders M. Dale
- Department of Radiology, School of Medicine, University of California San Diego, La Jolla, California, USA
- Center for Multimodal Imaging and Genetics, University of California San Diego, La Jolla, California, USA
- Department of Cognitive Science, University of California San Diego, La Jolla, California, USA
- Department of Psychiatry, University of California San Diego, La Jolla, California, USA
- Department of Neuroscience, University of California San Diego, La Jolla, California, USA
2
Persici V, Blain SD, Iversen JR, Key AP, Kotz SA, Devin McAuley J, Gordon RL. Individual differences in neural markers of beat processing relate to spoken grammar skills in six-year-old children. Brain Lang 2023; 246:105345. PMID: 37994830; DOI: 10.1016/j.bandl.2023.105345.
Abstract
Based on the idea that neural entrainment establishes regular attentional fluctuations that facilitate hierarchical processing in both music and language, we hypothesized that individual differences in syntactic (grammatical) skills are partly explained by patterns of neural responses to musical rhythm. To test this hypothesis, we recorded neural activity using electroencephalography (EEG) while children (N = 25) listened passively to rhythmic patterns that induced different beat percepts. Analysis of evoked beta and gamma activity revealed that individual differences in the magnitude of neural responses to rhythm explained variance in six-year-olds' expressive grammar abilities, beyond, and complementary to, their performance on a behavioral rhythm perception task. These results reinforce the idea that mechanisms of neural beat entrainment may be a shared neural resource supporting hierarchical processing across music and language, and they suggest a relevant marker of the relationship between rhythm processing and grammar abilities in elementary-school-age children, previously observed only behaviorally.
Affiliation(s)
- Valentina Persici
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Psychology, University of Milano - Bicocca, Milan, Italy; Department of Otolaryngology - Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Human Sciences, University of Verona, Verona, Italy.
- Scott D Blain
- Department of Psychiatry, University of Michigan, Ann Arbor, MI, USA
- John R Iversen
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, Ontario, Canada; Institute for Neural Computation, University of California San Diego, La Jolla, CA, USA
- Alexandra P Key
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA
- Sonja A Kotz
- Department of Neuropsychology and Psychopharmacology, Maastricht University, Maastricht, the Netherlands; Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- J Devin McAuley
- Department of Psychology, Michigan State University, East Lansing, MI, USA
- Reyna L Gordon
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Otolaryngology - Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA
3
Gustavson DE, Nayak S, Coleman PL, Iversen JR, Lense MD, Gordon RL, Maes HH. Heritability of Childhood Music Engagement and Associations with Language and Executive Function: Insights from the Adolescent Brain Cognitive Development (ABCD) Study. Behav Genet 2023; 53:189-207. PMID: 36757558; PMCID: PMC10159991; DOI: 10.1007/s10519-023-10135-0.
Abstract
Music engagement is a powerful, influential experience that often begins early in life. Music engagement is moderately heritable in adults (~41-69%), but fewer studies have examined genetic influences on childhood music engagement, including their association with language and executive functions. Here we explored genetic and environmental influences on music listening and instrument playing (including singing) in the baseline assessment of the Adolescent Brain Cognitive Development study. Parents reported on their 9-10-year-old children's music experiences (N = 11,876 children; N = 1543 from twin pairs). Both music measures were explained primarily by shared environmental influences. Instrument exposure (but not frequency of instrument engagement) was associated with language skills (r = .27) and executive functions (r = .15-.17), and these associations with instrument engagement were stronger than those for music listening, visual art, or soccer engagement. These findings highlight the role of shared environmental influences between early music experiences, language, and executive function during a formative time in development.
Affiliation(s)
- Daniel E Gustavson
- Institute for Behavioral Genetics, University of Colorado Boulder, 1480 30th St, Boulder, CO, 80303, USA.
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA.
- Srishti Nayak
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA
- Department of Psychology, Middle Tennessee State University, Murfreesboro, TN, USA
- Department of Otolaryngology - Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, TN, USA
- John R Iversen
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, La Jolla, CA, USA
- Miriam D Lense
- Department of Otolaryngology - Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- The Curb Center, Vanderbilt University, Nashville, TN, USA
- Reyna L Gordon
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA
- Department of Otolaryngology - Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, TN, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- The Curb Center, Vanderbilt University, Nashville, TN, USA
- Hermine H Maes
- Department of Human and Molecular Genetics, Virginia Institute for Psychiatric and Behavioral Genetics, Virginia Commonwealth University, Richmond, VA, USA
- Department of Psychiatry, Virginia Institute for Psychiatric and Behavioral Genetics, Virginia Commonwealth University, Richmond, VA, USA
- Massey Cancer Center, Virginia Commonwealth University, Richmond, VA, USA
4
Palmer CE, Pecheva D, Iversen JR, Hagler DJ, Sugrue L, Nedelec P, Fan CC, Thompson WK, Jernigan TL, Dale AM. Corrigendum to "Microstructural development from 9 to 14 years: Evidence from the ABCD Study" [Dev. Cognit. Neurosci. 53 (2022) 101044]. Dev Cogn Neurosci 2022; 54:101063. PMID: 35034850; PMCID: PMC9019833; DOI: 10.1016/j.dcn.2022.101063.
Affiliation(s)
- Clare E Palmer
- Center for Human Development, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92161, USA.
- Diliana Pecheva
- Center for Multimodal Imaging and Genetics, University of California, San Diego School of Medicine, 9444 Medical Center Dr, La Jolla, CA 92037, USA; Department of Radiology, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA
- John R Iversen
- Institute for Neural Computation, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093, USA
- Donald J Hagler
- Center for Multimodal Imaging and Genetics, University of California, San Diego School of Medicine, 9444 Medical Center Dr, La Jolla, CA 92037, USA; Department of Radiology, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA
- Leo Sugrue
- Department of Radiology and Biomedical Imaging, University of California, San Francisco, 505 Parnassus Avenue, San Francisco, CA 94143, USA
- Pierre Nedelec
- Department of Radiology and Biomedical Imaging, University of California, San Francisco, 505 Parnassus Avenue, San Francisco, CA 94143, USA
- Chun Chieh Fan
- Center for Multimodal Imaging and Genetics, University of California, San Diego School of Medicine, 9444 Medical Center Dr, La Jolla, CA 92037, USA
- Wesley K Thompson
- Division of Biostatistics, Department of Family Medicine and Public Health, University of California, San Diego, La Jolla, CA, USA
- Terry L Jernigan
- Center for Human Development, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92161, USA; Department of Radiology, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA; Department of Cognitive Science, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92161, USA; Department of Psychiatry, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA
- Anders M Dale
- Center for Multimodal Imaging and Genetics, University of California, San Diego School of Medicine, 9444 Medical Center Dr, La Jolla, CA 92037, USA; Department of Radiology, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA; Department of Cognitive Science, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92161, USA; Department of Neuroscience, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA
5
Cheng THZ, Creel SC, Iversen JR. How Do You Feel the Rhythm: Dynamic Motor-Auditory Interactions Are Involved in the Imagination of Hierarchical Timing. J Neurosci 2022; 42:500-512. PMID: 34848500; PMCID: PMC8802922; DOI: 10.1523/jneurosci.1121-21.2021.
Abstract
Predicting and organizing patterns of events is important for humans to survive in a dynamically changing world. The motor system has been proposed to be actively, and necessarily, engaged in not only the production but also the perception of rhythm by organizing hierarchical timing that influences auditory responses. It is not yet well understood how the motor system interacts with the auditory system to perceive and maintain hierarchical structure in time. This study investigated the dynamic interaction between auditory and motor functional sources during the perception and imagination of musical meters. We pursued this using a novel method combining high-density EEG, EMG, and motion capture with independent component analysis to separate motor and auditory activity during meter imagery while robustly controlling against covert movement. We demonstrated that endogenous brain activity in both auditory and motor functional sources reflects the imagination of binary and ternary meters in the absence of corresponding acoustic cues or overt movement at the meter rate. We found clear evidence for hypothesized motor-to-auditory information flow at the beat rate in all conditions, suggesting a role for top-down influence of the motor system on auditory processing of beat-based rhythms, and reflecting an auditory-motor system with tight reciprocal informational coupling. These findings align with and further extend a set of motor hypotheses from beat perception to hierarchical meter imagination, adding supporting evidence for active engagement of the motor system in auditory processing, which may more broadly speak to the neural mechanisms of temporal processing in other human cognitive functions.
Significance Statement: Humans live in a world full of hierarchically structured temporal information, the accurate perception of which is essential for understanding speech and music. Music provides a window into the brain mechanisms of time perception, enabling us to examine how the brain groups musical beats into, for example, a march or waltz. Using a novel paradigm combining measurement of electrical brain activity with data-driven analysis, this study directly investigates motor-auditory connectivity during meter imagination. Findings highlight the importance of the motor system in the active imagination of meter. This study sheds new light on a fundamental form of perception by demonstrating how auditory-motor interaction may support hierarchical timing processing, which may have clinical implications for speech and motor rehabilitation.
Affiliation(s)
- Tzu-Han Zoe Cheng
- Department of Cognitive Science, University of California-San Diego, La Jolla, California 92093
- Institute for Neural Computation and Swartz Center for Computational Neuroscience, University of California-San Diego, La Jolla, California 92093
- Sarah C Creel
- Department of Cognitive Science, University of California-San Diego, La Jolla, California 92093
- John R Iversen
- Institute for Neural Computation and Swartz Center for Computational Neuroscience, University of California-San Diego, La Jolla, California 92093
6
Abstract
Brain systems supporting body movement are active during music listening in the absence of overt movement. This covert motor activity is not well understood, but some theories propose a role in auditory timing prediction facilitated by motor simulation. One question is how music-related covert motor activity relates to motor activity during overt movement. We address this question using scalp electroencephalography by measuring mu rhythms, cortical field phenomena associated with the somatomotor system that appear over sensorimotor cortex. Lateralized mu enhancement over hand sensorimotor cortex during/just before foot movement in foot versus hand movement paradigms is thought to reflect hand movement inhibition during current/prospective movement of another effector. The behavior of mu during music listening with movement suppressed has yet to be determined. We recorded 32-channel EEG (n = 17) during silence without movement, overt movement (foot/hand), and music listening without movement. Using an independent component analysis-based source equivalent dipole clustering technique, we identified three mu-related clusters, localized to left primary motor and right and midline premotor cortices. Right foot tapping was accompanied by mu enhancement in the left lateral source cluster, replicating previous work. Music listening was accompanied by similar mu enhancement in the left, as well as midline, clusters. We are the first, to our knowledge, to report, and also to source-resolve, music-related mu modulation in the absence of overt movements. Covert music-related motor activity has been shown to play a role in beat perception (Ross JM, Iversen JR, Balasubramaniam R. Neurocase 22: 558-565, 2016). Our current results show enhancement in somatotopically organized mu, supporting overt motor inhibition during beat perception.
New & Noteworthy: We are the first to report music-related mu enhancement in the absence of overt movements and the first to source-resolve mu activity during music listening. We suggest that music-related mu modulation reflects overt motor inhibition during passive music listening. This work is relevant for the development of theories relating to the involvement of covert motor system activity in predictive beat perception.
Affiliation(s)
- Jessica M. Ross
- Veterans Affairs Palo Alto Healthcare System, Sierra Pacific Mental Illness Research, Education, and Clinical Center (MIRECC), Palo Alto, California
- Department of Psychiatry and Behavioral Sciences, Stanford University Medical Center, Stanford, California
- Berenson-Allen Center for Noninvasive Brain Stimulation, Beth Israel Deaconess Medical Center, Boston, Massachusetts
- Department of Neurology, Harvard Medical School, Boston, Massachusetts
- Daniel C. Comstock
- Cognitive and Information Sciences, University of California, Merced, California
- John R. Iversen
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, California
- Scott Makeig
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, California
7
Palmer CE, Pecheva D, Iversen JR, Hagler DJ, Sugrue L, Nedelec P, Fan CC, Thompson WK, Jernigan TL, Dale AM. Microstructural development from 9 to 14 years: Evidence from the ABCD Study. Dev Cogn Neurosci 2021; 53:101044. PMID: 34896850; PMCID: PMC8671104; DOI: 10.1016/j.dcn.2021.101044.
Abstract
During late childhood, behavioral changes such as increased risk-taking and emotional reactivity have been associated with the maturation of cortico-cortical and cortico-subcortical circuits. Understanding microstructural changes in both white matter and subcortical regions may aid our understanding of how individual differences in these behaviors emerge. Restriction spectrum imaging (RSI) is a framework for modelling diffusion-weighted imaging that decomposes the diffusion signal from a voxel into hindered, restricted, and free compartments. This yields greater specificity than conventional methods of characterizing diffusion. Using RSI, we quantified voxelwise restricted diffusion across the brain and measured age associations in a large sample (n = 8086) from the Adolescent Brain Cognitive Development (ABCD) Study aged 9-14 years. Older participants showed a higher restricted signal fraction across the brain, with the largest associations in subcortical regions, particularly the basal ganglia and ventral diencephalon. Importantly, age associations varied with respect to the cytoarchitecture within white matter fiber tracts and subcortical structures; for example, age associations differed across thalamic nuclei. This suggests that age-related changes may map onto specific cell populations or circuits and highlights the utility of voxelwise compared to ROI-wise analyses. Future analyses will aim to understand the relevance of this microstructural development for behavioral outcomes.
Affiliation(s)
- Clare E. Palmer
- Center for Human Development, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92161, USA (corresponding author)
- Diliana Pecheva
- Center for Multimodal Imaging and Genetics, University of California, San Diego School of Medicine, 9444 Medical Center Dr, La Jolla, CA 92037, USA; Department of Radiology, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA
- John R. Iversen
- Institute for Neural Computation, University of California San Diego, 9500 Gilman Drive, La Jolla, CA 92093, USA
- Donald J. Hagler
- Center for Multimodal Imaging and Genetics, University of California, San Diego School of Medicine, 9444 Medical Center Dr, La Jolla, CA 92037, USA; Department of Radiology, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA
- Leo Sugrue
- Department of Radiology and Biomedical Imaging, University of California, San Francisco, 505 Parnassus Avenue, San Francisco, CA 94143, USA
- Pierre Nedelec
- Department of Radiology and Biomedical Imaging, University of California, San Francisco, 505 Parnassus Avenue, San Francisco, CA 94143, USA
- Chun Chieh Fan
- Center for Multimodal Imaging and Genetics, University of California, San Diego School of Medicine, 9444 Medical Center Dr, La Jolla, CA 92037, USA
- Wesley K. Thompson
- Division of Biostatistics, Department of Family Medicine and Public Health, University of California, San Diego, La Jolla, CA, USA
- Terry L. Jernigan
- Center for Human Development, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92161, USA; Department of Radiology, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA; Department of Cognitive Science, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92161, USA; Department of Psychiatry, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA
- Anders M. Dale
- Center for Multimodal Imaging and Genetics, University of California, San Diego School of Medicine, 9444 Medical Center Dr, La Jolla, CA 92037, USA; Department of Radiology, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA; Department of Cognitive Science, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92161, USA; Department of Neuroscience, University of California, San Diego School of Medicine, 9500 Gilman Drive, La Jolla, CA 92037, USA
8
Gustavson DE, Coleman PL, Iversen JR, Maes HH, Gordon RL, Lense MD. Mental health and music engagement: review, framework, and guidelines for future studies. Transl Psychiatry 2021; 11:370. PMID: 34226495; PMCID: PMC8257764; DOI: 10.1038/s41398-021-01483-8.
Abstract
Is engaging with music good for your mental health? This question has long been the topic of empirical clinical and nonclinical investigations, with studies indicating positive associations between music engagement and quality of life, reduced depression or anxiety symptoms, and less frequent substance use. However, many earlier investigations were limited by small populations and methodological limitations, and it has also been suggested that aspects of music engagement may even be associated with worse mental health outcomes. The purpose of this scoping review is first to summarize the existing state of music engagement and mental health studies, identifying their strengths and weaknesses. We focus on broad domains of mental health diagnoses including internalizing psychopathology (e.g., depression and anxiety symptoms and diagnoses), externalizing psychopathology (e.g., substance use), and thought disorders (e.g., schizophrenia). Second, we propose a theoretical model to inform future work that describes the importance of simultaneously considering music-mental health associations at the levels of (1) correlated genetic and/or environmental influences vs. (bi)directional associations, (2) interactions with genetic risk factors, (3) treatment efficacy, and (4) mediation through brain structure and function. Finally, we describe how recent advances in large-scale data collection, including genetic, neuroimaging, and electronic health record studies, allow for a more rigorous examination of these associations that can also elucidate their neurobiological substrates.
Affiliation(s)
- Daniel E. Gustavson
- Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, USA; Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA
- Peyton L. Coleman
- Department of Otolaryngology – Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, TN, USA
- John R. Iversen
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, La Jolla, CA, USA
- Hermine H. Maes
- Department of Human and Molecular Genetics, Virginia Institute for Psychiatric and Behavioral Genetics, Virginia Commonwealth University, Richmond, VA, USA; Department of Psychiatry, Virginia Institute for Psychiatric and Behavioral Genetics, Virginia Commonwealth University, Richmond, VA, USA; Massey Cancer Center, Virginia Commonwealth University, Richmond, VA, USA
- Reyna L. Gordon
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Otolaryngology – Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; The Curb Center, Vanderbilt University, Nashville, TN, USA
- Miriam D. Lense
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; The Curb Center, Vanderbilt University, Nashville, TN, USA
Collapse
|
9
Shikauchi Y, Miyakoshi M, Makeig S, Iversen JR. Bayesian models of human navigation behaviour in an augmented reality audiomaze. Eur J Neurosci 2020; 54:8308-8317. [PMID: 33237612 PMCID: PMC9292259 DOI: 10.1111/ejn.15061] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2020] [Revised: 11/19/2020] [Accepted: 11/19/2020] [Indexed: 11/29/2022]
Abstract
We investigated Bayesian modelling of human whole‐body motion capture data recorded during an exploratory real‐space navigation task in an “Audiomaze” environment (see the companion paper by Miyakoshi et al. in the same volume) to study the effect of map learning on navigation behaviour. There were three models: a feedback‐only model (no map learning), a map resetting model (single‐trial limited map learning), and a map updating model (map learning accumulated across three trials). The estimated behavioural variables included step sizes and turning angles. Results showed that the estimated step sizes were consistently more accurate under the map learning models than under the feedback‐only model. The same effect was confirmed for turning angle estimates, but only for data from the third trial. We interpreted these results as Bayesian evidence for an effect of map learning on human navigation behaviour. Furthermore, separating the participants into groups of egocentric and allocentric navigators revealed an advantage for the map updating model in estimating step sizes, but only for the allocentric navigators. This interaction indicated that allocentric navigators may take greater advantage of map learning than egocentric navigators do. We discuss the relationship of these results to the simultaneous localization and mapping (SLAM) problem.
Affiliation(s)
- Yumi Shikauchi
- JSPS Research Fellow, Tokyo, Japan; Rhythm-based Brain Information Processing Unit, CBS-TOYOTA Collaboration Center, RIKEN Center for Brain Science, Saitama, Japan
- Makoto Miyakoshi
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California San Diego, San Diego, CA, USA
- Scott Makeig
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California San Diego, San Diego, CA, USA
- John R Iversen
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California San Diego, San Diego, CA, USA
10
D'Andrea-Penna GM, Iversen JR, Chiba AA, Khalil AK, Minces VH. One Tap at a Time: Correlating Sensorimotor Synchronization with Brain Signatures of Temporal Processing. Cereb Cortex Commun 2020; 1:tgaa036. [PMID: 33015622 PMCID: PMC7521132 DOI: 10.1093/texcom/tgaa036] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/02/2020] [Revised: 07/14/2020] [Accepted: 07/15/2020] [Indexed: 11/13/2022] Open
Abstract
The ability to integrate our perceptions across sensory modalities and across time, to execute and coordinate movements, and to adapt to a changing environment rests on temporal processing. Timing is essential for basic daily tasks, such as walking, social interaction, speech and language comprehension, and attention. Impaired temporal processing may contribute to various disorders, from attention-deficit hyperactivity disorder and schizophrenia to Parkinson’s disease and dementia. The foundational importance of timing ability has yet to be fully understood, and popular tasks used to investigate behavioral timing ability, such as sensorimotor synchronization (SMS), engage a variety of processes in addition to the neural processing of time. The present study utilizes SMS in conjunction with a separate passive listening task that manipulates temporal expectancy while recording electroencephalographic data. Participants display a larger N1-P2 evoked potential complex to unexpected beats relative to temporally predictable beats, a differential we call the timing response index (TRI). The TRI correlates with performance on the SMS task: better synchronizers show a larger brain response to unexpected beats. The TRI, derived from the perceptually driven N1-P2 complex, disentangles the perceptual and motor components inherent in SMS and thus may serve as a neural marker of more general temporal processing.
Affiliation(s)
- John R Iversen
- Institute for Neural Computation, UC San Diego, La Jolla, CA 92093, USA
- Andrea A Chiba
- Neurosciences Graduate Program, UC San Diego, La Jolla, CA 92093, USA
- Victor H Minces
- Department of Cognitive Science, UC San Diego, La Jolla, CA 92093, USA
11
Abstract
Spontaneous movement to music occurs in every human culture and is a foundation of dance [1]. This response to music is absent in most species (including monkeys), yet it occurs in parrots, perhaps because they (like humans, and unlike monkeys) are vocal learners whose brains contain strong auditory-motor connections, conferring sophisticated audiomotor processing abilities [2,3]. Previous research has shown that parrots can bob their heads or lift their feet in synchrony with a musical beat [2,3], but humans move to music using a wide variety of movements and body parts. Is this also true of parrots? If so, it would constrain theories of how movement to music is controlled by parrot brains. Specifically, as head bobbing is part of parrot courtship displays [4] and foot lifting is part of locomotion, these may be innate movements controlled by central pattern generators which become entrained by auditory rhythms, without the involvement of complex motor planning. This would be unlike humans, where movement to music engages cortical networks including frontal and parietal areas [5]. Rich diversity in parrot movement to music would suggest a strong contribution of forebrain regions to this behavior, perhaps including motor learning regions abutting the complex vocal-learning 'shell' regions that are unique to parrots among vocal learning birds [6]. Here we report that a sulphur-crested cockatoo (Cacatua galerita eleonora) responds to music with remarkably diverse spontaneous movements employing a variety of body parts, and suggest why parrots share this response with humans.
Affiliation(s)
- R Joanne Jao Keehn
- Brain Development Imaging Labs, Department of Psychology, San Diego State University, 6363 Alvarado Ct. #200, San Diego, CA 92120, USA
- John R Iversen
- University of California San Diego, Institute for Neural Computation, 9500 Gilman Dr. #0559, La Jolla, CA 92093, USA
- Irena Schulz
- Bird Lovers Only Rescue Service Inc., Duncan, SC 29334, USA
- Aniruddh D Patel
- Department of Psychology, Tufts University, 490 Boston Ave., Medford, MA 02155, USA; Azrieli Program in Brain, Mind, and Consciousness, Canadian Institute for Advanced Research (CIFAR), MaRS Centre, West Tower, 661 University Ave., Suite 505, Toronto, ON M5G 1M1, Canada; Radcliffe Institute for Advanced Study, Harvard University, 10 Garden St., Cambridge, MA 02138, USA
12
Ross JM, Iversen JR, Balasubramaniam R. The Role of Posterior Parietal Cortex in Beat-based Timing Perception: A Continuous Theta Burst Stimulation Study. J Cogn Neurosci 2018; 30:634-643. [DOI: 10.1162/jocn_a_01237] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
There is growing interest in how the brain's motor systems contribute to the perception of musical rhythms. The Action Simulation for Auditory Prediction hypothesis proposes that the dorsal auditory stream is involved in bidirectional interchange between auditory perception and beat-based prediction in motor planning structures via parietal cortex [Patel, A. D., & Iversen, J. R. The evolutionary neuroscience of musical beat perception: The Action Simulation for Auditory Prediction (ASAP) hypothesis. Frontiers in Systems Neuroscience, 8, 57, 2014]. We used a TMS protocol, continuous theta burst stimulation (cTBS), that is known to down-regulate cortical activity for up to 60 min following stimulation to test for causal contributions to beat-based timing perception. cTBS target areas included the left posterior parietal cortex (lPPC), which is part of the dorsal auditory stream, and the left SMA (lSMA). We hypothesized that down-regulating lPPC would interfere with accurate beat-based perception by disrupting the dorsal auditory stream, and that absolute timing ability would be unaffected. We predicted that down-regulating lSMA, which is not part of the dorsal auditory stream but has been implicated in internally timed movements, would also interfere with accurate beat-based timing perception. We show (n = 25) that cTBS down-regulation of lPPC does interfere with beat-based timing ability, but only the ability to detect shifts in beat phase, not changes in tempo. Down-regulation of lSMA, in contrast, did not interfere with beat-based timing. As expected, absolute interval timing ability was not impacted by the down-regulation of lPPC or lSMA. These results support an essential role for the dorsal auditory stream in accurate phase perception in beat-based timing. We find no evidence of an essential role of parietal cortex or SMA in interval timing.
13
Courellis H, Mullen T, Poizner H, Cauwenberghs G, Iversen JR. EEG-Based Quantification of Cortical Current Density and Dynamic Causal Connectivity Generalized across Subjects Performing BCI-Monitored Cognitive Tasks. Front Neurosci 2017; 11:180. [PMID: 28566997 PMCID: PMC5434743 DOI: 10.3389/fnins.2017.00180] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2017] [Accepted: 03/20/2017] [Indexed: 11/13/2022] Open
Abstract
Quantification of dynamic causal interactions among brain regions constitutes an important component of conducting research and developing applications in experimental and translational neuroscience. Furthermore, cortical networks with dynamic causal connectivity in brain-computer interface (BCI) applications offer a more comprehensive view of brain states implicated in behavior than do individual brain regions. However, models of cortical network dynamics are difficult to generalize across subjects because current electroencephalography (EEG) signal analysis techniques are limited in their ability to reliably localize sources across subjects. We propose an algorithmic and computational framework for identifying cortical networks across subjects in which dynamic causal connectivity is modeled among user-selected cortical regions of interest (ROIs). We demonstrate the strength of the proposed framework using a "reach/saccade to spatial target" cognitive task performed by 10 right-handed individuals. Modeling of causal cortical interactions was accomplished through measurement of cortical activity using EEG, application of independent component clustering to identify cortical ROIs as network nodes, estimation of cortical current density using cortically constrained low resolution electromagnetic brain tomography (cLORETA), multivariate autoregressive (MVAR) modeling of representative cortical activity signals from each ROI, and quantification of the dynamic causal interaction among the identified ROIs using the Short-time direct Directed Transfer function (SdDTF). The resulting cortical network and the computed causal dynamics among its nodes exhibited physiologically plausible behavior, consistent with past results reported in the literature. This physiological plausibility of the results strengthens the framework's applicability in reliably capturing complex brain functionality, which is required by applications, such as diagnostics and BCI.
Affiliation(s)
- Hristos Courellis
- Swartz Center for Computational Neuroscience, University of California, San Diego, San Diego, CA, United States; Department of Bioengineering, University of California, San Diego, San Diego, CA, United States
- Tim Mullen
- Swartz Center for Computational Neuroscience, University of California, San Diego, San Diego, CA, United States
- Howard Poizner
- Institute for Neural Computation, University of California, San Diego, San Diego, CA, United States
- Gert Cauwenberghs
- Department of Bioengineering, University of California, San Diego, San Diego, CA, United States; Institute for Neural Computation, University of California, San Diego, San Diego, CA, United States
- John R Iversen
- Swartz Center for Computational Neuroscience, University of California, San Diego, San Diego, CA, United States
14
Abstract
There is growing interest in whether the motor system plays an essential role in rhythm perception. The motor system is active during the perception of rhythms, but is such motor activity merely a sign of unexecuted motor planning, or does it play a causal role in shaping the perception of rhythm? We present evidence for a causal role of motor planning and simulation, and review theories of internal simulation for beat-based timing prediction. Brain stimulation studies have the potential to conclusively test whether the motor system plays a causal role in beat perception and to ground theories in their neural underpinnings.
Affiliation(s)
- Jessica M Ross
- Cognitive and Information Sciences, University of California, Merced, CA, USA
- John R Iversen
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, CA, USA
15
Iversen JR, Patel AD, Nicodemus B, Emmorey K. Synchronization to auditory and visual rhythms in hearing and deaf individuals. Cognition 2014; 134:232-44. [PMID: 25460395 DOI: 10.1016/j.cognition.2014.10.018] [Citation(s) in RCA: 95] [Impact Index Per Article: 9.5] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/09/2012] [Revised: 09/22/2014] [Accepted: 10/31/2014] [Indexed: 11/17/2022]
Abstract
A striking asymmetry in human sensorimotor processing is that humans synchronize movements to rhythmic sound with far greater precision than to temporally equivalent visual stimuli (e.g., to an auditory vs. a flashing visual metronome). Traditionally, this finding is thought to reflect a fundamental difference in auditory vs. visual processing, i.e., superior temporal processing by the auditory system and/or privileged coupling between the auditory and motor systems. It is unclear whether this asymmetry is an inevitable consequence of brain organization or whether it can be modified (or even eliminated) by stimulus characteristics or by experience. With respect to stimulus characteristics, we found that a moving, colliding visual stimulus (a silent image of a bouncing ball with a distinct collision point on the floor) was able to drive synchronization nearly as accurately as sound in hearing participants. To study the role of experience, we compared synchronization to flashing metronomes in hearing and profoundly deaf individuals. Deaf individuals performed better than hearing individuals when synchronizing with visual flashes, suggesting that cross-modal plasticity enhances the ability to synchronize with temporally discrete visual stimuli. Furthermore, when deaf (but not hearing) individuals synchronized with the bouncing ball, their tapping patterns suggest that visual timing may access higher-order beat perception mechanisms for deaf individuals. These results indicate that the auditory advantage in rhythmic synchronization is more experience- and stimulus-dependent than has been previously reported.
Affiliation(s)
- John R Iversen
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, 9500 Gilman Drive # 0559, La Jolla, CA 92093, USA
- Aniruddh D Patel
- Department of Psychology, Tufts University, 490 Boston Ave., Medford, MA 02155, USA
- Brenda Nicodemus
- Department of Interpretation, Gallaudet University, 800 Florida Avenue, NE, Washington, DC 20002, USA
- Karen Emmorey
- Laboratory for Language and Cognitive Neuroscience, San Diego State University, 6495 Alvarado Road, Suite 200, San Diego, CA 92120, USA
16
17
Patel AD, Iversen JR. The evolutionary neuroscience of musical beat perception: the Action Simulation for Auditory Prediction (ASAP) hypothesis. Front Syst Neurosci 2014; 8:57. [PMID: 24860439 PMCID: PMC4026735 DOI: 10.3389/fnsys.2014.00057] [Citation(s) in RCA: 207] [Impact Index Per Article: 20.7] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/02/2013] [Accepted: 03/25/2014] [Indexed: 11/17/2022] Open
Abstract
Every human culture has some form of music with a beat: a perceived periodic pulse that structures the perception of musical rhythm and which serves as a framework for synchronized movement to music. What are the neural mechanisms of musical beat perception, and how did they evolve? One view, which dates back to Darwin and implicitly informs some current models of beat perception, is that the relevant neural mechanisms are relatively general and are widespread among animal species. On the basis of recent neural and cross-species data on musical beat processing, this paper argues for a different view. Here we argue that beat perception is a complex brain function involving temporally-precise communication between auditory regions and motor planning regions of the cortex (even in the absence of overt movement). More specifically, we propose that simulation of periodic movement in motor planning regions provides a neural signal that helps the auditory system predict the timing of upcoming beats. This "action simulation for auditory prediction" (ASAP) hypothesis leads to testable predictions. We further suggest that ASAP relies on dorsal auditory pathway connections between auditory regions and motor planning regions via the parietal cortex, and suggest that these connections may be stronger in humans than in non-human primates due to the evolution of vocal learning in our lineage. This suggestion motivates cross-species research to determine which species are capable of human-like beat perception, i.e., beat perception that involves accurate temporal prediction of beat times across a fairly broad range of tempi.
Affiliation(s)
- John R. Iversen
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California San Diego, La Jolla, CA, USA
18
Vasehus Holck M, Iversen JR. CP-146 Shared Medication Record discrepancies in association with electronic transfer of prescriptions. Eur J Hosp Pharm 2014. [DOI: 10.1136/ejhpharm-2013-000436.144] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022] Open
19
Iversen JR, Ojeda A, Mullen T, Plank M, Snider J, Cauwenberghs G, Poizner H. Causal analysis of cortical networks involved in reaching to spatial targets. Annu Int Conf IEEE Eng Med Biol Soc 2014; 2014:4399-4402. [PMID: 25570967 DOI: 10.1109/embc.2014.6944599] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [What about the content of this article? (0)] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/04/2023]
Abstract
The planning of goal-directed movement towards targets in different parts of space is an important function of the brain. Such visuo-motor planning and execution is known to involve multiple brain regions, including visual, parietal, and frontal cortices. To understand how these brain regions work together to both plan and execute goal-directed movement, it is essential to describe the dynamic causal interactions among them. Here we model causal interactions of distributed cortical source activity derived from non-invasively recorded EEG, using a combination of ICA, minimum-norm distributed source localization (cLORETA), and dynamical modeling within the Source Information Flow Toolbox (SIFT). We differentiate network causal connectivity of reach planning and execution, by comparing the causal network in a speeded reaching task with that for a control task not requiring goal-directed movement. Analysis of a pilot dataset (n=5) shows the utility of this technique and reveals increased connectivity between visual, motor and frontal brain regions during reach planning, together with decreased cross-hemisphere visual coupling during planning and execution, possibly related to task demands.
20
Hove MJ, Iversen JR, Zhang A, Repp BH. Synchronization with competing visual and auditory rhythms: bouncing ball meets metronome. Psychol Res 2012; 77:388-98. [PMID: 22638726 DOI: 10.1007/s00426-012-0441-0] [Citation(s) in RCA: 60] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/02/2012] [Accepted: 05/11/2012] [Indexed: 10/28/2022]
Abstract
Synchronization of finger taps with periodically flashing visual stimuli is known to be much more variable than synchronization with an auditory metronome. When one of these rhythms is the synchronization target and the other serves as a distracter at various temporal offsets, strong auditory dominance is observed. However, it has recently been shown that visuomotor synchronization improves substantially with moving stimuli such as a continuously bouncing ball. The present study pitted a bouncing ball against an auditory metronome in a target-distracter synchronization paradigm, with the participants being auditory experts (musicians) and visual experts (video gamers and ball players). Synchronization was still less variable with auditory than with visual target stimuli in both groups. For musicians, auditory stimuli tended to be more distracting than visual stimuli, whereas the opposite was the case for the visual experts. Overall, there was no main effect of distracter modality. Thus, a distracting spatiotemporal visual rhythm can be as effective as a distracting auditory rhythm in its capacity to perturb synchronous movement, but its effectiveness also depends on modality-specific expertise.
Affiliation(s)
- Michael J Hove
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany

21
Yoshida KA, Iversen JR, Patel AD, Mazuka R, Nito H, Gervain J, Werker JF. The development of perceptual grouping biases in infancy: a Japanese-English cross-linguistic study. Cognition 2010; 115:356-61. [PMID: 20144456 DOI: 10.1016/j.cognition.2010.01.005] [Citation(s) in RCA: 91] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/18/2009] [Revised: 01/14/2010] [Accepted: 01/16/2010] [Indexed: 11/29/2022]
Abstract
Perceptual grouping has traditionally been thought to be governed by innate, universal principles. However, recent work has found differences in Japanese and English speakers' non-linguistic perceptual grouping, implicating language in non-linguistic perceptual processes (Iversen, Patel, & Ohgushi, 2008). Two experiments test Japanese- and English-learning infants of 5-6 and 7-8 months of age to explore the development of grouping preferences. At 5-6 months, neither the Japanese nor the English infants revealed any systematic perceptual biases. However, by 7-8 months, the same age as when linguistic phrasal grouping develops, infants developed non-linguistic grouping preferences consistent with their language's structure (and the grouping biases found in adulthood). These results reveal an early difference in non-linguistic perception between infants growing up in different language environments. The possibility that infants' linguistic phrasal grouping is bootstrapped by abstract perceptual principles is discussed.
Affiliation(s)
- Katherine A Yoshida
- New York University, Department of Psychology, 6 Washington Place, New York, NY 10003, USA

22
Abstract
The recent discovery of spontaneous synchronization to music in a nonhuman animal (the sulphur-crested cockatoo Cacatua galerita eleonora) raises several questions. How does this behavior differ from nonmusical synchronization abilities in other species, such as synchronized frog calls or firefly flashes? What significance does the behavior have for debates over the evolution of human music? What kinds of animals can synchronize to musical rhythms, and what are the key methodological issues for research in this area? This paper addresses these questions and proposes some refinements to the "vocal learning and rhythmic synchronization hypothesis."
Affiliation(s)
- Aniruddh D Patel
- The Neurosciences Institute, 10640 John Jay Hopkins Drive, San Diego, CA 92121, USA

23

24

25
Patel AD, Iversen JR, Bregman MR, Schulz I. Experimental evidence for synchronization to a musical beat in a nonhuman animal. Curr Biol 2009; 19:827-30. [PMID: 19409790 DOI: 10.1016/j.cub.2009.03.038] [Citation(s) in RCA: 228] [Impact Index Per Article: 15.2] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2009] [Revised: 02/25/2009] [Accepted: 03/12/2009] [Indexed: 10/20/2022]
Abstract
The tendency to move in rhythmic synchrony with a musical beat (e.g., via head bobbing, foot tapping, or dance) is a human universal [1] yet is not commonly observed in other species [2]. Does this ability reflect a brain specialization for music cognition, or does it build on neural circuitry that ordinarily serves other functions? According to the "vocal learning and rhythmic synchronization" hypothesis [3], entrainment to a musical beat relies on the neural circuitry for complex vocal learning, an ability that requires a tight link between auditory and motor circuits in the brain [4, 5]. This hypothesis predicts that only vocal learning species (such as humans and some birds, cetaceans, and pinnipeds, but not nonhuman primates) are capable of synchronizing movements to a musical beat. Here we report experimental evidence for synchronization to a beat in a sulphur-crested cockatoo (Cacatua galerita eleonora). By manipulating the tempo of a musical excerpt across a wide range, we show that the animal spontaneously adjusts the tempo of its rhythmic movements to stay synchronized with the beat. These findings indicate that synchronization to a musical beat is not uniquely human and suggest that animal models can provide insights into the neurobiology and evolution of human music [6].
Affiliation(s)
- Aniruddh D Patel
- The Neurosciences Institute, 10640 John Jay Hopkins Drive, San Diego, CA 92121, USA

26
Abstract
Many aspects of perception are known to be shaped by experience, but others are thought to be innate universal properties of the brain. A specific example comes from rhythm perception, where one of the fundamental perceptual operations is the grouping of successive events into higher-level patterns, an operation critical to the perception of language and music. Grouping has long been thought to be governed by innate perceptual principles established a century ago. The current work demonstrates instead that grouping can be strongly dependent on culture. Native English and Japanese speakers were tested for their perception of grouping of simple rhythmic sequences of tones. Members of the two cultures showed different patterns of perceptual grouping, demonstrating that these basic auditory processes are not universal but are shaped by experience. It is suggested that the observed perceptual differences reflect the rhythms of the two languages, and that native language can exert an influence on general auditory perception at a basic level.
Affiliation(s)
- John R Iversen
- The Neurosciences Institute, 10640 John Jay Hopkins Drive, San Diego, California 92121, USA

27
Abstract
Growing evidence points to a link between musical abilities and certain phonetic and prosodic skills in language. However, the mechanisms that underlie these relations are not well understood. A recent study by Wong et al. suggests that musical training sharpens the subcortical encoding of linguistic pitch patterns. We consider the implications of their methods and findings for establishing a link between musical training and phonetic abilities more generally.
Affiliation(s)
- Aniruddh D Patel
- The Neurosciences Institute, 10640 John Jay Hopkins Drive, San Diego, CA 92121, USA

28
Seth AK, Iversen JR, Edelman GM. Single-trial discrimination of truthful from deceptive responses during a game of financial risk using alpha-band MEG signals. Neuroimage 2006; 32:465-76. [PMID: 16678444 DOI: 10.1016/j.neuroimage.2006.02.050] [Citation(s) in RCA: 16] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2005] [Revised: 02/15/2006] [Accepted: 02/28/2006] [Indexed: 11/16/2022] Open
Abstract
We studied whether magnetoencephalography (MEG) could detect deceptive responses on a single-subject, trial-by-trial basis. To elicit spontaneous, ecologically valid deception, we developed a paradigm in which subjects in a simulated customs setting were presented with a series of pictures of items which might be in their baggage, and for each item, they decided whether to "declare" (tell the truth) or "smuggle" (lie). Telling the truth involved a small but certain monetary penalty, whereas lying involved both greater monetary risk and greater potential reward. Most subjects showed decreased signal power in the 8-12 Hz (alpha) range during deceptive responses as compared to truthful responses. In a cross-validation analysis, we were able to use alpha power to classify truthful and deceptive responses on a trial-by-trial basis, with significantly greater predictive accuracy than that achieved using simultaneously recorded skin conductance signals. Average predictive accuracy for spontaneous deception was greater than 78%, and for some subjects, predictive accuracy exceeded 90%. Our results raise the possibility that alpha power modulation during deception may reflect risk management and/or cognitive control.
Affiliation(s)
- Anil K Seth
- The Neurosciences Institute, San Diego, CA 92121, USA.
29
Patel AD, Iversen JR, Rosenberg JC. Comparing the rhythm and melody of speech and music: the case of British English and French. J Acoust Soc Am 2006; 119:3034-47. [PMID: 16708959 DOI: 10.1121/1.2179657] [Citation(s) in RCA: 38] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/09/2023]
Abstract
For over half a century, musicologists and linguists have suggested that the prosody of a culture's native language is reflected in the rhythms and melodies of its instrumental music. Testing this idea requires quantitative methods for comparing musical and spoken rhythm and melody. This study applies such methods to the speech and music of England and France. The results reveal that music reflects patterns of durational contrast between successive vowels in spoken sentences, as well as patterns of pitch interval variability in speech. The methods presented here are suitable for studying speech-music relations in a broad range of cultures.
Affiliation(s)
- Aniruddh D Patel
- The Neurosciences Institute, 10640 John Jay Hopkins Drive, San Diego, California 92121, USA.
30
Patel AD, Iversen JR, Chen Y, Repp BH. The influence of metricality and modality on synchronization with a beat. Exp Brain Res 2005; 163:226-38. [PMID: 15654589 DOI: 10.1007/s00221-004-2159-8] [Citation(s) in RCA: 204] [Impact Index Per Article: 10.7] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/27/2004] [Accepted: 10/18/2004] [Indexed: 11/28/2022]
Abstract
The great majority of the world's music is metrical, i.e., has periodic structure at multiple time scales. Does the metrical structure of a non-isochronous rhythm improve synchronization with a beat compared to synchronization with an isochronous sequence at the beat period? Beat synchronization is usually associated with auditory stimuli, but are people able to extract a beat from rhythmic visual sequences with metrical structure? We addressed these questions by presenting listeners with rhythmic patterns which were either isochronous or non-isochronous in either the auditory or visual modality, and by asking them to tap to the beat, which was prescribed to occur at 800-ms intervals. For auditory patterns, we found that a strongly metrical structure did not improve overall accuracy of synchronization compared with isochronous patterns of the same beat period, though it did influence the higher-level patterning of taps. Synchronization was impaired in weakly metrical patterns in which some beats were silent. For the visual patterns, we found that participants were generally unable to synchronize to metrical non-isochronous rhythms, or to rapid isochronous rhythms. This suggests that beat perception and synchronization have a special affinity with the auditory system.
Affiliation(s)
- Aniruddh D Patel
- The Neurosciences Institute, 10640 John Jay Hopkins Drive, San Diego, CA 92121, USA.
31
Abstract
Seven male subjects ran at 3.0 m/s on a motorized treadmill with a force platform under the tread. The subjects ran at each of five treadmill inclinations: +0.17, +0.077, 0, -0.077, and -0.17 radians. The positions of the subjects' legs were read from ciné films (100 frames/s). Results of the film and force-plate analysis generally corroborated the "hanging triangle" hypothesis, which postulates that the angle between the leg and the vertical at foot strike does not change as the treadmill is tipped up or down. A mathematical model of running, in which the leg is represented as a nonlinear spring, made satisfactory predictions of how many parameters of running change with the treadmill angle, including the length of the leg at touchdown and liftoff and the peak leg force in the middle of a step. The peak leg force reaches a maximum at a treadmill angle near -0.12 radians, close to the downhill angle at which other authors have found a minimum in the rate of oxygen consumption.
Affiliation(s)
- J R Iversen
- Division of Applied Sciences, Harvard University, Cambridge, MA 02138
32
Iversen JR, Amlie E, Hagen S, Harbitz T, Kåresen R, Rø J. [Late diagnosis of advanced breast cancer--a challenge for health services]. Tidsskr Nor Laegeforen 1992; 112:1821-4. [PMID: 1631840] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/28/2022] Open
Abstract
From 1982 to 1987, 1,637 cancers of the breast were diagnosed in Oslo. Of these, 235 were classified as advanced according to one or more of the following criteria: tumour size greater than or equal to 5 cm (T3) or T4, metastasis within 4 months, or pathological stage pT3, pT4, or pN2. These cases were studied further. The distribution of women with advanced breast cancer was uneven: for no obvious reason, incidence was significantly higher in one of the four hospitals in Oslo. 169 of the patients discovered the tumour themselves, and many delayed seeking help; 93 waited more than eight weeks before doing so. For patients with metastasis at the time of diagnosis, survival was slightly more than one year; for patients without metastasis it was four and a half years. The length of stay in hospital increased with successive admissions.