1. Bedford O, Noly-Gandon A, Ara A, Wiesman AI, Albouy P, Baillet S, Penhune V, Zatorre RJ. Human Auditory-Motor Networks Show Frequency-Specific Phase-Based Coupling in Resting-State MEG. Hum Brain Mapp 2025; 46:e70045. PMID: 39757971; DOI: 10.1002/hbm.70045.
Abstract
Perception and production of music and speech rely on auditory-motor coupling, a mechanism that has been linked to temporally precise oscillatory coupling between auditory and motor regions of the human brain, particularly in the beta frequency band. Recently, brain imaging studies using magnetoencephalography (MEG) have also shown that accurate auditory temporal predictions specifically depend on phase coherence between auditory and motor cortical regions. However, it is not yet clear whether this tight oscillatory phase coupling is an intrinsic feature of the auditory-motor loop, or whether it is only elicited by task demands. Further, we do not know if phase synchrony is uniquely enhanced in the auditory-motor system compared to other sensorimotor modalities, or to what degree it is amplified by musical training. To resolve these questions, we measured the degree of phase locking between motor regions and auditory or visual areas in musicians and non-musicians using resting-state MEG. We derived phase locking values (PLVs) and phase transfer entropy (PTE) values from 90 healthy young participants. We observed significantly higher PLVs across all auditory-motor pairings compared to all visuomotor pairings in all frequency bands. The pairing with the highest degree of phase synchrony was right primary auditory cortex with right ventral premotor cortex, a connection that has been highlighted in previous literature on auditory-motor coupling. Additionally, we observed that auditory-motor and visuomotor PLVs were significantly higher across all structures in the right hemisphere, and we found the largest differences between auditory and visual PLVs in the theta, alpha, and beta frequency bands. Last, we found that the theta and beta bands exhibited a preference for a motor-to-auditory PTE direction and that the alpha and gamma bands exhibited the opposite preference for an auditory-to-motor PTE direction. Taken together, these findings confirm our hypotheses that motor phase synchrony is significantly enhanced in auditory compared to visual cortical regions at rest, that these differences are largest across the theta-beta spectrum of frequencies, and that there exist alternating information flow loops across auditory-motor structures as a function of frequency. In our view, this supports the existence of an intrinsic, time-based coupling for low-latency integration of sounds and movements, which involves synchronized phasic activity between the primary auditory cortex and motor and premotor cortical areas.
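The phase locking value (PLV) reported here has a standard definition: the magnitude of the averaged phase difference between two band-limited signals. The sketch below illustrates that computation on synthetic "auditory" and "motor" source time series; the band edges, filter settings, and toy signals are illustrative assumptions, not the authors' MEG pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter (e.g., beta band, 15-30 Hz)."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def plv(x, y, lo, hi, fs):
    """Phase locking value between two signals within one frequency band."""
    phase_x = np.angle(hilbert(bandpass(x, lo, hi, fs)))
    phase_y = np.angle(hilbert(bandpass(y, lo, hi, fs)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Toy "auditory" and "motor" sources sharing a noisy 20 Hz (beta) component.
fs = 500
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
aud = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)
mot = np.sin(2 * np.pi * 20 * t + 0.8) + 0.5 * rng.standard_normal(t.size)
print(f"beta-band PLV: {plv(aud, mot, 15, 30, fs):.2f}")
```

A PLV near 1 indicates a nearly constant phase lag between the two signals; values near 0 indicate no consistent phase relationship.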
Affiliation(s)
- Oscar Bedford: Montreal Neurological Institute, McGill University, Montréal, Quebec, Canada; International Laboratory for Brain, Music and Sound Research (BRAMS), Montréal, Quebec, Canada; Centre for Research on Brain, Language and Music (CRBLM), McGill University, Montréal, Quebec, Canada
- Alix Noly-Gandon: Montreal Neurological Institute, McGill University, Montréal, Quebec, Canada; International Laboratory for Brain, Music and Sound Research (BRAMS), Montréal, Quebec, Canada; Centre for Research on Brain, Language and Music (CRBLM), McGill University, Montréal, Quebec, Canada
- Alberto Ara: Montreal Neurological Institute, McGill University, Montréal, Quebec, Canada; International Laboratory for Brain, Music and Sound Research (BRAMS), Montréal, Quebec, Canada; Centre for Research on Brain, Language and Music (CRBLM), McGill University, Montréal, Quebec, Canada
- Alex I Wiesman: Montreal Neurological Institute, McGill University, Montréal, Quebec, Canada
- Philippe Albouy: International Laboratory for Brain, Music and Sound Research (BRAMS), Montréal, Quebec, Canada; Centre for Research on Brain, Language and Music (CRBLM), McGill University, Montréal, Quebec, Canada; CERVO Brain Research Centre, School of Psychology, Université Laval, Québec City, Quebec, Canada
- Sylvain Baillet: Montreal Neurological Institute, McGill University, Montréal, Quebec, Canada
- Virginia Penhune: International Laboratory for Brain, Music and Sound Research (BRAMS), Montréal, Quebec, Canada; Centre for Research on Brain, Language and Music (CRBLM), McGill University, Montréal, Quebec, Canada; Department of Psychology, Concordia University, Montréal, Quebec, Canada
- Robert J Zatorre: Montreal Neurological Institute, McGill University, Montréal, Quebec, Canada; International Laboratory for Brain, Music and Sound Research (BRAMS), Montréal, Quebec, Canada; Centre for Research on Brain, Language and Music (CRBLM), McGill University, Montréal, Quebec, Canada

2. Wang X, Zhou C, Jin X. Resonance and beat perception of ballroom dancers: An EEG study. PLoS One 2024; 19:e0312302. PMID: 39432504; PMCID: PMC11493285; DOI: 10.1371/journal.pone.0312302.
Abstract
PURPOSE: The ability to synchronize the perceptual and motor systems is important for full motor coordination and is a core determinant of motor skill performance. Dance training has been found to improve sensorimotor synchronization, but the characteristics underlying these improvements warrant further exploration. This study investigated the behavioral and neural-activity characteristics of ballroom dancers relative to non-dancers.
PARTICIPANTS AND METHODS: Thirty-two dancers (19.8 ± 1.8 years old) and 31 non-dancers (22.6 ± 3.1 years old) performed a finger-tapping task in synchrony with audiovisual beat stimuli at two intervals (400 and 800 ms) while EEG was recorded; behavioral and neural activity data were collected during the task.
RESULTS: The dancers employed a predictive strategy when synchronizing with the beat, and their EEG showed stronger neural resonance with the external rhythmic stimuli than that of non-dancers (p < 0.05). The task was more challenging at the 800-ms beat interval, as reflected in both behavioral metrics and the corresponding EEG signatures, leading to poorer synchronization performance and a greater allocation of attentional resources (ps < 0.05).
CONCLUSION: In the finger-tapping task with audiovisual beats, the beat interval was the primary factor influencing movement synchronization, neural activity, and attentional resource allocation. Although no significant behavioral differences were observed between dancers and non-dancers, dancers showed enhanced neural resonance in response to rhythmic stimuli. Further research using more ecologically valid tasks and stimuli may better capture the full extent of dancers' synchronization abilities.
Affiliation(s)
- Xuru Wang: Shanghai Institute of Early Childhood Education, Shanghai Normal University, Shanghai, China; School of Psychology, Shanghai University of Sport, Shanghai, China
- Chenglin Zhou: School of Psychology, Shanghai University of Sport, Shanghai, China; Key Laboratory of Motor Cognitive Assessment and Regulation, Shanghai, China
- Xinhong Jin: School of Psychology, Shanghai University of Sport, Shanghai, China; Key Laboratory of Motor Cognitive Assessment and Regulation, Shanghai, China; Key Laboratory of Exercise and Health Sciences (Shanghai University of Sport), Ministry of Education, Shanghai, China

3. Bagchi D, Arumugam R, Chandrasekar VK, Senthilkumar DV. Generalized synchronization in a tritrophic food web metacommunity. J Theor Biol 2024; 582:111759. PMID: 38367766; DOI: 10.1016/j.jtbi.2024.111759.
Abstract
Complete synchronization within a metacommunity is known to elevate the risk of its extinction due to stochasticity and other environmental perturbations. Owing to the inherently heterogeneous nature of the metacommunity, we demonstrate the emergence of generalized synchronization among the patches of a dispersally connected tritrophic food web using the framework of the auxiliary system approach and the mutual false nearest neighbor method. We find that the critical value of the dispersal rate increases significantly with the size of the metacommunity for both unidirectional and bidirectional dispersal, which in turn corroborates that larger metacommunities are more stable than smaller ones. Further, we find that the critical dispersal value for the onset of generalized synchronization is smaller (larger) for bidirectional dispersal than for unidirectional dispersal in smaller (larger) metacommunities. Most importantly, the complete synchronization error remains finite even after the onset of generalized synchronization over a wide range of dispersal rates, elucidating that the latter can serve as an early warning signal for the extinction of the metacommunity.
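The auxiliary system approach mentioned here tests for generalized synchronization by driving two identical copies of the response system, started from different initial conditions, with the same drive signal: if the copies converge onto one trajectory, the response is a function of the drive. The sketch below applies that test to a toy chaotic logistic-map drive and a diffusively coupled response; the map, coupling form, and convergence threshold are illustrative assumptions and not the food-web model used in the paper.

```python
import numpy as np

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def auxiliary_system_test(d, n_steps=5000, n_discard=1000, seed=1):
    """Drive a response map and an identical auxiliary copy (started from
    different initial conditions) with the same chaotic signal; convergence of
    the two copies indicates generalized synchronization with the drive."""
    rng = np.random.default_rng(seed)
    x, y, y_aux = rng.random(3)                  # drive, response, auxiliary
    diffs = []
    for n in range(n_steps):
        x_next = logistic(x)
        y = (1.0 - d) * logistic(y) + d * logistic(x)        # coupled response
        y_aux = (1.0 - d) * logistic(y_aux) + d * logistic(x)
        x = x_next
        if n >= n_discard:
            diffs.append(abs(y - y_aux))
    return np.mean(diffs)

for d in (0.1, 0.3, 0.5, 0.7):
    err = auxiliary_system_test(d)
    verdict = "converged (generalized sync)" if err < 1e-8 else "not converged"
    print(f"coupling d = {d:.1f}: mean |y - y'| = {err:.3e}  ->  {verdict}")
```

Sweeping the coupling strength in this way is the discrete analogue of locating the critical dispersal rate discussed in the abstract.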
Affiliation(s)
- Dweepabiswa Bagchi: School of Physics, Indian Institute of Science Education and Research, Thiruvananthapuram 695 551, Kerala, India
- Ramesh Arumugam: Department of Mathematics, School of Advanced Sciences, VIT-AP University, Guntur 522237, Andhra Pradesh, India
- V K Chandrasekar: Department of Physics, Centre for Nonlinear Science and Engineering, School of Electrical and Electronics Engineering, SASTRA Deemed University, Thanjavur 613401, Tamilnadu, India
- D V Senthilkumar: School of Physics, Indian Institute of Science Education and Research, Thiruvananthapuram 695 551, Kerala, India

4. Huntley MK, Nguyen A, Albrecht MA, Marinovic W. Tactile cues are more intrinsically linked to motor timing than visual cues in visual-tactile sensorimotor synchronization. Atten Percept Psychophys 2024; 86:1022-1037. PMID: 38263510; PMCID: PMC11062975; DOI: 10.3758/s13414-023-02828-9.
Abstract
Many tasks require precise synchronization with external sensory stimuli, such as driving a car. This study investigates whether combined visual-tactile information provides additional benefits to movement synchrony over separate visual and tactile stimuli and explores the relationship with the temporal binding window for multisensory integration. In Experiment 1, participants completed a sensorimotor synchronization task to examine movement variability and a simultaneity judgment task to measure the temporal binding window. Results showed similar synchronization variability between visual-tactile and tactile-only stimuli, but significantly lower than visual only. In Experiment 2, participants completed a visual-tactile sensorimotor synchronization task with cross-modal stimuli presented inside (stimulus onset asynchrony 80 ms) and outside (stimulus-onset asynchrony 400 ms) the temporal binding window to examine temporal accuracy of movement execution. Participants synchronized their movement with the first stimulus in the cross-modal pair, either the visual or tactile stimulus. Results showed significantly greater temporal accuracy when only one stimulus was presented inside the window and the second stimulus was outside the window than when both stimuli were presented inside the window, with movement execution being more accurate when attending to the tactile stimulus. Overall, these findings indicate there may be a modality-specific benefit to sensorimotor synchronization performance, such that tactile cues are weighted more strongly than visual information as tactile information is more intrinsically linked to motor timing than visual information. Further, our findings indicate that the visual-tactile temporal binding window is related to the temporal accuracy of movement execution.
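The temporal binding window discussed here is commonly estimated from a simultaneity judgment task by fitting a bell-shaped function to the proportion of "simultaneous" responses across stimulus onset asynchronies and taking a width measure of the fit. The sketch below shows one such fit on made-up data; the Gaussian form, the FWHM-based width, and the toy response proportions are assumptions rather than the authors' exact procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_sj(soa, amplitude, mu, sigma):
    """Proportion of 'simultaneous' responses modeled as a Gaussian over SOA."""
    return amplitude * np.exp(-(soa - mu) ** 2 / (2 * sigma ** 2))

# Toy simultaneity-judgment data: SOA in ms, proportion judged simultaneous.
soa = np.array([-400, -300, -200, -100, -50, 0, 50, 100, 200, 300, 400], float)
p_sim = np.array([0.05, 0.10, 0.35, 0.75, 0.90, 0.95, 0.85, 0.70, 0.30, 0.12, 0.06])

(amplitude, mu, sigma), _ = curve_fit(gaussian_sj, soa, p_sim, p0=[1.0, 0.0, 100.0])
# One common (but not universal) width measure: full width at half maximum.
tbw = 2 * np.sqrt(2 * np.log(2)) * abs(sigma)
print(f"point of subjective simultaneity = {mu:.0f} ms, binding window = {tbw:.0f} ms")
```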
Affiliation(s)
- Michelle K Huntley: School of Population Health, Curtin University, Perth, Western Australia, Australia; School of Psychology and Public Health, La Trobe University, Wodonga, Victoria, Australia
- An Nguyen: School of Population Health, Curtin University, Perth, Western Australia, Australia
- Matthew A Albrecht: Western Australia Centre for Road Safety Research, School of Psychological Science, University of Western Australia, Perth, Western Australia, Australia
- Welber Marinovic: School of Population Health, Curtin University, Perth, Western Australia, Australia

5. Kim HW, Kovar J, Bajwa JS, Mian Y, Ahmad A, Mancilla Moreno M, Price TJ, Lee YS. Rhythmic motor behavior explains individual differences in grammar skills in adults. Sci Rep 2024; 14:3710. PMID: 38355855; PMCID: PMC10867023; DOI: 10.1038/s41598-024-53382-9.
Abstract
A growing body of literature has reported the relationship between music and language, particularly between individual differences in perceptual rhythm skill and grammar competency in children. Here, we investigated whether motoric aspects of rhythm processing, as measured by rhythmic finger-tapping tasks, also explain the rhythm-grammar connection in 150 healthy young adults. We found that all expressive rhythm skills (spontaneous, synchronized, and continued tapping), along with rhythm discrimination skill, significantly predicted receptive grammar skills on either auditory sentence comprehension or grammaticality well-formedness judgment (e.g., singular/plural, past/present), even after controlling for verbal working memory and music experience. Among these, synchronized tapping and rhythm discrimination explained unique variance in sentence comprehension and grammaticality judgment, respectively, indicating differential associations between different rhythm and grammar skills. Together, we demonstrate that even simple and repetitive motor behavior can account for seemingly higher-order grammar skills in the adult population, suggesting that the sensorimotor system continues to support syntactic operations.
Affiliation(s)
- Hyun-Woong Kim: School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, USA; Callier Center for Communication Disorders, University of Texas at Dallas, Richardson, USA; Department of Psychology, The University of Texas at Dallas, Richardson, USA
- Jessica Kovar: School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, USA; Callier Center for Communication Disorders, University of Texas at Dallas, Richardson, USA
- Jesper Singh Bajwa: School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, USA
- Yasir Mian: School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, USA
- Ayesha Ahmad: School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, USA; Department of Neuroscience and Center for Advanced Pain Studies, University of Texas at Dallas, Richardson, USA
- Marisol Mancilla Moreno: School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, USA; Department of Neuroscience and Center for Advanced Pain Studies, University of Texas at Dallas, Richardson, USA
- Theodore J Price: School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, USA; Department of Neuroscience and Center for Advanced Pain Studies, University of Texas at Dallas, Richardson, USA
- Yune Sang Lee: School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, USA; Callier Center for Communication Disorders, University of Texas at Dallas, Richardson, USA; Department of Speech, Language, and Hearing, The University of Texas at Dallas, Richardson, USA

6. Whitton SA, Jiang F. Sensorimotor synchronization with visual, auditory, and tactile modalities. Psychol Res 2023; 87:2204-2217. PMID: 36773102; PMCID: PMC10567517; DOI: 10.1007/s00426-023-01801-3.
Abstract
While it is well known that humans are highly responsive to rhythm, the factors that influence our ability to synchronize remain unclear. In the current study, we examined how stimulus modality and rhythmic deviation, along with the synchronizer's level of musicality, impacted sensorimotor synchronization (SMS). Utilizing a finger-tapping task and three sensory modalities (visual, auditory, and tactile), we manipulated rhythmic deviation by varying the temporal position, intensity, and availability of cues across four deviation levels. Additionally, to determine our participants' musical familiarity and aptitude, we administered the Goldsmiths Musical Sophistication Index (Gold-MSI) questionnaire. We found that SMS to external rhythmic stimuli was significantly more precise for auditory and tactile than for visual sequences. Further, we found SMS consistency significantly decreased in all modalities with increased rhythmic deviation, suggesting rhythmic deviation directly relates to SMS difficulty. Moreover, a significant correlation was found between Gold-MSI scores and SMS consistency in the most rhythmically deviant level, such that the higher one's musical general sophistication score, the greater one's SMS ability. This held for all three modalities. Combined, these findings suggest that rhythmic synchronization performance is affected not only by the modality and rhythmic deviation of the stimuli but also by the musical general sophistication of the synchronizer.
Affiliation(s)
- Fang Jiang: Department of Psychology, University of Nevada, Reno, USA

7. Huang JK, Yin B. Phylogenic evolution of beat perception and synchronization: a comparative neuroscience perspective. Front Syst Neurosci 2023; 17:1169918. PMID: 37325439; PMCID: PMC10264645; DOI: 10.3389/fnsys.2023.1169918.
Abstract
The study of music has long been of interest to researchers from various disciplines. Scholars have put forth numerous hypotheses regarding the evolution of music. With the rise of cross-species research on music cognition, researchers hope to gain a deeper understanding of the phylogenic evolution, behavioral manifestation, and physiological limitations of the biological ability behind music, known as musicality. This paper presents the progress of beat perception and synchronization (BPS) research in cross-species settings and offers varying views on the relevant hypothesis of BPS. The BPS ability observed in rats and other mammals as well as recent neurobiological findings presents a significant challenge to the vocal learning and rhythm synchronization hypothesis if taken literally. An integrative neural-circuit model of BPS is proposed to accommodate the findings. In future research, it is recommended that greater consideration be given to the social attributes of musicality and to the behavioral and physiological changes that occur across different species in response to music characteristics.
Affiliation(s)
- Jin-Kun Huang: Laboratory for Learning and Behavioral Sciences, School of Psychology, Fujian Normal University, Fuzhou, Fujian, China
- Bin Yin: Laboratory for Learning and Behavioral Sciences, School of Psychology, Fujian Normal University, Fuzhou, Fujian, China; Department of Applied Psychology, School of Psychology, Fujian Normal University, Fuzhou, Fujian, China

8. Luo L, Lu L. Studying rhythm processing in speech through the lens of auditory-motor synchronization. Front Neurosci 2023; 17:1146298. PMID: 36937684; PMCID: PMC10017839; DOI: 10.3389/fnins.2023.1146298.
Abstract
Continuous speech is organized into a hierarchy of rhythms. Accurate processing of this rhythmic hierarchy through the interactions of auditory and motor systems is fundamental to speech perception and production. In this mini-review, we aim to evaluate the implementation of behavioral auditory-motor synchronization paradigms when studying rhythm processing in speech. First, we present an overview of the classic finger-tapping paradigm and its application in revealing differences in auditory-motor synchronization between the typical and clinical populations. Next, we highlight key findings on rhythm hierarchy processing in speech and non-speech stimuli from finger-tapping studies. Following this, we discuss the potential caveats of the finger-tapping paradigm and propose the speech-speech synchronization (SSS) task as a promising tool for future studies. Overall, we seek to raise interest in developing new methods to shed light on the neural mechanisms of speech processing.
Affiliation(s)
- Lu Luo: School of Psychology, Beijing Sport University, Beijing, China; Laboratory of Sports Stress and Adaptation of General Administration of Sport, Beijing, China
- Lingxi Lu: Center for the Cognitive Science of Language, Beijing Language and Culture University, Beijing, China

9. Zhan L, Huang Y, Guo Z, Yang J, Gu L, Zhong S, Wu X. Visual over auditory superiority in sensorimotor timing under optimized condition. Front Psychol 2022; 13:1048943. PMID: 36507012; PMCID: PMC9731274; DOI: 10.3389/fpsyg.2022.1048943.
Abstract
An auditory advantage over vision in temporal processing is generally appreciated, exemplified by the well-established auditory superiority in sensorimotor timing. To test for a possible visual superiority in temporal processing, here we present a data set composed of one large sample of 60 subjects and a data set including eight smaller samples of approximately 15 subjects each, showing that synchronization to a temporally regular sequence was more stable for a visual bouncing ball (VB) than for auditory tones (ATs). The results demonstrate that vision can be superior to audition in sensorimotor timing under optimized conditions, challenging the generally believed auditory superiority in temporal processing. In contrast to the auditory-specific biological substrates of timing in sensorimotor interaction, the present finding points to tight visual-motor cortical coupling in sensorimotor timing.

10. Yin B, Shi Z, Wang Y, Meck WH. Oscillation/Coincidence-Detection Models of Reward-Related Timing in Corticostriatal Circuits. Timing & Time Perception 2022. DOI: 10.1163/22134468-bja10057.
Abstract
The major tenets of beat-frequency/coincidence-detection models of reward-related timing are reviewed in light of recent behavioral and neurobiological findings. This includes the emphasis on a core timing network embedded in the motor system that is comprised of a corticothalamic-basal ganglia circuit. Therein, a central hub provides timing pulses (i.e., predictive signals) to the entire brain, including a set of distributed satellite regions in the cerebellum, cortex, amygdala, and hippocampus that are selectively engaged in timing in a manner that is more dependent upon the specific sensory, behavioral, and contextual requirements of the task. Oscillation/coincidence-detection models also emphasize the importance of a tuned ‘perception’ learning and memory system whereby target durations are detected by striatal networks of medium spiny neurons (MSNs) through the coincidental activation of different neural populations, typically utilizing patterns of oscillatory input from the cortex and thalamus or derivations thereof (e.g., population coding) as a time base. The measure of success of beat-frequency/coincidence-detection accounts, such as the Striatal Beat-Frequency model of reward-related timing (SBF), is their ability to accommodate new experimental findings while maintaining their original framework, thereby making testable experimental predictions concerning diagnosis and treatment of issues related to a variety of dopamine-dependent basal ganglia disorders, including Huntington’s and Parkinson’s disease.
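The striatal beat-frequency idea summarized above can be illustrated with a minimal toy: a bank of cortical oscillators is phase-reset at trial onset, and a "striatal" detector stores their joint state at the rewarded time and later responds when that pattern of phases recurs. The sketch below is only a caricature of the model class, with illustrative frequencies and no noise, learning rule, or dopamine signal.

```python
import numpy as np

# Toy striatal beat-frequency readout: cortical oscillators are phase-reset at
# trial onset; a "striatal" detector stores their joint state at the rewarded
# time T* and later responds when that pattern of phases recurs.
freqs = np.array([5.1, 6.3, 7.7, 9.1, 10.3, 11.9, 13.7])   # Hz, illustrative
dt, t_max, t_star = 0.001, 2.0, 1.2                        # seconds
t = np.arange(0.0, t_max, dt)
osc = np.cos(2 * np.pi * freqs[:, None] * t)               # oscillator bank

w = osc[:, int(round(t_star / dt))]     # "training": state vector stored at T*
detector = osc.T @ w / len(freqs)       # coincidence detection during "test"
print(f"detector peaks at t = {t[np.argmax(detector)]:.2f} s (target {t_star} s)")
```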
Affiliation(s)
- Bin Yin: Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA; School of Psychology, Fujian Normal University, Fuzhou, 350117, Fujian, China
- Zhuanghua Shi: Department of Psychology, Ludwig Maximilian University of Munich, 80802 Munich, Germany
- Yaxin Wang: School of Psychology, Fujian Normal University, Fuzhou, 350117, Fujian, China
- Warren H. Meck: Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA

11. Ross JM, Balasubramaniam R. Time Perception for Musical Rhythms: Sensorimotor Perspectives on Entrainment, Simulation, and Prediction. Front Integr Neurosci 2022; 16:916220. PMID: 35865808; PMCID: PMC9294366; DOI: 10.3389/fnint.2022.916220.
Abstract
Neural mechanisms supporting time perception in continuously changing sensory environments may be relevant to a broader understanding of how the human brain utilizes time in cognition and action. In this review, we describe current theories of sensorimotor engagement in the support of subsecond timing, focusing on musical timing because of the extensive literature on movement with, and perception of, musical rhythms. First, we define commonly used but ambiguous concepts, including neural entrainment, simulation, and prediction, in the context of musical timing. Next, we summarize the literature on sensorimotor timing during perception and performance and review the evidence that sensorimotor engagement is critical for accurate time perception. Finally, potential clinical implications of a sensorimotor perspective on timing are highlighted.
Affiliation(s)
- Jessica M. Ross: Veterans Affairs Palo Alto Healthcare System and the Sierra Pacific Mental Illness Research, Education, and Clinical Center, Palo Alto, CA, United States; Department of Psychiatry and Behavioral Sciences, Stanford University Medical Center, Stanford, CA, United States; Berenson-Allen Center for Non-invasive Brain Stimulation, Beth Israel Deaconess Medical Center, Boston, MA, United States; Department of Neurology, Harvard Medical School, Boston, MA, United States
- Ramesh Balasubramaniam: Cognitive and Information Sciences, University of California, Merced, Merced, CA, United States

12. Ross JM, Sarkar M, Keller CJ. Experimental suppression of transcranial magnetic stimulation-electroencephalography sensory potentials. Hum Brain Mapp 2022; 43:5141-5153. PMID: 35770956; PMCID: PMC9812254; DOI: 10.1002/hbm.25990.
Abstract
The sensory experience of transcranial magnetic stimulation (TMS) evokes cortical responses measured in electroencephalography (EEG) that confound interpretation of TMS-evoked potentials (TEPs). Methods for sensory masking have been proposed to minimize sensory contributions to the TEP, but the most effective combination for suprathreshold TMS to dorsolateral prefrontal cortex (dlPFC) is unknown. We applied sensory suppression techniques and quantified electrophysiology and perception from suprathreshold dlPFC TMS to identify the best combination to minimize the sensory TEP. In 21 healthy adults, we applied single pulse TMS at 120% resting motor threshold (rMT) to the left dlPFC and compared EEG vertex N100-P200 and perception. Conditions included three protocols: No masking (no auditory masking, no foam, and jittered interstimulus interval [ISI]), Standard masking (auditory noise, foam, and jittered ISI), and our ATTENUATE protocol (auditory noise, foam, over-the-ear protection, and unjittered ISI). ATTENUATE reduced vertex N100-P200 by 56%, "click" loudness perception by 50%, and scalp sensation by 36%. We show that sensory prediction, induced with predictable ISI, has a suppressive effect on vertex N100-P200, and that combining standard suppression protocols with sensory prediction provides the best N100-P200 suppression. ATTENUATE was more effective than Standard masking, which only reduced vertex N100-P200 by 22%, loudness by 27%, and scalp sensation by 24%. We introduce a sensory suppression protocol superior to Standard masking and demonstrate that using an unjittered ISI can contribute to minimizing sensory confounds. ATTENUATE provides superior sensory suppression to increase TEP signal-to-noise and contributes to a growing understanding of TMS-EEG sensory neuroscience.
Affiliation(s)
- Jessica M. Ross: Veterans Affairs Palo Alto Healthcare System, and the Sierra Pacific Mental Illness Research, Education, and Clinical Center (MIRECC), Palo Alto, California, USA; Department of Psychiatry and Behavioral Sciences, Stanford University Medical Center, Stanford, California, USA
- Manjima Sarkar: Department of Psychiatry and Behavioral Sciences, Stanford University Medical Center, Stanford, California, USA
- Corey J. Keller: Veterans Affairs Palo Alto Healthcare System, and the Sierra Pacific Mental Illness Research, Education, and Clinical Center (MIRECC), Palo Alto, California, USA; Department of Psychiatry and Behavioral Sciences, Stanford University Medical Center, Stanford, California, USA

13. It Takes Two: Interpersonal Neural Synchrony Is Increased after Musical Interaction. Brain Sci 2022; 12:409. PMID: 35326366; PMCID: PMC8946180; DOI: 10.3390/brainsci12030409.
Abstract
Music’s deeply interpersonal nature suggests that music-derived neuroplasticity relates to interpersonal temporal dynamics, or synchrony. Interpersonal neural synchrony (INS) has been found to correlate with increased behavioral synchrony during social interactions and may represent mechanisms that support them. As social interactions often do not have clearly delineated boundaries, and many start and stop intermittently, we hypothesized that a neural signature of INS may be detectable following an interaction. The present study investigated this hypothesis using a pre-post paradigm, measuring interbrain phase coherence before and after a cooperative dyadic musical interaction. Ten dyads underwent synchronous electroencephalographic (EEG) recording during silent, non-interactive periods before and after a musical interaction in the form of a cooperative tapping game. Significant increases in delta band INS were found after the interaction and were positively correlated with the duration of the preceding interaction. These findings suggest a mechanism by which social interaction may be efficiently continued after interruption and hold potential for measuring neuroplastic adaptation in longitudinal studies. They also support the idea that INS during social interaction reflects active mechanisms for maintaining synchrony rather than mere parallel processing of stimuli and motor activity.

14. Huang Y, Zhong S, Zhan L, Sun M, Wu X. Sustained visual attention improves visuomotor timing. Psychol Res 2022; 86:2059-2066. PMID: 35048198; DOI: 10.1007/s00426-021-01629-9.
Abstract
Relative to audition, vision is considered much less trustworthy for sensorimotor timing, such as synchronizing finger movements with a temporally regular sequence. Visuomotor timing requires maintaining attention over time, yet sustained visual attention may not be well maintained in conventional visuomotor timing tasks, where flashing visual stimuli consist of a briefly presented flash followed by a long blank period. In the present study, the potential attentional lapses caused by the disappearance of the flash were carefully controlled, in Experiment 1 by changing the color of the flash instead of having it disappear, and in Experiment 2 by adding a continuously presented fixation point serving as an external attentional cue when the flash disappeared. Visuomotor timing performance improved in both experiments. The findings suggest a role of enhanced sustained visual attention in improving visuomotor timing, whereby vision can also be a trustworthy modality for processing temporal information in sensorimotor interactions.
Affiliation(s)
- Yingyu Huang: Department of Psychology, Sun Yat-Sen University, 132 Waihuan East Road, Higher Education Mega Center, Guangzhou, 510006, Guangdong, China
- Shengqi Zhong: Department of Psychology, Sun Yat-Sen University, 132 Waihuan East Road, Higher Education Mega Center, Guangzhou, 510006, Guangdong, China
- Liying Zhan: Department of Psychology, Sun Yat-Sen University, 132 Waihuan East Road, Higher Education Mega Center, Guangzhou, 510006, Guangdong, China
- Mi Sun: Department of Psychology, Sun Yat-Sen University, 132 Waihuan East Road, Higher Education Mega Center, Guangzhou, 510006, Guangdong, China
- Xiang Wu: Department of Psychology, Sun Yat-Sen University, 132 Waihuan East Road, Higher Education Mega Center, Guangzhou, 510006, Guangdong, China

15. Comstock DC, Ross JM, Balasubramaniam R. Modality-specific frequency band activity during neural entrainment to auditory and visual rhythms. Eur J Neurosci 2021; 54:4649-4669. PMID: 34008232; DOI: 10.1111/ejn.15314.
Abstract
Rhythm perception depends on the ability to predict the onset of rhythmic events. Previous studies indicate that beta band modulation is involved in predicting the onset of auditory rhythmic events (Fujioka et al., 2009, 2012; Snyder & Large, 2005). We sought to determine whether similar processes are recruited for the prediction of visual rhythms by investigating whether beta band activity plays a role in a modality-dependent manner for rhythm perception. We examined electroencephalography time-frequency neural correlates of prediction using an omission paradigm with auditory and visual rhythms. By using omissions, we can separate predictive timing activity from stimulus-driven activity. We hypothesized that there would be modality-independent markers of rhythm prediction in induced beta band oscillatory activity, and our results support this hypothesis. We find induced and evoked predictive timing in both auditory and visual modalities. Additionally, we performed an exploratory independent-components-based spatial clustering analysis and describe all resulting clusters. This analysis reveals that there may be overlapping networks of predictive beta activity based on common activation in the parietal and right frontal regions, auditory-specific predictive beta in bilateral sensorimotor regions, and visual-specific predictive beta in midline central and bilateral temporal/parietal regions. This analysis also shows evoked predictive beta activity in the left sensorimotor region specific to auditory rhythms and implicates modality-dependent networks for auditory and visual rhythm perception.
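Induced (non-phase-locked) beta activity of the kind analyzed here is typically obtained by subtracting the evoked average from each trial and then estimating time-frequency power, for example by convolution with a complex Morlet wavelet. The sketch below does this for synthetic single trials containing a 20 Hz burst with random phase; the sampling rate, wavelet width, and toy data are assumptions, not the authors' pipeline.

```python
import numpy as np

def morlet_power(x, fs, freq, n_cycles=7):
    """Power envelope at one frequency via a complex Morlet wavelet."""
    sigma_t = n_cycles / (2 * np.pi * freq)
    tw = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
    wavelet = np.exp(2j * np.pi * freq * tw) * np.exp(-tw ** 2 / (2 * sigma_t ** 2))
    wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))
    return np.abs(np.convolve(x, wavelet, mode="same")) ** 2

fs = 250
times = np.arange(-0.5, 1.0, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic trials: a 20 Hz burst around +200 ms with random phase per trial,
# so the burst is induced (it averages out of the evoked response).
trials = np.array([
    np.sin(2 * np.pi * 20 * times + rng.uniform(0, 2 * np.pi))
    * np.exp(-((times - 0.2) ** 2) / 0.02)
    + 0.3 * rng.standard_normal(times.size)
    for _ in range(50)
])

induced = trials - trials.mean(axis=0)          # remove the phase-locked part
beta_power = np.mean([morlet_power(tr, fs, 20) for tr in induced], axis=0)
print(f"peak induced 20 Hz power at t = {1000 * times[np.argmax(beta_power)]:.0f} ms")
```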
Affiliation(s)
- Daniel C Comstock: Cognitive and Information Sciences, University of California, Merced, CA, USA
- Jessica M Ross: Berenson-Allen Center for Noninvasive Brain Stimulation, Beth Israel Deaconess Medical Center, Boston, MA, USA; Department of Neurology, Harvard Medical School, Boston, MA, USA

16. Blais M, Jucla M, Maziero S, Albaret JM, Chaix Y, Tallet J. The Differential Effects of Auditory and Visual Stimuli on Learning, Retention and Reactivation of a Perceptual-Motor Temporal Sequence in Children With Developmental Coordination Disorder. Front Hum Neurosci 2021; 15:616795. PMID: 33867955; PMCID: PMC8044544; DOI: 10.3389/fnhum.2021.616795.
Abstract
This study investigates the procedural learning, retention, and reactivation of temporal sensorimotor sequences in children with and without developmental coordination disorder (DCD). Twenty typically-developing (TD) children and 12 children with DCD took part in this study. The children were required to tap on a keyboard, synchronizing with auditory or visual stimuli presented as an isochronous temporal sequence, and practice non-isochronous temporal sequences to memorize them. Immediate and delayed retention of the audio-motor and visuo-motor non-isochronous sequences were tested by removing auditory or visual stimuli immediately after practice and after a delay of 2 h. A reactivation test involved reintroducing the auditory and visual stimuli after the delayed recall. Data were computed via circular analyses to obtain asynchrony, the stability of synchronization and errors (i.e., the number of supplementary taps). Firstly, an overall deficit in synchronization with both auditory and visual isochronous stimuli was observed in DCD children compared to TD children. During practice, further improvements (decrease in asynchrony and increase in stability) were found for the audio-motor non-isochronous sequence compared to the visuo-motor non-isochronous sequence in both TD children and children with DCD. However, a drastic increase in errors occurred in children with DCD during immediate retention as soon as the auditory stimuli were removed. Reintroducing auditory stimuli decreased errors in the audio-motor sequence for children with DCD. Such changes were not seen for the visuo-motor non-isochronous sequence, which was equally learned, retained and reactivated in DCD and TD children. All these results suggest that TD children benefit from both auditory and visual stimuli to memorize the sequence, whereas children with DCD seem to present a deficit in integrating an audio-motor sequence in their memory. The immediate effect of reactivation suggests a specific dependency on auditory information in DCD. Contrary to the audio-motor sequence, the visuo-motor sequence was both learned and retained in children with DCD. This suggests that visual stimuli could be the best information for memorizing a temporal sequence in DCD. All these results are discussed in terms of a specific audio-motor coupling deficit in DCD.
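The circular analyses mentioned here map each tap's asynchrony onto a phase angle on the stimulus cycle; the mean direction then gives the signed asynchrony, and the resultant vector length gives the stability of synchronization. A minimal sketch on toy tapping data follows; the nearest-stimulus pairing and the toy parameters are simplifying assumptions.

```python
import numpy as np

def circular_sync_measures(tap_times, stim_times, period):
    """Circular asynchrony (mean direction) and synchronization stability
    (resultant vector length R) of taps relative to an isochronous sequence."""
    asyn = np.array([tap - stim_times[np.argmin(np.abs(stim_times - tap))]
                     for tap in tap_times])          # signed nearest-stimulus asynchrony
    phases = 2 * np.pi * asyn / period
    mean_vector = np.mean(np.exp(1j * phases))
    mean_asynchrony_ms = np.angle(mean_vector) / (2 * np.pi) * period * 1000
    stability = np.abs(mean_vector)                  # R in [0, 1]; 1 = perfectly stable
    return mean_asynchrony_ms, stability

# Toy data: taps anticipating a 500 ms metronome by ~30 ms with 20 ms jitter.
rng = np.random.default_rng(3)
stim = np.arange(0, 30, 0.5)
taps = stim - 0.03 + 0.02 * rng.standard_normal(stim.size)
asyn_ms, R = circular_sync_measures(taps, stim, period=0.5)
print(f"mean asynchrony = {asyn_ms:.1f} ms, stability R = {R:.2f}")
```

Supplementary taps (the "errors" counted in the study) would show up here as extra tap times without a dedicated stimulus and are not modeled in this sketch.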
Affiliation(s)
- Mélody Blais: Toulouse NeuroImaging Center, Université de Toulouse, Inserm, UPS, Toulouse, France; EuroMov Digital Health in Motion, Univ Montpellier, IMT Mines Ales, Montpellier, France
- Mélanie Jucla: Octogone-Lordat, University of Toulouse, Toulouse, France
- Stéphanie Maziero: Toulouse NeuroImaging Center, Université de Toulouse, Inserm, UPS, Toulouse, France; Octogone-Lordat, University of Toulouse, Toulouse, France
- Jean-Michel Albaret: Toulouse NeuroImaging Center, Université de Toulouse, Inserm, UPS, Toulouse, France
- Yves Chaix: Toulouse NeuroImaging Center, Université de Toulouse, Inserm, UPS, Toulouse, France; Hôpital des Enfants, Centre Hospitalier Universitaire de Toulouse, CHU Purpan, Toulouse, France
- Jessica Tallet: Toulouse NeuroImaging Center, Université de Toulouse, Inserm, UPS, Toulouse, France

17. Braun Janzen T, Schaffert N, Schlüter S, Ploigt R, Thaut MH. The effect of perceptual-motor continuity compatibility on the temporal control of continuous and discontinuous self-paced rhythmic movements. Hum Mov Sci 2021; 76:102761. PMID: 33485154; DOI: 10.1016/j.humov.2021.102761.
Abstract
One of the questions yet to be fully understood is to what extent the properties of the sensory and the movement information interact to facilitate sensorimotor integration. In this study, we examined the relative contribution of the continuity compatibility between motor goals and their sensory outcomes in timing variability. The variability of inter-response intervals was measured in a synchronization-continuation paradigm. Participants performed two repetitive movement tasks whereby they drew circles either using continuous or discontinuous self-paced movements while receiving discrete or continuous auditory feedback. The results demonstrated that the effect of perceptual-motor continuity compatibility may be limited in self-paced auditory-motor synchronization as timing variability was not significantly influenced by the continuity of the feedback or the continuity compatibility between feedback and the movement produced. In addition, results suggested that the presence of salient perceptual events marking the completion of the time intervals elicited a common timing process in both continuous and discontinuous circle drawing, regardless of the continuity of the auditory feedback. These findings open a new line of investigation into the role of the discriminability and reliability of the event-based information in determining the nature of the timing mechanisms engaged in continuous and discontinuous self-paced rhythmic movements.
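The study quantifies the variability of inter-response intervals in a synchronization-continuation paradigm. A classic companion analysis for the continuation phase, not necessarily the one used by these authors, is the Wing-Kristofferson decomposition, which splits interval variance into central timekeeper and motor implementation components using the lag-1 autocovariance. A sketch on simulated data, with all parameter values chosen arbitrarily:

```python
import numpy as np

def wing_kristofferson(iri):
    """Two-level decomposition of continuation-tapping variability:
    var(I) = clock variance + 2 * motor variance, and the lag-1 autocovariance
    of the intervals equals minus the motor variance (Wing & Kristofferson, 1973)."""
    d = np.asarray(iri, float) - np.mean(iri)
    total_var = d.var(ddof=1)
    lag1_acov = np.sum(d[:-1] * d[1:]) / (len(d) - 1)
    motor_var = max(-lag1_acov, 0.0)          # clip a small positive estimate to 0
    clock_var = total_var - 2.0 * motor_var
    return total_var, clock_var, motor_var

# Toy continuation data generated from the same two-level model (units: ms).
rng = np.random.default_rng(7)
n, target = 200, 500.0
clock = target + 15.0 * rng.standard_normal(n)       # central timekeeper intervals
motor = 8.0 * rng.standard_normal(n + 1)             # motor implementation delays
iri = clock + np.diff(motor)                         # I_n = C_n + M_{n+1} - M_n
total_var, clock_var, motor_var = wing_kristofferson(iri)
print(f"total {total_var:.0f}, clock {clock_var:.0f} (true 225), motor {motor_var:.0f} (true 64)")
```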
Affiliation(s)
- Thenille Braun Janzen: Center for Mathematics, Computing and Cognition, Universidade Federal do ABC, Sao Bernardo do Campo, Brazil
- Nina Schaffert: Department of Movement and Training Science, Institute for Human Movement Science, University of Hamburg, Hamburg, Germany; BeSB GmbH Berlin, Sound Engineering, Berlin, Germany
- Roy Ploigt: BeSB GmbH Berlin, Sound Engineering, Berlin, Germany
- Michael H Thaut: Music and Health Science Research Collaboratory, Faculty of Music, University of Toronto, Toronto, Canada

18. Proksch S, Comstock DC, Médé B, Pabst A, Balasubramaniam R. Motor and Predictive Processes in Auditory Beat and Rhythm Perception. Front Hum Neurosci 2020; 14:578546. PMID: 33061902; PMCID: PMC7518112; DOI: 10.3389/fnhum.2020.578546.
Abstract
In this article, we review recent advances in research on rhythm and musical beat perception, focusing on the role of predictive processes in auditory-motor interactions. We suggest that experimental evidence of the motor system's role in beat perception, including during passive listening, may be explained by the generation and maintenance of internal predictive models, concordant with the Active Inference framework of sensory processing. We highlight two complementary hypotheses for the neural underpinnings of rhythm perception: the Action Simulation for Auditory Prediction (ASAP) hypothesis (Patel and Iversen, 2014) and the Gradual Audiomotor Evolution (GAE) hypothesis (Merchant and Honing, 2014), and review recent experimental progress supporting each of these hypotheses. While the initial formulations of ASAP and GAE explain different aspects of beat-based timing (the involvement of motor structures in the absence of movement, and physical entrainment to an auditory beat, respectively), we suggest that work under both hypotheses provides converging evidence toward understanding the predictive role of the motor system in the perception of rhythm, and the specific neural mechanisms involved. We discuss future experimental work necessary to further evaluate the causal neural mechanisms underlying beat and rhythm perception.
Affiliation(s)
- Shannon Proksch: Sensorimotor Neuroscience Laboratory, Cognitive & Information Sciences, University of California, Merced, Merced, CA, United States
- Daniel C Comstock: Sensorimotor Neuroscience Laboratory, Cognitive & Information Sciences, University of California, Merced, Merced, CA, United States
- Butovens Médé: Sensorimotor Neuroscience Laboratory, Cognitive & Information Sciences, University of California, Merced, Merced, CA, United States
- Alexandria Pabst: Sensorimotor Neuroscience Laboratory, Cognitive & Information Sciences, University of California, Merced, Merced, CA, United States
- Ramesh Balasubramaniam: Sensorimotor Neuroscience Laboratory, Cognitive & Information Sciences, University of California, Merced, Merced, CA, United States

19. Graber E, Fujioka T. Induced Beta Power Modulations during Isochronous Auditory Beats Reflect Intentional Anticipation before Gradual Tempo Changes. Sci Rep 2020; 10:4207. PMID: 32144306; PMCID: PMC7060226; DOI: 10.1038/s41598-020-61044-9.
Abstract
Induced beta-band power modulations in auditory and motor-related brain areas have been associated with automatic temporal processing of isochronous beats and explicit, temporally-oriented attention. Here, we investigated how explicit top-down anticipation before upcoming tempo changes, a sustained process commonly required during music performance, changed beta power modulations during listening to isochronous beats. Musicians’ electroencephalograms were recorded during the task of anticipating accelerating, decelerating, or steady beats after direction-specific visual cues. In separate behavioural testing for tempo-change onset detection, such cues were found to facilitate faster responses, thus effectively inducing high-level anticipation. In the electroencephalograms, periodic beta power reductions in a frontocentral topographic component with seed-based source contributions from auditory and sensorimotor cortices were apparent after isochronous beats with anticipation in all conditions, generally replicating patterns found previously during passive listening to isochronous beats. With anticipation before accelerations, the magnitude of the power reduction was significantly weaker than in the steady condition. Between the accelerating and decelerating conditions, no differences were found, suggesting that the observed beta patterns may represent an aspect of high-level anticipation common before both tempo changes, like increased attention. Overall, these results indicate that top-down anticipation influences ongoing auditory beat processing in beta-band networks.
Affiliation(s)
- Emily Graber: Center for Computer Research in Music and Acoustics, Stanford University, Stanford, CA, 94305, USA
- Takako Fujioka: Center for Computer Research in Music and Acoustics, Stanford University, Stanford, CA, 94305, USA; Wu Tsai Neurosciences Institute, Stanford University, Stanford, CA, 94305, USA

20. González CR, Bavassi ML, Laje R. Response to perturbations as a built-in feature in a mathematical model for paced finger tapping. Phys Rev E 2020; 100:062412. PMID: 31962404; DOI: 10.1103/physreve.100.062412.
Abstract
Paced finger tapping is one of the simplest tasks to study sensorimotor synchronization. The subject is instructed to tap in synchrony with a periodic sequence of brief tones, and the time difference (called asynchrony) between each response and the corresponding stimulus is recorded. Despite its simplicity, this task helps to unveil interesting features of the underlying neural system and the error-correction mechanism responsible for synchronization. Perturbation experiments are usually performed to probe the subject's response, for example, in the form of a "step change," i.e., an unexpected change in tempo. The asynchrony is the usual observable in such experiments and it is chosen as the main variable in many mathematical models that attempt to describe the phenomenon. In this work we show that although asynchrony can be perfectly described in operational terms, it is not well defined as a model variable when tempo perturbations are considered. We introduce an alternative variable and a mathematical model that intrinsically takes into account the perturbation and make theoretical predictions about the response to novel perturbations based on the geometrical organization of the trajectories in phase space. Our proposal is relevant to understand interpersonal synchronization and the synchronization to nonperiodic stimuli.
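For context, the standard baseline against which such models are usually compared is the first-order linear phase-correction model, in which a fixed fraction of each asynchrony is corrected on the next tap. The sketch below simulates that baseline through a step change in tempo; it is not the model introduced in this paper, and the parameter values are arbitrary. With phase correction alone, the asynchrony jumps at the step and then relaxes exponentially toward a new steady-state offset of (T - S)/alpha; period correction (not modeled here) would be needed to return it to baseline.

```python
import numpy as np

def simulate_step_change(alpha, n_taps=60, period=500.0, step_at=30,
                         new_period=450.0, noise_sd=10.0, seed=0):
    """First-order linear phase correction:
    e[n+1] = (1 - alpha) * e[n] + (T - S[n]) + noise,
    with timekeeper period T fixed and stimulus period S changing at `step_at`."""
    rng = np.random.default_rng(seed)
    T = period
    e = np.zeros(n_taps)                    # asynchronies in ms
    for n in range(n_taps - 1):
        S = period if n < step_at else new_period
        e[n + 1] = (1 - alpha) * e[n] + (T - S) + noise_sd * rng.standard_normal()
    return e

asyn = simulate_step_change(alpha=0.6)
print("asynchronies around the tempo step (ms):", np.round(asyn[28:36], 1))
```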
Affiliation(s)
- Claudia R González: Universidad Nacional de Quilmes, Departamento de Ciencia y Tecnología, Laboratorio de Dinámica Sensomotora, Bernal, Argentina
- M Luz Bavassi: Universidad de Buenos Aires, Instituto de Fisiología, Biología Molecular y Neurociencias (IFIByNE), Buenos Aires, Argentina; CONICET, Buenos Aires, Argentina
- Rodrigo Laje: Universidad Nacional de Quilmes, Departamento de Ciencia y Tecnología, Laboratorio de Dinámica Sensomotora, Bernal, Argentina; CONICET, Buenos Aires, Argentina

21. Sasaki M, Iversen J, Callan DE. Music Improvisation Is Characterized by Increase EEG Spectral Power in Prefrontal and Perceptual Motor Cortical Sources and Can be Reliably Classified From Non-improvisatory Performance. Front Hum Neurosci 2019; 13:435. PMID: 31920594; PMCID: PMC6915035; DOI: 10.3389/fnhum.2019.00435.
Abstract
This study explores neural activity underlying creative processes through the investigation of music improvisation. Fourteen guitar players with a high level of improvisation skill participated in the experiment. The experimental task involved playing 32-s alternating blocks of improvisation and scales on guitar. Electroencephalography (EEG) data were measured continuously throughout the experiment. In order to remove potential artifacts and extract brain-related activity, the following signal processing techniques were employed: bandpass filtering, Artifact Subspace Reconstruction, and Independent Component Analysis (ICA). For each participant, artifact-related independent components (ICs) were removed from the EEG data and only ICs found to reflect brain activity were retained. Source localization of this brain-related activity was carried out using sLORETA. Greater activity for improvisation than for scales was found in multiple frequency bands (theta, alpha, and beta), localized primarily in the medial frontal cortex (MFC), middle frontal gyrus (MFG), anterior cingulate, polar medial prefrontal cortex (MPFC), premotor cortex (PMC), pre- and postcentral gyri (PreCG and PostCG), superior temporal gyrus (STG), inferior parietal lobule (IPL), and the temporal-parietal junction. Together, this collection of brain regions suggests that improvisation was mediated by processes involved in coordinating planned sequences of movement that are modulated in response to the ongoing environmental context through monitoring and feedback of sensory states in relation to internal plans and goals. Machine learning using Common Spatial Patterns (CSP) for EEG feature extraction attained a mean classification performance of over 75% for improvisation versus scale conditions across participants. These machine-learning results are a step towards the development of a brain-computer interface that could be used for neurofeedback training to improve creativity.
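Common Spatial Patterns, the feature-extraction step mentioned above, finds spatial filters that maximize the ratio of band-power between two classes by solving a generalized eigenvalue problem on the class covariance matrices. The sketch below implements the textbook version on synthetic trials; it is not the authors' exact pipeline, and the channel counts, class structure, and number of components are illustrative.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_components=4):
    """Common Spatial Patterns: spatial filters maximizing the variance ratio
    between two classes of trials shaped (n_trials, n_channels, n_samples)."""
    cov_a = np.mean([np.cov(tr) for tr in trials_a], axis=0)
    cov_b = np.mean([np.cov(tr) for tr in trials_b], axis=0)
    vals, vecs = eigh(cov_a, cov_a + cov_b)       # generalized eigenvalue problem
    pick = np.r_[np.arange(n_components // 2),    # smallest eigenvalues ...
                 np.arange(len(vals) - n_components // 2, len(vals))]  # ... and largest
    return vecs[:, pick].T                        # (n_components, n_channels)

def csp_features(trials, filters):
    """Log-variance of spatially filtered trials (the usual CSP feature)."""
    return np.array([np.log(np.var(filters @ tr, axis=1)) for tr in trials])

# Synthetic example: class A has extra power on channel 0, class B on channel 5.
rng = np.random.default_rng(0)
def make_trials(boost_channel, n_trials=40, n_channels=8, n_samples=500):
    x = rng.standard_normal((n_trials, n_channels, n_samples))
    x[:, boost_channel, :] *= 3.0
    return x

A, B = make_trials(0), make_trials(5)
W = csp_filters(A, B)
print("mean CSP features, class A:", np.round(csp_features(A, W).mean(axis=0), 2))
print("mean CSP features, class B:", np.round(csp_features(B, W).mean(axis=0), 2))
```

The log-variance features from filters at the two ends of the eigenvalue spectrum separate the classes and would typically be fed into a simple classifier.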
Affiliation(s)
- Masaru Sasaki: Graduate School of Frontier Biosciences, Osaka University, Osaka, Japan
- John Iversen: Swartz Center for Computational Neuroscience, University of California, San Diego, San Diego, CA, United States
- Daniel E Callan: Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology (NICT), Osaka University, Osaka, Japan

22. Shared neural resources of rhythm and syntax: An ALE meta-analysis. Neuropsychologia 2019; 137:107284. PMID: 31783081; DOI: 10.1016/j.neuropsychologia.2019.107284.
Abstract
A growing body of evidence has highlighted behavioral connections between musical rhythm and linguistic syntax, suggesting that these abilities may be mediated by common neural resources. Here, we performed a quantitative meta-analysis of neuroimaging studies using activation likelihood estimation (ALE) to localize the shared neural structures engaged in a representative set of musical rhythm (rhythm, beat, and meter) and linguistic syntax (merge, movement, and reanalysis) operations. Rhythm engaged a bilateral sensorimotor network throughout the brain consisting of the inferior frontal gyri, supplementary motor area, superior temporal gyri/temporoparietal junction, insula, intraparietal lobule, and putamen. By contrast, syntax mostly recruited a left sensorimotor network including the inferior frontal gyrus, posterior superior temporal gyrus, premotor cortex, and supplementary motor area. Intersections between the rhythm and syntax maps yielded overlapping regions in the left inferior frontal gyrus, left supplementary motor area, and bilateral insula: neural substrates involved in temporal hierarchy processing and predictive coding. Together, this is the first neuroimaging meta-analysis providing detailed anatomical overlap of sensorimotor regions recruited for musical rhythm and linguistic syntax.
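Activation likelihood estimation, the method used in this meta-analysis, models each study's reported foci as Gaussian probability blobs, combines them into a per-study modeled-activation (MA) map, and unites studies voxelwise as ALE = 1 - prod(1 - MA). The 1D toy below shows only that core computation; real analyses are three-dimensional, use sample-size-dependent smoothing, and test significance against a null distribution.

```python
import numpy as np

# Toy 1D activation likelihood estimation: each study's foci are smoothed into a
# modeled-activation (MA) map and studies are combined as ALE = 1 - prod(1 - MA).
x = np.arange(0, 100)                              # 1D "brain" coordinate
study_foci = [[20, 55], [22, 80], [21, 54, 81]]    # peak coordinates per study (toy)
fwhm = 10.0
sigma = fwhm / (2 * np.sqrt(2 * np.log(2)))

def ma_map(foci):
    """Modeled activation: per-focus Gaussian densities combined with a voxelwise max."""
    blobs = [np.exp(-(x - f) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
             for f in foci]
    return np.max(blobs, axis=0)

ma = np.array([ma_map(foci) for foci in study_foci])
ale = 1.0 - np.prod(1.0 - ma, axis=0)
print(f"peak ALE value {ale.max():.2f} at coordinate {x[np.argmax(ale)]}")
```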
Collapse
|
23
|
Rankin J, Rinzel J. Computational models of auditory perception from feature extraction to stream segregation and behavior. Curr Opin Neurobiol 2019; 58:46-53. [PMID: 31326723 DOI: 10.1016/j.conb.2019.06.009] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/12/2019] [Accepted: 06/22/2019] [Indexed: 10/26/2022]
Abstract
Audition is by nature dynamic, from brainstem processing on sub-millisecond time scales, to segregating and tracking sound sources with changing features, to the pleasure of listening to music and the satisfaction of getting the beat. We review recent advances from computational models of sound localization, of auditory stream segregation and of beat perception/generation. A wealth of behavioral, electrophysiological and imaging studies shed light on these processes, typically with synthesized sounds having regular temporal structure. Computational models integrate knowledge from different experimental fields and at different levels of description. We advocate a neuromechanistic modeling approach that incorporates knowledge of the auditory system from various fields, that utilizes plausible neural mechanisms, and that bridges our understanding across disciplines.
Collapse
Affiliation(s)
- James Rankin
- College of Engineering, Mathematics and Physical Sciences, University of Exeter, Harrison Building, North Park Rd, Exeter EX4 4QF, UK.
| | - John Rinzel
- Center for Neural Science, New York University, 4 Washington Place, 10003 New York, NY, United States; Courant Institute of Mathematical Sciences, New York University, 251 Mercer St, 10012 New York, NY, United States
| |
Collapse
|
24
|
Yang J, Ouyang F, Holm L, Huang Y, Gan L, Zhou L, Chao H, Wang M, He M, Zhang S, Yang B, Wu X. A mechanism of timing variability underlying the association between the mean and SD of asynchrony. Hum Mov Sci 2019; 67:102500. [PMID: 31326744 DOI: 10.1016/j.humov.2019.102500] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/15/2018] [Revised: 06/17/2019] [Accepted: 07/09/2019] [Indexed: 11/28/2022]
Abstract
Sensorimotor timing behaviors typically exhibit an elusive phenomenon known as negative asynchrony. When synchronizing movements (e.g., finger taps) with an external sequence (e.g., a metronome), people's taps precede event onsets by a few tens of milliseconds. We recently reported that asynchrony is less negative in participants with lower asynchrony variability, indicating an association between negative asynchrony and timing variability. Here, in 24 metronome-synchronization data sets, we modeled asynchrony series using a sensorimotor synchronization model that accounts for serial dependence of asynchronies. The results showed that the model captured well the negative correlation between the mean and SD of asynchrony. The findings suggest that serial dependence in asynchronies is an essential mechanism of timing variability underlying the association between the mean and SD of asynchrony.
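To make the mechanism concrete, here is a minimal simulation of a generic linear phase-correction model with serial dependence of asynchronies. It is not the specific model fitted in the paper, and all parameters are invented; varying the correction gain alone is enough to move the mean and SD of asynchrony together.

```python
# Minimal simulation (invented parameters) of a generic linear phase-correction model:
#   A[n+1] = (1 - alpha) * A[n] + T[n] - S
# where A is the tap-to-tone asynchrony, alpha the phase-correction gain,
# T[n] a noisy internal timekeeper interval, and S the metronome interval.
# A single gain shapes both the mean and the variability of asynchrony.
import numpy as np

def simulate_asynchronies(alpha, n_taps=2000, ioi=600.0,
                          timekeeper_mean=595.0, timekeeper_sd=10.0, seed=0):
    rng = np.random.default_rng(seed)
    asyn = np.zeros(n_taps)
    for n in range(n_taps - 1):
        timekeeper = rng.normal(timekeeper_mean, timekeeper_sd)
        asyn[n + 1] = (1 - alpha) * asyn[n] + timekeeper - ioi
    return asyn[200:]                      # drop the initial transient

for alpha in (0.2, 0.5, 0.8):
    a = simulate_asynchronies(alpha)
    print(f"alpha={alpha:.1f}  mean asynchrony={a.mean():7.2f} ms  SD={a.std():6.2f} ms")
```

In this toy model, lower correction gains produce both a more negative mean asynchrony and a larger SD, which is one way serial dependence can couple the two statistics.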
Collapse
Affiliation(s)
- Junkai Yang
- Department of Psychology, Sun Yat-Sen University, China; Laboratory for Behavioral and Regional Finance, Guangdong University of Finance, Guangzhou, China
| | - Feiyi Ouyang
- Department of Psychology, Sun Yat-Sen University, China
| | - Linus Holm
- Department of Psychology, Umeå University, Sweden.
| | - Yingyu Huang
- Department of Psychology, Sun Yat-Sen University, China
| | - Lingyu Gan
- Department of Psychology, Sun Yat-Sen University, China
| | - Liang Zhou
- Department of Psychology, Sun Yat-Sen University, China; School of Psychology, Shandong Normal University, Jinan, China
| | - Huizhen Chao
- Department of Psychology, Sun Yat-Sen University, China
| | - Mengye Wang
- Department of Psychology, Sun Yat-Sen University, China
| | - Mengxue He
- Department of Psychology, Sun Yat-Sen University, China
| | - Sheng Zhang
- Department of Physiology, Anhui Medical College, China
| | - Bo Yang
- Department of Neurology, First Affiliated Hospital of Anhui University of Traditional Chinese Medicine, Hefei, China
| | - Xiang Wu
- Department of Psychology, Sun Yat-Sen University, China.
| |
Collapse
|
25
|
Gu L, Huang Y, Wu X. Advantage of audition over vision in a perceptual timing task but not in a sensorimotor timing task. PSYCHOLOGICAL RESEARCH 2019; 84:2046-2056. [PMID: 31190091 DOI: 10.1007/s00426-019-01204-3] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2019] [Accepted: 05/24/2019] [Indexed: 12/28/2022]
Abstract
Timing is essential for various behaviors, and relative to vision, audition is considered to be specialized for temporal processing. The present study conducted a sensorimotor timing task that required tapping in synchrony with a temporally regular sequence and a perceptual timing task that required detecting a timing deviation within a temporally regular sequence. The sequence was composed of auditory tones, visual flashes, or a visual bouncing ball. In the sensorimotor task, timing performance (synchronization stability) for the bouncing ball was much greater than for flashes and was comparable to that for tones. In the perceptual task, although perceptual timing performance for the bouncing ball was greater than for flashes, it was poorer than for tones. These results suggest that the bouncing ball facilitates both perceptual and sensorimotor processing of temporal information. Even given such facilitation, however, audition remains superior to vision in the perceptual detection of timing.
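One common way to quantify synchronization stability in such tapping tasks, not necessarily the exact measure used in this study, is the resultant vector length of tap phases relative to the pacing sequence. A short sketch with fabricated tap times follows.

```python
# Sketch of synchronization stability as the resultant vector length of relative
# tap phases: 1.0 means perfectly consistent timing relative to the pacing
# sequence, 0.0 means taps spread uniformly over the cycle. Tap and tone times
# below are fabricated for illustration.
import numpy as np

def synchronization_stability(tap_times, tone_times, ioi):
    """Resultant vector length R of tap phases relative to the nearest tone."""
    # asynchrony of each tap relative to its nearest pacing tone
    asyn = np.array([t - tone_times[np.argmin(np.abs(tone_times - t))] for t in tap_times])
    phases = 2 * np.pi * asyn / ioi                       # map asynchronies onto the circle
    return float(np.abs(np.mean(np.exp(1j * phases))))    # circular R (vector strength)

ioi = 0.6                                                 # 600-ms inter-onset interval
tone_times = np.arange(0, 30, ioi)
rng = np.random.default_rng(1)
tap_times = tone_times - 0.04 + rng.normal(0, 0.02, tone_times.size)  # taps slightly early

print(f"Synchronization stability R = {synchronization_stability(tap_times, tone_times, ioi):.3f}")
```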
Collapse
Affiliation(s)
- Li Gu
- State Key Laboratory of Ophthalmology, Guangdong Provincial Key Lab of Ophthalmology and Visual Science, Zhongshan Ophthalmic Center, Sun Yat-Sen University, Guangzhou, China; Department of Psychology, Sun Yat-Sen University, 132 Waihuan East Road, Higher Education Mega Center, Guangzhou, 510006, Guangdong, China
| | - Yingyu Huang
- Department of Psychology, Sun Yat-Sen University, 132 Waihuan East Road, Higher Education Mega Center, Guangzhou, 510006, Guangdong, China
| | - Xiang Wu
- Department of Psychology, Sun Yat-Sen University, 132 Waihuan East Road, Higher Education Mega Center, Guangzhou, 510006, Guangdong, China.
| |
Collapse
|
26
|
|
27
|
Bose A, Byrne Á, Rinzel J. A neuromechanistic model for rhythmic beat generation. PLoS Comput Biol 2019; 15:e1006450. [PMID: 31071078 PMCID: PMC6508617 DOI: 10.1371/journal.pcbi.1006450] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/16/2018] [Accepted: 03/01/2019] [Indexed: 11/18/2022] Open
Abstract
When listening to music, humans can easily identify and move to the beat. Numerous experimental studies have identified brain regions that may be involved with beat perception and representation. Several theoretical and algorithmic approaches have been proposed to account for this ability. Related to, but different from the issue of how we perceive a beat, is the question of how we learn to generate and hold a beat. In this paper, we introduce a neuronal framework for a beat generator that is capable of learning isochronous rhythms over a range of frequencies that are relevant to music and speech. Our approach combines ideas from error-correction and entrainment models to investigate the dynamics of how a biophysically-based neuronal network model synchronizes its period and phase to match that of an external stimulus. The model makes novel use of on-going faster gamma rhythms to form a set of discrete clocks that provide estimates, but not exact information, of how well the beat generator spike times match those of a stimulus sequence. The beat generator is endowed with plasticity allowing it to quickly learn and thereby adjust its spike times to achieve synchronization. Our model makes generalizable predictions about the existence of asymmetries in the synchronization process, as well as specific predictions about resynchronization times after changes in stimulus tempo or phase. Analysis of the model demonstrates that accurate rhythmic time keeping can be achieved over a range of frequencies relevant to music, in a manner that is robust to changes in parameters and to the presence of noise.
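The following toy error-correction beat generator is far simpler than the biophysically based spiking model described above and uses invented parameters; it is included only to illustrate the basic idea of jointly adapting phase and period to an isochronous stimulus and resynchronizing after a tempo change.

```python
# Toy discrete-time beat generator (not the paper's model) that adjusts its
# phase and period toward an isochronous stimulus via two error-correction gains.
import numpy as np

def run_beat_generator(stimulus_onsets, alpha_phase=0.6, beta_period=0.3, period0=0.5):
    period = period0
    beat = stimulus_onsets[0] + 0.05          # start slightly out of phase
    beats = [beat]
    for onset in stimulus_onsets[1:]:
        beat += period                        # predict the next beat
        error = beat - onset                  # signed timing error vs. the stimulus
        beat -= alpha_phase * error           # phase correction
        period -= beta_period * error         # period (tempo) correction
        beats.append(beat)
    return np.array(beats)

# Stimulus: tones at a 600-ms IOI, then a tempo change to a 450-ms IOI.
onsets = np.concatenate([np.arange(0, 12, 0.6), 12 + np.arange(0, 9, 0.45)])
beats = run_beat_generator(onsets)
asynchronies_ms = 1000 * (beats - onsets)
print("Asynchronies around the tempo change (ms):",
      np.round(asynchronies_ms[18:26], 1))
```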
Collapse
Affiliation(s)
- Amitabha Bose
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, New Jersey, United States of America
| | - Áine Byrne
- Center for Neural Science, New York University, New York, New York, United States of America
| | - John Rinzel
- Center for Neural Science, New York University, New York, New York, United States of America
- Courant Institute of Mathematical Sciences, New York University, New York, New York, United States of America
| |
Collapse
|
28
|
Braun Janzen T, Haase M, Thaut MH. Rhythmic priming across effector systems: A randomized controlled trial with Parkinson’s disease patients. Hum Mov Sci 2019; 64:355-365. [DOI: 10.1016/j.humov.2019.03.001] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/28/2018] [Revised: 02/21/2019] [Accepted: 03/01/2019] [Indexed: 01/23/2023]
|
29
|
Comstock DC, Hove MJ, Balasubramaniam R. Sensorimotor Synchronization With Auditory and Visual Modalities: Behavioral and Neural Differences. Front Comput Neurosci 2018; 12:53. [PMID: 30072885 PMCID: PMC6058047 DOI: 10.3389/fncom.2018.00053] [Citation(s) in RCA: 39] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2018] [Accepted: 06/19/2018] [Indexed: 11/13/2022] Open
Abstract
It has long been known that the auditory system is better suited than the visual system to guide temporally precise behaviors like sensorimotor synchronization (SMS). Although this phenomenon has been studied for many years, the underlying neural and computational mechanisms remain unclear. Growing consensus suggests the existence of multiple, interacting, context-dependent systems, and that reduced precision in visuo-motor timing might be due to the way experimental tasks have been conceived. Indeed, the appropriateness of the stimulus for a given task greatly influences timing performance. In this review, we examine timing differences for sensorimotor synchronization and error correction with auditory and visual sequences, to inspect the underlying neural mechanisms that contribute to modality differences in timing. The disparity between auditory and visual timing likely relates to differences in processing specialization between the auditory and visual modalities (temporal vs. spatial). We propose that this difference offers a potential explanation for the differing temporal abilities of the two modalities. We also offer suggestions as to how these sensory systems interface with motor and timing systems.
Collapse
Affiliation(s)
- Daniel C Comstock
- Cognitive and Information Sciences, University of California, Merced, Merced, CA, United States
| | - Michael J Hove
- Department of Psychological Science, Fitchburg State University, Fitchburg, MA, United States
| | - Ramesh Balasubramaniam
- Cognitive and Information Sciences, University of California, Merced, Merced, CA, United States
| |
Collapse
|
30
|
Castro-Meneses LJ, Sowman PF. Stop signals delay synchrony more for finger tapping than vocalization: a dual modality study of rhythmic synchronization in the stop signal task. PeerJ 2018; 6:e5242. [PMID: 30013856 PMCID: PMC6046193 DOI: 10.7717/peerj.5242] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/12/2017] [Accepted: 06/26/2018] [Indexed: 11/20/2022] Open
Abstract
BACKGROUND: A robust feature of sensorimotor synchronization (SMS) performance in finger tapping to an auditory pacing signal is the negative asynchrony of the tap with respect to the pacing signal. The Paillard-Fraisse hypothesis suggests that negative asynchrony is a result of inter-modal integration, in which the brain compares sensory information across two modalities (auditory and tactile). The current study compared the asynchronies of vocalizations and finger tapping in time to an auditory pacing signal. Our first hypothesis was that vocalizations have less negative asynchrony than finger tapping because they require sensory integration within only a single (auditory) modality (intra-modal integration). However, because vocalizations and finger responses are measured differently, interpreting the comparison between these two response modalities is problematic. To address this problem, we included stop signals in the synchronization task, reasoning that stop signals would perturb synchronization more in the inter-modal than in the intra-modal task. We hypothesized that the inclusion of stop signals induces proactive inhibition, which reduces negative asynchrony, and that any such reduction occurs to a lesser degree for vocalization than for finger tapping.
METHOD: A total of 30 participants took part in this study. We compared SMS in a single sensory modality (vocalizations (or auditory) to auditory pacing signal) with SMS in a dual sensory modality (fingers (or tactile) to auditory pacing signal). The task was combined with a stop signal task in which stop signals were relevant in some blocks and irrelevant in others. Response-to-pacing-signal asynchronies and stop signal reaction times were compared across modalities and across the two types of stop signal blocks.
RESULTS: In the blocks where stopping was irrelevant, vocalization (-61.47 ms) was more synchronous with the auditory pacing signal than finger tapping (-128.29 ms). In the blocks where stopping was relevant, stop signals induced proactive inhibition, shifting response times later. However, proactive inhibition was less evident for vocalizations (26.11 ms) than for finger tapping (58.06 ms).
DISCUSSION: These results support the interpretation that the relatively large negative asynchrony in finger tapping is a consequence of inter-modal integration, whereas the smaller asynchrony is associated with intra-modal integration. This study also supports the interpretation that intra-modal integration is more sensitive to synchronization discrepancies than inter-modal integration.
Collapse
Affiliation(s)
- Leidy J. Castro-Meneses
- Perception in Action Research Centre (PARC), Department of Cognitive Science, Macquarie University, North Ryde, NSW, Australia
- Australian Research Council Centre of Excellence in Cognition and its Disorders (CCD), Macquarie University, North Ryde, NSW, Australia
- The MARCS Institute for Brain, Behaviour and Development, University of Western Sydney, Bankstown, NSW, Australia
| | - Paul F. Sowman
- Perception in Action Research Centre (PARC), Department of Cognitive Science, Macquarie University, North Ryde, NSW, Australia
- Australian Research Council Centre of Excellence in Cognition and its Disorders (CCD), Macquarie University, North Ryde, NSW, Australia
| |
Collapse
|
31
|
Yang J, Ouyang F, Holm L, Huang Y, Gan L, Zhou L, Chao H, Wang M, He M, Zhang S, Yang B, Pan J, Wu X. Tapping ahead of time: its association with timing variability. PSYCHOLOGICAL RESEARCH 2018; 84:343-351. [PMID: 29955958 DOI: 10.1007/s00426-018-1043-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/05/2018] [Accepted: 06/21/2018] [Indexed: 11/28/2022]
Abstract
Researchers have long puzzled over the phenomenon in sensorimotor timing whereby people tend to tap ahead of time. When synchronizing movements (e.g., finger taps) with an external sequence (e.g., a metronome), humans typically tap tens of milliseconds before event onsets, producing the elusive negative asynchrony. Here, we present 24 metronome-tapping data sets from 8 experiments with different experimental settings, showing that less negative asynchrony is associated with lower tapping variability. Further analyses reveal that this negative mean-SD correlation of asynchrony is likely to be observed for sequence types appropriate for synchronization, as indicated by a statistically negative lag-1 autocorrelation of inter-response intervals. The reported findings indicate an association between negative asynchrony and timing variability.
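For illustration, the two descriptive statistics referred to here, the across-participant mean-SD correlation of asynchrony and the lag-1 autocorrelation of inter-response intervals, can be computed as in the following sketch on fabricated tapping data.

```python
# Sketch of the two statistics on fabricated data (not the study's data).
import numpy as np

def lag1_autocorrelation(x):
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.sum(x[:-1] * x[1:]) / np.sum(x * x))

rng = np.random.default_rng(2)

# Across-participant correlation between the mean and SD of asynchrony, on
# fabricated per-participant series in which noisier tappers are made to tap
# earlier, purely to show how the statistic is computed.
means, sds = [], []
for sd in rng.uniform(10, 40, size=24):
    asyn = rng.normal(-1.5 * sd, sd, size=200)
    means.append(asyn.mean())
    sds.append(asyn.std())
print("Correlation between mean and SD of asynchrony:",
      round(float(np.corrcoef(means, sds)[0, 1]), 2))

# Taps generated by synchronizing to a 600-ms metronome (onset plus an
# independent asynchrony) yield inter-response intervals with a negative
# lag-1 autocorrelation, the signature the abstract associates with sequences
# appropriate for synchronization.
onsets = 600.0 * np.arange(200)
tap_times = onsets + rng.normal(-30.0, 15.0, size=200)
iri = np.diff(tap_times)
print("Lag-1 autocorrelation of inter-response intervals:",
      round(lag1_autocorrelation(iri), 2))
```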
Collapse
Affiliation(s)
- Junkai Yang
- Laboratory for Behavioral and Regional Finance, Guangdong University of Finance, Guangzhou, China; Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China
| | - Feiyi Ouyang
- Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China
| | - Linus Holm
- Department of Psychology, Umeå University, Umeå, 90187, Sweden.
| | - Yingyu Huang
- Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China
| | - Lingyu Gan
- Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China
| | - Liang Zhou
- Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China
| | - Huizhen Chao
- Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China
| | - Mengye Wang
- Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China
| | - Mengxue He
- Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China
| | - Sheng Zhang
- Department of Physiology, Anhui Medical College, Hefei, China
| | - Bo Yang
- Department of Neurology, First Affiliated Hospital of Anhui University of Traditional Chinese Medicine, Hefei, China
| | - Junhao Pan
- Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China
| | - Xiang Wu
- Department of Psychology, Sun Yat-Sen University, Higher Education Mega Center, 132 Waihuan East road, Guangzhou, 510006, Guangdong, China.
| |
Collapse
|
32
|
Chang A, Bosnyak DJ, Trainor LJ. Beta oscillatory power modulation reflects the predictability of pitch change. Cortex 2018; 106:248-260. [PMID: 30053731 DOI: 10.1016/j.cortex.2018.06.008] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/23/2018] [Revised: 04/19/2018] [Accepted: 06/19/2018] [Indexed: 12/15/2022]
Abstract
Humans process highly dynamic auditory information in real time, and regularities in stimuli such as speech and music can aid such processing by allowing sensory predictions for upcoming events. Auditory sequences contain information about both the identity of sounds (what) and their timing (when they occur). Temporal prediction in isochronous sequences is reflected in neural oscillatory power modulation in the beta band (∼20 Hz). Specifically, power decreases (desynchronization) after tone onset and then increases (resynchronization) to reach a maximum around the expected time of the next tone. The current study investigates whether the predictability of the pitch of a tone (what) is also reflected in beta power modulation. We presented two isochronous auditory oddball sequences, each with 20% of tones at a deviant pitch. In one sequence the deviant tones occurred regularly every fifth tone (predictably), but in the other sequence they occurred pseudorandomly (unpredictably). We recorded the electroencephalogram (EEG) while participants listened passively to these sequences. The results showed that auditory beta power desynchronization was larger prior to a predictable than an unpredictable pitch change. A single-trial correlation analysis using linear mixed-effects (LME) models further showed that the deeper the pre-deviant beta desynchronization depth, the smaller the event-related P3a amplitude following the deviant, and this effect only occurred when the pitch change was predictable. Given that P3a is associated with attentional response to prediction error, larger beta desynchronization depth indicates better prediction of an upcoming deviant pitch. Thus, these findings suggest that beta oscillations reflect predictions for what in addition to when during dynamic auditory information processing.
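A rough sketch of the single-trial analysis logic, on fabricated data rather than the authors' pipeline, is shown below: beta-band power is estimated per trial with a band-pass filter and Hilbert envelope, a pre-deviant desynchronization measure is extracted, and its relation to P3a amplitude is tested with a linear mixed-effects model (participant as a random effect) using statsmodels.

```python
# Rough sketch (fabricated data, not the authors' pipeline) of single-trial
# beta desynchronization vs. P3a analysis with a mixed-effects model.
import numpy as np
import pandas as pd
from scipy.signal import butter, filtfilt, hilbert
import statsmodels.formula.api as smf

fs = 250                                   # sampling rate (Hz)
b, a = butter(4, [15, 25], btype="bandpass", fs=fs)

def beta_desync_depth(trial):
    """Minimum of the beta power envelope in a trial (lower = more desynchronized)."""
    envelope = np.abs(hilbert(filtfilt(b, a, trial)))
    return float(envelope.min())

rng = np.random.default_rng(3)
rows = []
for subject in range(20):
    for trial in range(40):
        eeg = rng.standard_normal(fs)      # 1 s of fabricated single-channel EEG
        depth = beta_desync_depth(eeg)
        p3a = 5.0 + 2.0 * depth + rng.normal(0, 1)   # fabricated P3a with a built-in link
        rows.append({"subject": subject, "beta_depth": depth, "p3a": p3a})
df = pd.DataFrame(rows)

# Single-trial mixed-effects regression of P3a amplitude on desynchronization depth
model = smf.mixedlm("p3a ~ beta_depth", data=df, groups=df["subject"]).fit()
print(model.summary())
```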
Collapse
Affiliation(s)
- Andrew Chang
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
| | - Dan J Bosnyak
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada
| | - Laurel J Trainor
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada; Rotman Research Institute, Baycrest Hospital, Toronto, ON, Canada.
| |
Collapse
|
33
|
Ross JM, Iversen JR, Balasubramaniam R. The Role of Posterior Parietal Cortex in Beat-based Timing Perception: A Continuous Theta Burst Stimulation Study. J Cogn Neurosci 2018; 30:634-643. [DOI: 10.1162/jocn_a_01237] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
There is growing interest in how the brain's motor systems contribute to the perception of musical rhythms. The Action Simulation for Auditory Prediction hypothesis proposes that the dorsal auditory stream is involved in bidirectional interchange between auditory perception and beat-based prediction in motor planning structures via parietal cortex [Patel, A. D., & Iversen, J. R. The evolutionary neuroscience of musical beat perception: The Action Simulation for Auditory Prediction (ASAP) hypothesis. Frontiers in Systems Neuroscience, 8, 57, 2014]. We used a TMS protocol, continuous theta burst stimulation (cTBS), which is known to down-regulate cortical activity for up to 60 min following stimulation, to test for causal contributions to beat-based timing perception. cTBS target areas included the left posterior parietal cortex (lPPC), which is part of the dorsal auditory stream, and the left SMA (lSMA). We hypothesized that down-regulating lPPC would interfere with accurate beat-based perception by disrupting the dorsal auditory stream, but would not interfere with absolute timing ability. We predicted that down-regulating lSMA, which is not part of the dorsal auditory stream but has been implicated in internally timed movements, would also interfere with accurate beat-based timing perception. We show (n = 25) that cTBS down-regulation of lPPC does interfere with beat-based timing ability, but only with the ability to detect shifts in beat phase, not changes in tempo. Down-regulation of lSMA, in contrast, did not interfere with beat-based timing. As expected, absolute interval timing ability was not affected by down-regulation of lPPC or lSMA. These results support the view that the dorsal auditory stream plays an essential role in accurate phase perception in beat-based timing. We find no evidence of an essential role of parietal cortex or SMA in interval timing.
Collapse
|
34
|
Abstract
There is growing interest in whether the motor system plays an essential role in rhythm perception. The motor system is active during the perception of rhythms, but is such motor activity merely a sign of unexecuted motor planning, or does it play a causal role in shaping the perception of rhythm? We present evidence for a causal role of motor planning and simulation, and review theories of internal simulation for beat-based timing prediction. Brain stimulation studies have the potential to conclusively test if the motor system plays a causal role in beat perception and ground theories to their neural underpinnings.
Collapse
Affiliation(s)
- Jessica M Ross
- Cognitive and Information Sciences, University of California, Merced, CA, USA
| | - John R Iversen
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, CA, USA
| | | |
Collapse
|
35
|
Ravignani A, Fitch WT, Hanke FD, Heinrich T, Hurgitsch B, Kotz SA, Scharff C, Stoeger AS, de Boer B. What Pinnipeds Have to Say about Human Speech, Music, and the Evolution of Rhythm. Front Neurosci 2016; 10:274. [PMID: 27378843 PMCID: PMC4913109 DOI: 10.3389/fnins.2016.00274] [Citation(s) in RCA: 37] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2016] [Accepted: 05/31/2016] [Indexed: 12/19/2022] Open
Abstract
Research on the evolution of human speech and music benefits from hypotheses and data generated in a number of disciplines. The purpose of this article is to illustrate the high relevance of pinniped research for the study of speech, musical rhythm, and their origins, bridging and complementing current research on primates and birds. We briefly discuss speech, vocal learning, and rhythm from an evolutionary and comparative perspective. We review the current state of the art on pinniped communication and behavior relevant to the evolution of human speech and music, showing interesting parallels to hypotheses on rhythmic behavior in early hominids. We suggest future research directions in terms of species to test and empirical data needed.
Collapse
Affiliation(s)
- Andrea Ravignani
- Artificial Intelligence Lab, Vrije Universiteit Brussel, Brussels, Belgium; Sensory and Cognitive Ecology, Institute for Biosciences, University of Rostock, Rostock, Germany
| | - W Tecumseh Fitch
- Department of Cognitive Biology, University of Vienna, Vienna, Austria
| | - Frederike D Hanke
- Sensory and Cognitive Ecology, Institute for Biosciences, University of Rostock, Rostock, Germany
| | - Tamara Heinrich
- Sensory and Cognitive Ecology, Institute for Biosciences, University of Rostock, Rostock, Germany
| | | | - Sonja A Kotz
- Basic and Applied NeuroDynamics Lab, Department of Neuropsychology and Psychopharmacology, Maastricht University, Maastricht, Netherlands; Department of Neuropsychology, Max-Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
| | - Constance Scharff
- Department of Animal Behavior, Institute of Biology, Freie Universität Berlin, Berlin, Germany
| | - Angela S Stoeger
- Department of Cognitive Biology, University of Vienna, Vienna, Austria
| | - Bart de Boer
- Artificial Intelligence Lab, Vrije Universiteit Brussel, Brussels, Belgium
| |
Collapse
|