1. Pazdera JK, Trainor LJ. Pitch biases sensorimotor synchronization to auditory rhythms. Sci Rep 2025; 15:17012. PMID: 40379668; PMCID: PMC12084412; DOI: 10.1038/s41598-025-00827-4.
Abstract
Current models of rhythm perception propose that humans track musical beats using the phase, period, and amplitude of sound patterns. However, a growing body of evidence suggests that pitch can also influence the perceived timing of auditory signals. In the present study, we conducted two experiments to investigate whether pitch affects the phase and period of sensorimotor synchronization. To do so, we asked participants to synchronize with a repeating tone, whose pitch on each trial was drawn from one of six different octaves (110-3520 Hz). In Experiment 1, we observed U-shaped patterns in both mean asynchrony and continuation tapping rates, with participants tapping latest and slowest when synchronizing to low and extremely high (above 2000 Hz) pitches, and tapping earliest and fastest to moderately high pitches. In Experiment 2, we found that extremely high pitches still produced slower timing than moderately high pitches when participants were exposed to an exclusively high-pitched context. Based on our results, we advocate for the incorporation of pitch into models of rhythm perception and discuss possible origins of these effects.
Affiliation(s)
- Jesse K Pazdera
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada.
- Laurel J Trainor
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada
- Rotman Research Institute, Baycrest Hospital, Toronto, ON, Canada
2. Parr T, Oswal A, Manohar SG. Inferring when to move. Neurosci Biobehav Rev 2025; 169:105984. PMID: 39694432; DOI: 10.1016/j.neubiorev.2024.105984.
Abstract
Most of our movement consists of sequences of discrete actions at regular intervals, including speech, walking, playing music, or even chewing. Despite this, few models of the motor system address how the brain determines the interval at which to trigger actions. This paper offers a theoretical analysis of the problem of timing movements. We consider a scenario in which we must align an alternating movement with a regular external (auditory) stimulus. We assume that our brains employ generative world models that include internal clocks of various speeds. These allow us to associate a temporally regular sensory input with an internal clock, and actions with parts of that clock cycle. We treat this as a process of inferring which clock best explains sensory input. This offers a way in which temporally discrete choices might emerge from a continuous process. This is not straightforward, particularly if each of those choices unfolds over a duration that may itself be unknown. We develop a route for translation to neurology in the context of Parkinson's disease, a disorder that characteristically slows down movements. The effects are often elicited in clinic by alternating movements. We find that it is possible to reproduce behavioural and electrophysiological features associated with parkinsonism by disrupting specific parameters that determine the priors for inferences made by the brain. We observe three core features of Parkinson's disease: amplitude decrement, festination, and breakdown of repetitive movements. Our simulations provide a mechanistic interpretation of how pathology and therapeutics might influence behaviour and neural activity.
Affiliation(s)
- Thomas Parr
- Nuffield Department of Clinical Neurosciences, University of Oxford, UK.
- Ashwini Oswal
- Nuffield Department of Clinical Neurosciences, University of Oxford, UK
- Sanjay G Manohar
- Nuffield Department of Clinical Neurosciences, University of Oxford, UK; Department of Experimental Psychology, University of Oxford, UK
3. Lenc T, Lenoir C, Keller PE, Polak R, Mulders D, Nozaradan S. Measuring self-similarity in empirical signals to understand musical beat perception. Eur J Neurosci 2025; 61:e16637. PMID: 39853878; PMCID: PMC11760665; DOI: 10.1111/ejn.16637.
Abstract
Experiencing music often entails the perception of a periodic beat. Despite being a widespread phenomenon across cultures, the nature and neural underpinnings of beat perception remain largely unknown. In the last decade, there has been a growing interest in developing methods to probe these processes, particularly to measure the extent to which beat-related information is contained in behavioral and neural responses. Here, we propose a theoretical framework and practical implementation of an analytic approach to capture beat-related periodicity in empirical signals using frequency-tagging. We highlight its sensitivity in measuring the extent to which the periodicity of a perceived beat is represented in a range of continuous time-varying signals with minimal assumptions. We also discuss a limitation of this approach with respect to its specificity when restricted to measuring beat-related periodicity only from the magnitude spectrum of a signal and introduce a novel extension of the approach based on autocorrelation to overcome this issue. We test the new autocorrelation-based method using simulated signals and by re-analyzing previously published data and show how it can be used to process measurements of brain activity as captured with surface EEG in adults and infants in response to rhythmic inputs. Taken together, the theoretical framework and related methodological advances confirm and elaborate the frequency-tagging approach as a promising window into the processes underlying beat perception and, more generally, temporally coordinated behaviors.
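The autocorrelation-based logic described above can be illustrated with a minimal NumPy sketch (an illustrative index, not the authors' implementation): compare the autocorrelation of a signal at lags that are multiples of a candidate beat period against all other lags.

```python
import numpy as np

def beat_ac_contrast(signal, fs, beat_period, max_lag=None):
    """Mean autocorrelation at beat-period lags minus the mean at all
    other lags. Illustrative index in the spirit of an autocorrelation
    approach to beat-related periodicity, not the paper's exact metric."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[x.size - 1:]  # non-negative lags
    ac = ac / ac[0]                                    # normalize: ac[0] = 1
    if max_lag is None:
        max_lag = x.size // 2
    lags = np.arange(1, max_lag + 1)
    ac = ac[1:max_lag + 1]
    period = int(round(beat_period * fs))              # beat period in samples
    on_beat = (lags % period) == 0
    return float(ac[on_beat].mean() - ac[~on_beat].mean())

fs = 100.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
beat = np.sin(2 * np.pi * 2.0 * t)      # 2 Hz signal -> 0.5 s beat period
noise = rng.normal(size=t.size)         # no beat-related periodicity
idx_beat = beat_ac_contrast(beat, fs, 0.5)
idx_noise = beat_ac_contrast(noise, fs, 0.5)
```

The contrast is large for a beat-periodic signal and near zero for noise, which is the kind of sensitivity/specificity trade-off the paper examines in detail.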
Affiliation(s)
- Tomas Lenc
- Institute of Neuroscience (IONS), UCLouvain, Brussels, Belgium
- Basque Center on Cognition, Brain and Language (BCBL), Donostia-San Sebastián, Spain
- Cédric Lenoir
- Institute of Neuroscience (IONS), UCLouvain, Brussels, Belgium
- Peter E. Keller
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Center for Music in the Brain & Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
- Rainer Polak
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Oslo, Norway
- Department of Musicology, University of Oslo, Oslo, Norway
- Dounia Mulders
- Institute of Neuroscience (IONS), UCLouvain, Brussels, Belgium
- Computational and Biological Learning Unit, Department of Engineering, University of Cambridge, Cambridge, UK
- Institute for Information and Communication Technologies, Electronics and Applied Mathematics, UCLouvain, Louvain-la-Neuve, Belgium
- Department of Brain and Cognitive Sciences and McGovern Institute, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, USA
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), UCLouvain, Brussels, Belgium
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
4. Rajendran VG, Tsdaka Y, Keung TY, Schnupp JW, Nelken I. Rats synchronize predictively to metronomes. iScience 2024; 27:111053. PMID: 39507253; PMCID: PMC11539146; DOI: 10.1016/j.isci.2024.111053.
Abstract
Predictive auditory-motor synchronization, in which rhythmic movements anticipate rhythmic sounds, is at the core of the human capacity for music. Rodents show impressive capabilities in timing and motor tasks, but their ability to predictively coordinate sensation and action has not been demonstrated. Here, we reveal a clear capacity for predictive auditory-motor synchronization in a rodent species using a modeling approach for the quantitative exploration of synchronization behaviors. We trained 8 rats to synchronize their licking to metronomes with tempi ranging from 0.5 to 2 Hz and observed periodic lick patterns locked to metronome beats. We developed a flexible Markovian modeling framework to formally test how well different candidate strategies could explain the observed lick patterns. The best models required predictive control of licking that could not be explained by reactive strategies, indicating that predictive auditory-motor synchronization may be more widely shared across mammalian species than previously appreciated.
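The behavioural signature at stake, anticipatory rather than reactive responding, can be caricatured in a few lines of Python. This is a toy illustration under assumed latencies, not the authors' Markovian models: a reactive strategy trails each beat by roughly a reaction time, while a predictive strategy shows the small negative mean asynchrony typical of human tapping.

```python
import numpy as np

rng = np.random.default_rng(1)
beat_times = np.arange(0, 30, 1.0)    # 1 Hz metronome, 30 beats

# Reactive strategy: respond ~150 ms after each beat (a reaction time).
reactive = beat_times + rng.normal(0.150, 0.02, beat_times.size)
# Predictive strategy: anticipate the beat, landing slightly early.
predictive = beat_times + rng.normal(-0.030, 0.02, beat_times.size)

def mean_asynchrony(responses, beats):
    """Mean signed response-minus-beat interval; negative values mean
    responses anticipate the beat."""
    return float(np.mean(responses - beats))

async_reactive = mean_asynchrony(reactive, beat_times)
async_predictive = mean_asynchrony(predictive, beat_times)
```

Distinguishing these two regimes from noisy response times is exactly what the paper's model comparison formalizes.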
Affiliation(s)
- Vani G. Rajendran
- Department of Biomedical Sciences, City University of Hong Kong, Hong Kong, China
- Instituto de Fisiología Celular, Universidad Nacional Autónoma de México, Mexico City, Mexico
- Yehonadav Tsdaka
- Edmond and Lily Safra Center for Brain Sciences and the Department for Neurobiology, Hebrew University of Jerusalem, Jerusalem, Israel
- Tung Yee Keung
- Department of Biomedical Sciences, City University of Hong Kong, Hong Kong, China
- Jan W.H. Schnupp
- Department of Biomedical Sciences, City University of Hong Kong, Hong Kong, China
- Gerald Choa Neuroscience Institute, The Chinese University of Hong Kong, Hong Kong, China
- Department of Otolaryngology, Chinese University of Hong Kong, Hong Kong SAR, China
- Israel Nelken
- Edmond and Lily Safra Center for Brain Sciences and the Department for Neurobiology, Hebrew University of Jerusalem, Jerusalem, Israel
- Instituto de Fisiología Celular, Universidad Nacional Autónoma de México, Mexico City, Mexico
5. Zemlianova K, Bose A, Rinzel J. Dynamical mechanisms of how an RNN keeps a beat, uncovered with a low-dimensional reduced model. Sci Rep 2024; 14:26388. PMID: 39488649; PMCID: PMC11531529; DOI: 10.1038/s41598-024-77849-x.
Abstract
Despite music's omnipresence, the specific neural mechanisms responsible for perceiving and anticipating temporal patterns in music are unknown. To study potential mechanisms for keeping time in rhythmic contexts, we train a biologically constrained RNN, with excitatory (E) and inhibitory (I) units, on a synchronization-and-continuation task, a standard experimental paradigm, at seven different stimulus tempos (2-8 Hz). Our trained RNN generates a network oscillator that uses an input current (context parameter) to control oscillation frequency and replicates key features of neural dynamics observed in recordings from monkeys performing the same task. We develop a reduced three-variable rate model of the RNN and analyze its dynamic properties. By treating our understanding of the mathematical structure for oscillations in the reduced model as predictive, we confirm that the same dynamical mechanisms are present in the RNN. Our neurally plausible reduced model reveals an E-I circuit with two distinct inhibitory sub-populations, one of which is tightly synchronized with the excitatory units.
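The central mechanism, a network oscillator whose frequency is steered by a tonic input, can be caricatured with a normal-form (Stuart-Landau) oscillator. This is a generic sketch of the idea, not the paper's RNN or its three-variable reduction; the parameter `omega` stands in for the context input current.

```python
import numpy as np

def simulate(omega, mu=1.0, dt=1e-3, T=10.0):
    """Integrate a Stuart-Landau oscillator, dz/dt = (mu + i*omega)*z - |z|^2*z,
    with forward Euler. omega plays the role of a context input that sets
    the oscillation frequency; mu > 0 gives a stable limit cycle."""
    n = int(T / dt)
    z = np.empty(n, dtype=complex)
    z[0] = 0.1 + 0j
    for k in range(1, n):
        zp = z[k - 1]
        z[k] = zp + dt * ((mu + 1j * omega) * zp - (abs(zp) ** 2) * zp)
    return z

def est_freq(z, dt=1e-3, t_skip=2.0):
    """Estimate frequency (Hz) from upward zero crossings of Re(z),
    ignoring an initial transient."""
    x = np.real(z[int(t_skip / dt):])
    up = np.count_nonzero((x[:-1] < 0) & (x[1:] >= 0))
    return up / (x.size * dt)

f_slow = est_freq(simulate(omega=2 * np.pi * 2.0))   # context input for ~2 Hz
f_fast = est_freq(simulate(omega=2 * np.pi * 4.0))   # context input for ~4 Hz
```

Sweeping the input smoothly moves the oscillation frequency, which is the qualitative behaviour the reduced model explains mechanistically in the trained RNN.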
Affiliation(s)
- Klavdia Zemlianova
- Center for Neural Science, New York University, New York, NY, 10003, USA
- Amitabha Bose
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, NJ, 07102, USA
- John Rinzel
- Center for Neural Science and Courant Institute of Mathematical Sciences, New York University, New York, NY, 10003, USA.
6. Zalta A, Large EW, Schön D, Morillon B. Neural dynamics of predictive timing and motor engagement in music listening. Sci Adv 2024; 10:eadi2525. PMID: 38446888; PMCID: PMC10917349; DOI: 10.1126/sciadv.adi2525.
Abstract
Why do humans spontaneously dance to music? To test the hypothesis that motor dynamics reflect predictive timing during music listening, we created melodies with varying degrees of rhythmic predictability (syncopation) and asked participants to rate their wanting-to-move (groove) experience. Degree of syncopation and groove ratings are quadratically correlated. Magnetoencephalography data showed that, while auditory regions track the rhythm of melodies, beat-related 2-hertz activity and neural dynamics at delta (1.4 hertz) and beta (20 to 30 hertz) rates in the dorsal auditory pathway code for the experience of groove. Critically, the left sensorimotor cortex coordinates these groove-related delta and beta activities. These findings align with the predictions of a neurodynamic model, suggesting that oscillatory motor engagement during music listening reflects predictive timing and is effected by interaction of neural dynamics along the dorsal auditory pathway.
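The quadratic (inverted-U) relation between syncopation and groove ratings can be checked with an ordinary polynomial fit. The numbers below are made up for illustration; only the shape of the relation is taken from the abstract.

```python
import numpy as np

# Hypothetical mean groove ratings at increasing degrees of syncopation,
# illustrating an inverted-U relation (ratings peak at moderate syncopation).
sync = np.array([0, 1, 2, 3, 4, 5, 6], dtype=float)
groove = np.array([2.1, 3.0, 3.8, 4.1, 3.7, 2.9, 2.0])

# Least-squares quadratic fit: groove ~ b2*s^2 + b1*s + b0.
b2, b1, b0 = np.polyfit(sync, groove, deg=2)
peak = -b1 / (2 * b2)   # vertex of the parabola: preferred syncopation level
```

A negative quadratic coefficient with an interior vertex is the statistical signature of the "moderate syncopation maximizes groove" result.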
Affiliation(s)
- Arnaud Zalta
- Aix Marseille Université, Inserm, INS, Institut de Neurosciences des Systèmes, Marseille, France
- APHM, INSERM, Inst Neurosci Syst, Service de Pharmacologie Clinique et Pharmacovigilance, Aix Marseille Université, Marseille, France
- Edward W. Large
- Department of Psychological Sciences, Ecological Psychology Division, University of Connecticut, Storrs, CT, USA
- Department of Physics, University of Connecticut, Storrs, CT, USA
- Daniele Schön
- Aix Marseille Université, Inserm, INS, Institut de Neurosciences des Systèmes, Marseille, France
- Benjamin Morillon
- Aix Marseille Université, Inserm, INS, Institut de Neurosciences des Systèmes, Marseille, France
7. Bouwer FL, Háden GP, Honing H. Probing Beat Perception with Event-Related Potentials (ERPs) in Human Adults, Newborns, and Nonhuman Primates. Adv Exp Med Biol 2024; 1455:227-256. PMID: 38918355; DOI: 10.1007/978-3-031-60183-5_13.
Abstract
The aim of this chapter is to give an overview of how the perception of rhythmic temporal regularity such as a regular beat in music can be studied in human adults, human newborns, and nonhuman primates using event-related brain potentials (ERPs). First, we discuss different aspects of temporal structure in general, and musical rhythm in particular, and we discuss the possible mechanisms underlying the perception of regularity (e.g., a beat) in rhythm. Additionally, we highlight the importance of dissociating beat perception from the perception of other types of structure in rhythm, such as predictable sequences of temporal intervals, ordinal structure, and rhythmic grouping. In the second section of the chapter, we start with a discussion of auditory ERPs elicited by infrequent and frequent sounds: ERP responses to regularity violations, such as mismatch negativity (MMN), N2b, and P3, as well as early sensory responses to sounds, such as P1 and N1, have been shown to be instrumental in probing beat perception. Subsequently, we discuss how beat perception can be probed by comparing ERP responses to sounds in regular and irregular sequences, and by comparing ERP responses to sounds in different metrical positions in a rhythm, such as on and off the beat or on strong and weak beats. Finally, we discuss previous research that has used the aforementioned ERPs and paradigms to study beat perception in human adults, human newborns, and nonhuman primates. In doing so, we consider the possible pitfalls and prospects of the technique, as well as future perspectives.
Affiliation(s)
- Fleur L Bouwer
- Cognitive Psychology Unit, Institute of Psychology, Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands.
- Department of Psychology, Brain & Cognition, University of Amsterdam, Amsterdam, The Netherlands.
- Gábor P Háden
- Institute of Cognitive Neuroscience and Psychology, Budapest, Hungary
- Department of Telecommunications and Media Informatics, Faculty of Electrical Engineering and Informatics, Budapest University of Technology and Economics, Budapest, Hungary
- Henkjan Honing
- Music Cognition group (MCG), Institute for Logic, Language and Computation (ILLC), Amsterdam Brain and Cognition (ABC), University of Amsterdam, Amsterdam, The Netherlands
8. Beker S, Molholm S. Do we all synch alike? Brain-body-environment interactions in ASD. Front Neural Circuits 2023; 17:1275896. PMID: 38186630; PMCID: PMC10769494; DOI: 10.3389/fncir.2023.1275896.
Abstract
Autism Spectrum Disorder (ASD) is characterized by rigidity of routines and restricted interests, and by atypical social communication and interaction. Recent evidence for altered synchronization of neuro-oscillatory brain activity with regularities in the environment, and for altered peripheral nervous system function in ASD, presents promising novel directions for studying pathophysiology and its relationship to the ASD clinical phenotype. Human cognition and action are significantly influenced by physiological rhythmic processes that are generated by both the central nervous system (CNS) and the autonomic nervous system (ANS). Normally, perception occurs in a dynamic context, where brain oscillations and autonomic signals synchronize with external events to optimally receive temporally predictable rhythmic information, leading to improved performance. Recent findings on the time-sensitive coupling between the brain and the periphery during effective perception and successful social interaction in typically developed individuals highlight the interactions within the brain-body-environment triad as a critical direction in the study of ASD. Here we offer a novel perspective on autism as a case where the temporal dynamics of brain-body-environment coupling are impaired. We present evidence from the literature to support the idea that in autism the nervous system fails to operate in an adaptive manner to synchronize with temporally predictable events in the environment to optimize perception and behavior. This framework could potentially lead to novel biomarkers of hallmark deficits in ASD, such as cognitive rigidity and altered social interaction.
Affiliation(s)
- Shlomit Beker
- Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine, Bronx, NY, United States
9. Fram NR, Berger J. Syncopation as Probabilistic Expectation: Conceptual, Computational, and Experimental Evidence. Cogn Sci 2023; 47:e13390. PMID: 38043104; DOI: 10.1111/cogs.13390.
Abstract
Definitions of syncopation share two characteristics: the presence of a meter or analogous hierarchical rhythmic structure and a displacement or contradiction of that structure. These attributes are translated in terms of a Bayesian theory of syncopation, where the syncopation of a rhythm is inferred based on a hierarchical structure that is, in turn, learned from the ongoing musical stimulus. Several experiments tested its simplest possible implementation, with equally weighted priors associated with different meters and independence of auditory events, which can be decomposed into two terms representing note density and deviation from a metric hierarchy. A computational simulation demonstrated that extant measures of syncopation fall into two distinct factors analogous to the terms in the simple Bayesian model. Next, a series of behavioral experiments found that perceived syncopation is significantly related to both terms, offering support for the general Bayesian construction of syncopation. However, we also found that the prior expectations associated with different metric structures are not equal across meters and that there is an interaction between density and hierarchical deviation, implying that auditory events are not independent from each other. Together, these findings provide evidence that syncopation is a manifestation of a form of temporal expectation that can be directly represented in Bayesian terms and offer a complementary, feature-driven approach to recent Bayesian models of temporal prediction.
Affiliation(s)
- Noah R Fram
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University
- Department of Otolaryngology, Vanderbilt University Medical Center
- Jonathan Berger
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University
10. Meng J, Zhao Y, Wang K, Sun J, Yi W, Xu F, Xu M, Ming D. Rhythmic temporal prediction enhances neural representations of movement intention for brain-computer interface. J Neural Eng 2023; 20:066004. PMID: 37875107; DOI: 10.1088/1741-2552/ad0650.
Abstract
Objective. Detecting movement intention is a typical use of brain-computer interfaces (BCI). However, as an endogenous electroencephalography (EEG) feature, the neural representation of movement is insufficient for improving motor-based BCI. This study aimed to develop a new movement-augmentation BCI encoding paradigm by incorporating the cognitive function of rhythmic temporal prediction, and to test the feasibility of this new paradigm in optimizing detection of movement intention. Methods. A visual-motion synchronization task was designed with two movement intentions (left vs. right) and three rhythmic temporal prediction conditions (1000 ms vs. 1500 ms vs. no temporal prediction). Behavioural and EEG data of 24 healthy participants were recorded. Event-related potentials (ERPs), event-related spectral perturbation induced by left- and right-finger movements, the common spatial pattern (CSP) with a support vector machine, and a Riemannian tangent space algorithm with logistic regression were used and compared across the three temporal prediction conditions, aiming to test the impact of temporal prediction on movement detection. Results. Behavioural results showed significantly smaller deviation time for the 1000 ms and 1500 ms conditions. ERP analyses revealed that the 1000 ms and 1500 ms conditions led to rhythmic oscillations with a time lag in areas contralateral and ipsilateral to the movement. Compared with no temporal prediction, the 1000 ms condition exhibited greater beta event-related desynchronization (ERD) lateralization in the motor area (P < 0.001) and larger beta ERD in the frontal area (P < 0.001). The 1000 ms condition achieved an averaged left-right decoding accuracy of 89.71% using CSP and 97.30% using the Riemannian tangent space algorithm, both significantly higher than no temporal prediction. Moreover, movement and temporal information could be decoded simultaneously, achieving 88.51% four-class accuracy. Significance. The results not only confirm the effectiveness of rhythmic temporal prediction in enhancing the detection ability of motor-based BCI, but also highlight the dual encoding of movement and temporal information within a single BCI paradigm, which promises to expand the range of intentions that can be decoded by BCI.
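The CSP step of such a decoding pipeline can be sketched in NumPy alone. This uses synthetic data, not the study's EEG, and a whitening-based CSP formulation rather than any particular toolbox's implementation.

```python
import numpy as np

def csp_filters(X1, X2):
    """Common Spatial Patterns via whitening plus eigendecomposition.
    X1, X2: arrays of shape (trials, channels, samples) for two classes.
    Returns spatial filters W (rows), ordered so the first filter
    maximizes class-1 variance relative to the composite variance."""
    def avg_cov(X):
        covs = [x @ x.T / np.trace(x @ x.T) for x in X]  # trace-normalized
        return np.mean(covs, axis=0)
    C1, C2 = avg_cov(X1), avg_cov(X2)
    # Whiten the composite covariance C1 + C2.
    evals, evecs = np.linalg.eigh(C1 + C2)
    P = evecs @ np.diag(evals ** -0.5) @ evecs.T
    # Eigenvectors of the whitened class-1 covariance give the filters.
    d, B = np.linalg.eigh(P @ C1 @ P.T)
    order = np.argsort(d)[::-1]
    return B[:, order].T @ P

rng = np.random.default_rng(2)
n_tr, n_ch, n_s = 30, 4, 200
# Toy data: class 1 has extra variance on channel 0, class 2 on channel 3.
X1 = rng.normal(size=(n_tr, n_ch, n_s)); X1[:, 0] *= 3.0
X2 = rng.normal(size=(n_tr, n_ch, n_s)); X2[:, 3] *= 3.0
W = csp_filters(X1, X2)
# Log-variance of the first CSP component separates the two classes.
f1 = [np.log(np.var(W[0] @ x)) for x in X1]
f2 = [np.log(np.var(W[0] @ x)) for x in X2]
```

These log-variance features are what a downstream classifier (e.g. an SVM, as in the study) would be trained on.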
Affiliation(s)
- Jiayuan Meng
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Haihe Laboratory of Brain-computer Interaction and Human-machine Integration, Tianjin 300392, People's Republic of China
- Yingru Zhao
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Kun Wang
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Haihe Laboratory of Brain-computer Interaction and Human-machine Integration, Tianjin 300392, People's Republic of China
- Jinsong Sun
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Weibo Yi
- Beijing Machine and Equipment Institute, Beijing, People's Republic of China
- Fangzhou Xu
- International School for Optoelectronic Engineering, Qilu University of Technology (Shandong Academy of Sciences), Jinan, People's Republic of China
- Minpeng Xu
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Haihe Laboratory of Brain-computer Interaction and Human-machine Integration, Tianjin 300392, People's Republic of China
- International School for Optoelectronic Engineering, Qilu University of Technology (Shandong Academy of Sciences), Jinan, People's Republic of China
- Dong Ming
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Haihe Laboratory of Brain-computer Interaction and Human-machine Integration, Tianjin 300392, People's Republic of China
11. Betancourt A, Pérez O, Gámez J, Mendoza G, Merchant H. Amodal population clock in the primate medial premotor system for rhythmic tapping. Cell Rep 2023; 42:113234. PMID: 37838944; DOI: 10.1016/j.celrep.2023.113234.
Abstract
The neural substrate for beat extraction and response entrainment to rhythms is not fully understood. Here we analyze the activity of medial premotor neurons in monkeys performing isochronous tapping guided by brief flashing stimuli or auditory tones. The population dynamics shared the following properties across modalities: the circular dynamics of the neural trajectories form a regenerating loop for every produced interval; the trajectories converge to a similar region of state space at tapping times, resetting the clock; and the tempo of the synchronized tapping is encoded in the trajectories by a combination of amplitude modulation and temporal scaling. Notably, the modality induces a displacement of the neural trajectories into auditory and visual subspaces without greatly altering the time-keeping mechanism. These results suggest that the interaction between the medial premotor cortex's amodal internal representation of pulse and a modality-specific external input generates a neural rhythmic clock whose dynamics govern rhythmic tapping execution across senses.
Affiliation(s)
- Abraham Betancourt
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Oswaldo Pérez
- Escuela Nacional de Estudios Superiores, Unidad Juriquilla, UNAM, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Jorge Gámez
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Germán Mendoza
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Hugo Merchant
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
12. Kaplan T, Jamone L, Pearce M. Probabilistic modelling of microtiming perception. Cognition 2023; 239:105532. PMID: 37442021; DOI: 10.1016/j.cognition.2023.105532.
Abstract
Music performances are rich in systematic temporal irregularities called "microtiming", too fine-grained to be notated in a musical score but important for musical expression and communication. Several studies have examined listeners' preference for rhythms varying in microtiming, but few have addressed precisely how microtiming is perceived, especially in terms of cognitive mechanisms, making the empirical evidence difficult to interpret. Here we provide evidence that microtiming perception can be simulated as a process of probabilistic prediction. Participants performed an XAB discrimination test, in which an archetypal popular drum rhythm was presented with different microtiming. The results indicate that listeners could implicitly discriminate the mean and variance of stimulus microtiming. Furthermore, their responses were effectively simulated by a Bayesian model of entrainment, using a distance function derived from its dynamic posterior estimate over phase. Wide individual differences in participant sensitivity to microtiming were predicted by a model parameter likened to noisy timekeeping processes in the brain. Overall, this suggests that the cognitive mechanisms underlying perception of microtiming reflect a continuous inferential process, potentially driving qualitative judgements of rhythmic feel.
Affiliation(s)
- Thomas Kaplan
- School of Electronic Engineering & Computer Science, Queen Mary University of London, London, United Kingdom.
- Lorenzo Jamone
- School of Engineering & Materials Science, Queen Mary University of London, London, United Kingdom
- Marcus Pearce
- School of Electronic Engineering & Computer Science, Queen Mary University of London, London, United Kingdom; Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
13. Large EW, Roman I, Kim JC, Cannon J, Pazdera JK, Trainor LJ, Rinzel J, Bose A. Dynamic models for musical rhythm perception and coordination. Front Comput Neurosci 2023; 17:1151895. PMID: 37265781; PMCID: PMC10229831; DOI: 10.3389/fncom.2023.1151895.
Abstract
Rhythmicity permeates large parts of human experience. Humans generate various motor and brain rhythms spanning a range of frequencies. We also experience and synchronize to externally imposed rhythmicity, for example from music and song or from the 24-h light-dark cycles of the sun. In the context of music, humans have the ability to perceive, generate, and anticipate rhythmic structures, for example, "the beat." Experimental and behavioral studies offer clues about the biophysical and neural mechanisms that underlie our rhythmic abilities and about the different brain areas involved, but many open questions remain. In this paper, we review several theoretical and computational approaches, each centered at a different level of description, that address specific aspects of musical rhythm generation, perception, attention, perception-action coordination, and learning. We survey methods and results from applications of dynamical systems theory, neuro-mechanistic modeling, and Bayesian inference. Some frameworks rely on synchronization of intrinsic brain rhythms that span the relevant frequency range; some formulations involve real-time adaptation schemes for error correction to align the phase and frequency of a dedicated circuit; others involve learning and dynamically adjusting expectations to make rhythm-tracking predictions. Each of the approaches, while initially designed to answer specific questions, offers the possibility of being integrated into a larger framework that provides insights into our ability to perceive and generate rhythmic patterns.
Collapse
Affiliation(s)
- Edward W. Large
- Department of Psychological Sciences, University of Connecticut, Mansfield, CT, United States
- Department of Physics, University of Connecticut, Mansfield, CT, United States
| | - Iran Roman
- Music and Audio Research Laboratory, New York University, New York, NY, United States
| | - Ji Chul Kim
- Department of Psychological Sciences, University of Connecticut, Mansfield, CT, United States
| | - Jonathan Cannon
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
| | - Jesse K. Pazdera
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
| | - Laurel J. Trainor
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
| | - John Rinzel
- Center for Neural Science, New York University, New York, NY, United States
- Courant Institute of Mathematical Sciences, New York University, New York, NY, United States
| | - Amitabha Bose
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, NJ, United States
| |
Collapse
|
14
|
Kaplan T, Cannon J, Jamone L, Pearce M. Modeling enculturated bias in entrainment to rhythmic patterns. PLoS Comput Biol 2022; 18:e1010579. [PMID: 36174063 PMCID: PMC9553061 DOI: 10.1371/journal.pcbi.1010579] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/17/2021] [Revised: 10/11/2022] [Accepted: 09/16/2022] [Indexed: 11/19/2022] Open
Abstract
Long-term and culture-specific experience of music shapes rhythm perception, leading to enculturated expectations that make certain rhythms easier to track and more conducive to synchronized movement. However, the influence of enculturated bias on the moment-to-moment dynamics of rhythm tracking is not well understood. Recent modeling work has formulated entrainment to rhythms as a formal inference problem, where phase is continuously estimated based on precise event times and their correspondence to timing expectations: PIPPET (Phase Inference from Point Process Event Timing). Here we propose that the problem of optimally tracking a rhythm also requires an ongoing process of inferring which pattern of event timing expectations is most suitable to predict a stimulus rhythm. We formalize this insight as an extension of PIPPET called pPIPPET (PIPPET with pattern inference). The variational solution to this problem introduces terms representing the likelihood that a stimulus is based on a particular member of a set of event timing patterns, which we initialize according to culturally-learned prior expectations of a listener. We evaluate pPIPPET in three experiments. First, we demonstrate that pPIPPET can qualitatively reproduce enculturated bias observed in human tapping data for simple two-interval rhythms. Second, we simulate categorization of a continuous three-interval rhythm space by Western-trained musicians through derivation of a comprehensive set of priors for pPIPPET from metrical patterns in a sample of Western rhythms. Third, we simulate iterated reproduction of three-interval rhythms, and show that models configured with notated rhythms from different cultures exhibit both universal and enculturated biases as observed experimentally in listeners from those cultures. These results suggest the influence of enculturated timing expectations on human perceptual and motor entrainment can be understood as approximating optimal inference about the rhythmic stimulus, with respect to prototypical patterns in an empirical sample of rhythms that represent the music-cultural environment of the listener.
Collapse
Affiliation(s)
- Thomas Kaplan
- Cognitive Science Research Group, School of Electronic Engineering & Computer Science, Queen Mary University of London, London, United Kingdom
| | - Jonathan Cannon
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada
| | - Lorenzo Jamone
- Cognitive Science Research Group, School of Electronic Engineering & Computer Science, Queen Mary University of London, London, United Kingdom
- Advanced Robotics at Queen Mary (ARQ), School of Electronic Engineering & Computer Science, Queen Mary University of London, London, United Kingdom
| | - Marcus Pearce
- Cognitive Science Research Group, School of Electronic Engineering & Computer Science, Queen Mary University of London, London, United Kingdom
- Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
| |
Collapse
|
15
|
Stupacher J, Matthews TE, Pando-Naude V, Foster Vander Elst O, Vuust P. The sweet spot between predictability and surprise: musical groove in brain, body, and social interactions. Front Psychol 2022; 13:906190. [PMID: 36017431 PMCID: PMC9396343 DOI: 10.3389/fpsyg.2022.906190] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/28/2022] [Accepted: 07/19/2022] [Indexed: 11/30/2022] Open
Abstract
Groove, defined as the pleasurable urge to move to a rhythm, depends on a fine-tuned interplay between predictability arising from repetitive rhythmic patterns, and surprise arising from rhythmic deviations, for example in the form of syncopation. The perfect balance between predictability and surprise is commonly found in rhythmic patterns with a moderate level of rhythmic complexity and represents the sweet spot of the groove experience. In contrast, rhythms with low or high complexity are usually associated with a weaker experience of groove because they are too boring to be engaging or too complex to be interpreted, respectively. Consequently, the relationship between rhythmic complexity and groove experience can be described by an inverted U-shaped function. We interpret this inverted U shape in light of the theory of predictive processing and provide perspectives on how rhythmic complexity and groove can help us to understand the underlying neural mechanisms linking temporal predictions, movement, and reward. A better understanding of these mechanisms can guide future approaches to improve treatments for patients with motor impairments, such as Parkinson's disease, and to investigate prosocial aspects of interpersonal interactions that feature music, such as dancing. Finally, we present some open questions and ideas for future research.
Collapse
Affiliation(s)
- Jan Stupacher
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music Aarhus/Aalborg, Aarhus, Denmark
- Institute of Psychology, University of Graz, Graz, Austria
| | - Tomas Edward Matthews
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music Aarhus/Aalborg, Aarhus, Denmark
| | - Victor Pando-Naude
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music Aarhus/Aalborg, Aarhus, Denmark
| | - Olivia Foster Vander Elst
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music Aarhus/Aalborg, Aarhus, Denmark
| | - Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music Aarhus/Aalborg, Aarhus, Denmark
| |
Collapse
|
16
|
Herbst SK, Stefanics G, Obleser J. Endogenous modulation of delta phase by expectation–A replication of Stefanics et al., 2010. Cortex 2022; 149:226-245. [DOI: 10.1016/j.cortex.2022.02.001] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/25/2021] [Revised: 01/31/2022] [Accepted: 02/01/2022] [Indexed: 11/03/2022]
|
17
|
Cannon J. Correction: Expectancy-based rhythmic entrainment as continuous Bayesian inference. PLoS Comput Biol 2021; 17:e1009692. [PMID: 34898606 PMCID: PMC8668110 DOI: 10.1371/journal.pcbi.1009692] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022] Open
Abstract
[This corrects the article DOI: 10.1371/journal.pcbi.1009025.].
Collapse
|