1
Barbaresi M, Nardo D, Fagioli S. Physiological Entrainment: A Key Mind-Body Mechanism for Cognitive, Motor and Affective Functioning, and Well-Being. Brain Sci 2024; 15:3. PMID: 39851371; PMCID: PMC11763407; DOI: 10.3390/brainsci15010003.
Abstract
BACKGROUND: The human sensorimotor system can naturally synchronize with environmental rhythms, such as light pulses or sound beats. Several studies have shown that different styles and tempos of music, or other rhythmic stimuli, have an impact on physiological rhythms, including electrocortical brain activity, heart rate, and motor coordination. Such synchronization, also known as the "entrainment effect", has been identified as a crucial mechanism impacting cognitive, motor, and affective functioning. OBJECTIVES: This review examines theoretical and empirical contributions to the literature on entrainment, with a particular focus on the physiological mechanisms underlying this phenomenon and its role in cognitive, motor, and affective functions. We also address the inconsistent terminology used in the literature and evaluate the range of measurement approaches used to assess entrainment phenomena. Finally, we propose a definition of "physiological entrainment" that emphasizes its role as a fundamental mechanism encompassing rhythmic interactions between the body and its environment, supporting information processing across bodily systems and sustaining adaptive motor responses. METHODS: We reviewed the recent literature through the lens of the "embodied cognition" framework, offering a unified perspective on the phenomenon of physiological entrainment. RESULTS: Evidence from the current literature suggests that physiological entrainment produces measurable effects, especially on neural oscillations, heart rate variability, and motor synchronization. In turn, such physiological changes can impact cognitive processing, affective functioning, and motor coordination. CONCLUSIONS: Physiological entrainment emerges as a fundamental mechanism underlying the mind-body connection. Entrainment-based interventions may promote well-being by enhancing cognitive, motor, and affective functions, suggesting potential rehabilitative approaches for improving mental health.
Affiliation(s)
- Davide Nardo
- Department of Education, “Roma Tre” University, 00185 Rome, Italy; (M.B.); (S.F.)
2
Uemura M, Katagiri Y, Imai E, Kawahara Y, Otani Y, Ichinose T, Kondo K, Kowa H. Dorsal Anterior Cingulate Cortex Coordinates Contextual Mental Imagery for Single-Beat Manipulation during Rhythmic Sensorimotor Synchronization. Brain Sci 2024; 14:757. PMID: 39199452; PMCID: PMC11352649; DOI: 10.3390/brainsci14080757.
Abstract
Flexible pulse-by-pulse regulation of sensorimotor synchronization is crucial for voluntarily producing rhythmic behaviors in synchrony with external cues; however, the underlying neurophysiological mechanisms remain unclear. We hypothesized that the dorsal anterior cingulate cortex (dACC) plays a key role by coordinating both proactive and reactive motor outcomes based on contextual mental imagery. To test this hypothesis, a missing-oddball task in a finger-tapping paradigm was conducted with 33 healthy young volunteers. The dynamic properties of the dACC were evaluated by event-related deep-brain activity (ER-DBA), supported by event-related potential (ERP) analysis and behavioral evaluation based on signal detection theory. We found that ER-DBA activation/deactivation reflected a strategic choice of motor control modality in accordance with mental imagery. Reverse ERP traces, observed as omission responses, confirmed that the imagery was contextual. We found that mental imagery was updated only by environmental changes, via perceptual evidence and response-based abductive reasoning. Moreover, stable on-pulse tapping was achievable by maintaining proactive control while creating imagery of syncopated rhythms from simple beat trains, whereas accuracy was degraded by frequent erroneous tapping at missing pulses. We conclude that the dACC voluntarily regulates rhythmic sensorimotor synchronization by utilizing contextual mental imagery based on experience and by creating novel rhythms.
Affiliation(s)
- Maho Uemura
- Department of Rehabilitation Science, Kobe University Graduate School of Health Sciences, Kobe 654-0142, Japan; (Y.O.); (H.K.)
- School of Music, Mukogawa Women’s University, Nishinomiya 663-8558, Japan;
- Yoshitada Katagiri
- Department of Bioengineering, School of Engineering, The University of Tokyo, Tokyo 113-8655, Japan;
- Emiko Imai
- Department of Biophysics, Kobe University Graduate School of Health Sciences, Kobe 654-0142, Japan;
- Yasuhiro Kawahara
- Department of Human Life and Health Sciences, Division of Arts and Sciences, The Open University of Japan, Chiba 261-8586, Japan;
- Yoshitaka Otani
- Department of Rehabilitation Science, Kobe University Graduate School of Health Sciences, Kobe 654-0142, Japan; (Y.O.); (H.K.)
- Faculty of Rehabilitation, Kobe International University, Kobe 658-0032, Japan
- Tomoko Ichinose
- School of Music, Mukogawa Women’s University, Nishinomiya 663-8558, Japan;
- Hisatomo Kowa
- Department of Rehabilitation Science, Kobe University Graduate School of Health Sciences, Kobe 654-0142, Japan; (Y.O.); (H.K.)
3
Jacoby N, Polak R, Grahn JA, Cameron DJ, Lee KM, Godoy R, Undurraga EA, Huanca T, Thalwitzer T, Doumbia N, Goldberg D, Margulis EH, Wong PCM, Jure L, Rocamora M, Fujii S, Savage PE, Ajimi J, Konno R, Oishi S, Jakubowski K, Holzapfel A, Mungan E, Kaya E, Rao P, Rohit MA, Alladi S, Tarr B, Anglada-Tort M, Harrison PMC, McPherson MJ, Dolan S, Durango A, McDermott JH. Commonality and variation in mental representations of music revealed by a cross-cultural comparison of rhythm priors in 15 countries. Nat Hum Behav 2024; 8:846-877. PMID: 38438653; PMCID: PMC11132990; DOI: 10.1038/s41562-023-01800-9.
Abstract
Music is present in every known society but varies from place to place. What, if anything, is universal to music cognition? We measured a signature of mental representations of rhythm in 39 participant groups in 15 countries, spanning urban societies and Indigenous populations. Listeners reproduced random 'seed' rhythms; their reproductions were fed back as the stimulus (as in the game of 'telephone'), such that their biases (the prior) could be estimated from the distribution of reproductions. Every tested group showed a sparse prior with peaks at integer-ratio rhythms. However, the importance of different integer ratios varied across groups, often reflecting local musical practices. Our results suggest a common feature of music cognition: discrete rhythm 'categories' at small-integer ratios. These discrete representations plausibly stabilize musical systems in the face of cultural transmission but interact with culture-specific traditions to yield the diversity that is evident when mental representations are probed across many cultures.
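A rough sketch (not the authors' pipeline) of how iterated reproduction can expose a rhythm prior: each simulated "listener" reproduces a rhythm with motor noise plus a pull toward an integer-ratio category, and the distribution of reproductions after several iterations approximates the prior. The prototype set, pull strength, and noise level below are illustrative assumptions.

import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical prototype rhythms: 3-interval patterns at small-integer ratios,
# each normalized so its intervals sum to 1 (a fixed-duration loop).
prototypes = np.array([np.array(p, float) / sum(p)
                       for p in itertools.product([1, 2, 3], repeat=3)])

def reproduce(rhythm, pull=0.3, noise=0.04):
    """One 'telephone' step: drift toward the nearest integer-ratio prototype, add motor noise."""
    nearest = prototypes[np.argmin(np.abs(prototypes - rhythm).sum(axis=1))]
    out = rhythm + pull * (nearest - rhythm) + rng.normal(0, noise, 3)
    out = np.clip(out, 0.05, None)
    return out / out.sum()          # re-normalize to the loop duration

# Many independent chains, each started from a random seed rhythm and iterated 5 times;
# the distribution of final reproductions is a crude estimate of the prior.
finals = []
for _ in range(2000):
    r = rng.dirichlet([1, 1, 1])    # random seed rhythm
    for _ in range(5):
        r = reproduce(r)
    finals.append(r)
finals = np.array(finals)

# Reproductions should pile up near integer-ratio prototypes such as 1:1:1 or 2:1:1.
print("mean reproduction:", finals.mean(axis=0).round(3))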
Affiliation(s)
- Nori Jacoby
- Computational Auditory Perception Group, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany.
- Presidential Scholars in Society and Neuroscience, Columbia University, New York, NY, USA.
- Rainer Polak
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Blindern, Oslo, Norway
- Jessica A Grahn
- Brain and Mind Institute and Department of Psychology, University of Western Ontario, London, Ontario, Canada
- Daniel J Cameron
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, Ontario, Canada
- Kyung Myun Lee
- School of Digital Humanities and Social Sciences, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea
- Graduate School of Culture Technology, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea
- Ricardo Godoy
- Heller School for Social Policy and Management, Brandeis University, Waltham, MA, USA
- Eduardo A Undurraga
- Escuela de Gobierno, Pontificia Universidad Católica de Chile, Santiago, Chile
- CIFAR Azrieli Global Scholars programme, CIFAR, Toronto, Ontario, Canada
- Tomás Huanca
- Centro Boliviano de Investigación y Desarrollo Socio Integral, San Borja, Bolivia
- Noumouké Doumbia
- Sciences de l'Education, Université Catholique d'Afrique de l'Ouest, Bamako, Mali
- Daniel Goldberg
- Department of Music, University of Connecticut, Storrs, CT, USA
- Patrick C M Wong
- Department of Linguistics & Modern Languages and Brain and Mind Institute, Chinese University of Hong Kong, Hong Kong SAR, China
- Luis Jure
- School of Music, Universidad de la República, Montevideo, Uruguay
- Martín Rocamora
- Signal Processing Department, School of Engineering, Universidad de la República, Montevideo, Uruguay
- Music Technology Group, Universitat Pompeu Fabra, Barcelona, Spain
- Shinya Fujii
- Faculty of Environment and Information Studies, Keio University, Fujisawa, Japan
- Patrick E Savage
- Faculty of Environment and Information Studies, Keio University, Fujisawa, Japan
- School of Psychology, University of Auckland, Auckland, New Zealand
- Jun Ajimi
- Department of Traditional Japanese Music, Tokyo University of the Arts, Tokyo, Japan
- Rei Konno
- Faculty of Environment and Information Studies, Keio University, Fujisawa, Japan
- Sho Oishi
- Faculty of Environment and Information Studies, Keio University, Fujisawa, Japan
- Andre Holzapfel
- Division of Media Technology and Interaction Design, KTH Royal Institute of Technology, Stockholm, Sweden
- Esra Mungan
- Department of Psychology, Bogazici University, Istanbul, Turkey
- Ece Kaya
- Max Planck Research Group 'Neural and Environmental Rhythms', Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Cognitive Science Master Program, Bogazici University, Istanbul, Turkey
- Preeti Rao
- Department of Electrical Engineering, Indian Institute of Technology Bombay, Mumbai, India
- Mattur A Rohit
- Department of Electrical Engineering, Indian Institute of Technology Bombay, Mumbai, India
- Bronwyn Tarr
- Department of Cognitive and Evolutionary Anthropology, University of Oxford, Oxford, UK
- Department of Experimental Psychology, University of Oxford, Oxford, UK
- Manuel Anglada-Tort
- Computational Auditory Perception Group, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Department of Psychology, Goldsmiths, University of London, London, UK
- Peter M C Harrison
- Computational Auditory Perception Group, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Faculty of Music, University of Cambridge, Cambridge, UK
- Malinda J McPherson
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- Program in Speech and Hearing Biosciences and Technology, Harvard University, Cambridge, MA, USA
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Sophie Dolan
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- Department of Brain and Cognitive Sciences, Wellesley College, Wellesley, MA, USA
- Alex Durango
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA
- Neurosciences Graduate Program, Stanford University, Stanford, CA, USA
- Josh H McDermott
- Faculty of Music, University of Cambridge, Cambridge, UK.
- Program in Speech and Hearing Biosciences and Technology, Harvard University, Cambridge, MA, USA.
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA, USA.
- Center for Brains, Minds & Machines, Massachusetts Institute of Technology, Cambridge, MA, USA.
4
Bouwer FL, Háden GP, Honing H. Probing Beat Perception with Event-Related Potentials (ERPs) in Human Adults, Newborns, and Nonhuman Primates. Adv Exp Med Biol 2024; 1455:227-256. PMID: 38918355; DOI: 10.1007/978-3-031-60183-5_13.
Abstract
The aim of this chapter is to give an overview of how the perception of rhythmic temporal regularity such as a regular beat in music can be studied in human adults, human newborns, and nonhuman primates using event-related brain potentials (ERPs). First, we discuss different aspects of temporal structure in general, and musical rhythm in particular, and we discuss the possible mechanisms underlying the perception of regularity (e.g., a beat) in rhythm. Additionally, we highlight the importance of dissociating beat perception from the perception of other types of structure in rhythm, such as predictable sequences of temporal intervals, ordinal structure, and rhythmic grouping. In the second section of the chapter, we start with a discussion of auditory ERPs elicited by infrequent and frequent sounds: ERP responses to regularity violations, such as mismatch negativity (MMN), N2b, and P3, as well as early sensory responses to sounds, such as P1 and N1, have been shown to be instrumental in probing beat perception. Subsequently, we discuss how beat perception can be probed by comparing ERP responses to sounds in regular and irregular sequences, and by comparing ERP responses to sounds in different metrical positions in a rhythm, such as on and off the beat or on strong and weak beats. Finally, we will discuss previous research that has used the aforementioned ERPs and paradigms to study beat perception in human adults, human newborns, and nonhuman primates. In doing so, we consider the possible pitfalls and prospects of the technique, as well as future perspectives.
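For illustration only, a minimal sketch of the comparison logic described in this chapter: epoch a continuous recording around sound onsets, average the epochs per metrical position, and contrast on-beat with off-beat responses. The sampling rate, single placeholder channel, event spacing, and time windows are assumptions, not the authors' analysis pipeline.

import numpy as np

fs = 500                                        # assumed sampling rate (Hz)
eeg = np.random.randn(60 * fs)                  # placeholder single-channel recording
onsets = np.arange(2 * fs, 58 * fs, fs // 2)    # hypothetical sound onsets every 500 ms
on_beat = (np.arange(len(onsets)) % 2 == 0)     # assume every other event falls on the beat

def epoch(signal, samples, pre=0.1, post=0.4):
    """Cut fixed windows around event samples and baseline-correct to the pre-stimulus mean."""
    pre_s, post_s = int(pre * fs), int(post * fs)
    segs = np.stack([signal[s - pre_s: s + post_s] for s in samples])
    return segs - segs[:, :pre_s].mean(axis=1, keepdims=True)

erp_on  = epoch(eeg, onsets[on_beat]).mean(axis=0)     # average ERP for on-beat events
erp_off = epoch(eeg, onsets[~on_beat]).mean(axis=0)    # average ERP for off-beat events

# A beat-related modulation would show up as a systematic amplitude difference,
# e.g. 80-200 ms after onset (the epoch starts 100 ms before the onset).
win = slice(int(0.18 * fs), int(0.30 * fs))
print("on-beat minus off-beat mean amplitude:", (erp_on[win] - erp_off[win]).mean())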
Affiliation(s)
- Fleur L Bouwer
- Cognitive Psychology Unit, Institute of Psychology, Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands.
- Department of Psychology, Brain & Cognition, University of Amsterdam, Amsterdam, The Netherlands.
- Gábor P Háden
- Institute of Cognitive Neuroscience and Psychology, Budapest, Hungary
- Department of Telecommunications and Media Informatics, Faculty of Electrical Engineering and Informatics, Budapest University of Technology and Economics, Budapest, Hungary
- Henkjan Honing
- Music Cognition group (MCG), Institute for Logic, Language and Computation (ILLC), Amsterdam Brain and Cognition (ABC), University of Amsterdam, Amsterdam, The Netherlands
5
Fram NR, Berger J. Syncopation as Probabilistic Expectation: Conceptual, Computational, and Experimental Evidence. Cogn Sci 2023; 47:e13390. PMID: 38043104; DOI: 10.1111/cogs.13390.
Abstract
Definitions of syncopation share two characteristics: the presence of a meter or analogous hierarchical rhythmic structure and a displacement or contradiction of that structure. These attributes are translated in terms of a Bayesian theory of syncopation, where the syncopation of a rhythm is inferred based on a hierarchical structure that is, in turn, learned from the ongoing musical stimulus. Several experiments tested its simplest possible implementation, with equally weighted priors associated with different meters and independence of auditory events, which can be decomposed into two terms representing note density and deviation from a metric hierarchy. A computational simulation demonstrated that extant measures of syncopation fall into two distinct factors analogous to the terms in the simple Bayesian model. Next, a series of behavioral experiments found that perceived syncopation is significantly related to both terms, offering support for the general Bayesian construction of syncopation. However, we also found that the prior expectations associated with different metric structures are not equal across meters and that there is an interaction between density and hierarchical deviation, implying that auditory events are not independent from each other. Together, these findings provide evidence that syncopation is a manifestation of a form of temporal expectation that can be directly represented in Bayesian terms and offer a complementary, feature-driven approach to recent Bayesian models of temporal prediction.
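A toy two-term score in the spirit of the decomposition reported above (a note-density term plus a deviation-from-metric-hierarchy term). The metric weights and the equal weighting of the two terms are illustrative assumptions, not the paper's fitted Bayesian model.

import numpy as np

# Hypothetical metric hierarchy for one 4/4 bar at sixteenth-note resolution:
# higher weight = metrically stronger position (downbeat strongest).
metric_weight = np.array([4, 1, 2, 1, 3, 1, 2, 1, 3, 1, 2, 1, 3, 1, 2, 1], float)

def syncopation_score(onsets, w_density=1.0, w_deviation=1.0):
    """Two-term estimate: note density plus the onsets' deviation from the metric hierarchy."""
    onsets = np.asarray(onsets, float)            # 1 = note onset, 0 = rest, per grid position
    density = onsets.sum() / len(onsets)
    strength = metric_weight / metric_weight.max()
    # Onsets on weak positions (low metric weight) count as more syncopated.
    deviation = ((1 - strength) * onsets).sum() / max(onsets.sum(), 1)
    return w_density * density + w_deviation * deviation

straight = [1,0,0,0, 1,0,0,0, 1,0,0,0, 1,0,0,0]   # onsets on the beat
offbeat  = [0,0,1,0, 0,0,1,0, 0,0,1,0, 0,0,1,0]   # onsets displaced to weak positions
print(syncopation_score(straight), syncopation_score(offbeat))   # the displaced pattern scores higher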
Affiliation(s)
- Noah R Fram
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University
- Department of Otolaryngology, Vanderbilt University Medical Center
- Jonathan Berger
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University
6
Kaplan T, Jamone L, Pearce M. Probabilistic modelling of microtiming perception. Cognition 2023; 239:105532. PMID: 37442021; DOI: 10.1016/j.cognition.2023.105532.
Abstract
Music performances are rich in systematic temporal irregularities called "microtiming", too fine-grained to be notated in a musical score but important for musical expression and communication. Several studies have examined listeners' preference for rhythms varying in microtiming, but few have addressed precisely how microtiming is perceived, especially in terms of cognitive mechanisms, making the empirical evidence difficult to interpret. Here we provide evidence that microtiming perception can be simulated as a process of probabilistic prediction. Participants performed an XAB discrimination test, in which an archetypal popular drum rhythm was presented with different microtiming. The results indicate that listeners could implicitly discriminate the mean and variance of stimulus microtiming. Furthermore, their responses were effectively simulated by a Bayesian model of entrainment, using a distance function derived from its dynamic posterior estimate over phase. Wide individual differences in participant sensitivity to microtiming were predicted by a model parameter likened to noisy timekeeping processes in the brain. Overall, this suggests that the cognitive mechanisms underlying perception of microtiming reflect a continuous inferential process, potentially driving qualitative judgements of rhythmic feel.
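A minimal sketch of the kind of probabilistic phase tracking the abstract describes (not the published model): a discretized posterior over beat phase is updated at each onset, with a single noise parameter standing in for the listener's internal timekeeping noise; microtiming shifts the likelihood and hence the posterior estimate.

import numpy as np

def track_phase(onset_times, period=0.5, timing_noise=0.02, grid=200):
    """Grid-based Bayesian filter over beat phase (in seconds within one period)."""
    phases = np.linspace(0.0, period, grid, endpoint=False)
    posterior = np.full(grid, 1.0 / grid)                     # flat prior over phase
    estimates = []
    for t in onset_times:
        # Likelihood: an onset is expected near the tracked phase, with Gaussian timing noise.
        err = (t - phases + period / 2) % period - period / 2  # wrapped phase error
        like = np.exp(-0.5 * (err / timing_noise) ** 2)
        posterior = posterior * like
        posterior = posterior / posterior.sum()
        posterior = 0.98 * posterior + 0.02 / grid             # small diffusion models drift between onsets
        estimates.append(phases[np.argmax(posterior)])
    return np.array(estimates)

# Onsets every 500 ms with a systematic +15 ms microtiming offset plus jitter:
rng = np.random.default_rng(1)
onsets = np.arange(8) * 0.5 + 0.015 + rng.normal(0, 0.005, 8)
print(track_phase(onsets).round(3))   # the phase estimate should settle near 0.015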
Affiliation(s)
- Thomas Kaplan
- School of Electronic Engineering & Computer Science, Queen Mary University of London, London, United Kingdom.
- Lorenzo Jamone
- School of Engineering & Materials Science, Queen Mary University of London, London, United Kingdom
- Marcus Pearce
- School of Electronic Engineering & Computer Science, Queen Mary University of London, London, United Kingdom; Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
7
Large EW, Roman I, Kim JC, Cannon J, Pazdera JK, Trainor LJ, Rinzel J, Bose A. Dynamic models for musical rhythm perception and coordination. Front Comput Neurosci 2023; 17:1151895. PMID: 37265781; PMCID: PMC10229831; DOI: 10.3389/fncom.2023.1151895.
Abstract
Rhythmicity permeates large parts of human experience. Humans generate various motor and brain rhythms spanning a range of frequencies. We also experience and synchronize to externally imposed rhythmicity, for example from music and song or from the 24-h light-dark cycles of the sun. In the context of music, humans have the ability to perceive, generate, and anticipate rhythmic structures, for example, "the beat." Experimental and behavioral studies offer clues about the biophysical and neural mechanisms that underlie our rhythmic abilities, and about the different brain areas that are involved, but many open questions remain. In this paper, we review several theoretical and computational approaches, each centered at different levels of description, that address specific aspects of musical rhythmic generation, perception, attention, perception-action coordination, and learning. We survey methods and results from applications of dynamical systems theory, neuro-mechanistic modeling, and Bayesian inference. Some frameworks rely on synchronization of intrinsic brain rhythms that span the relevant frequency range; some formulations involve real-time adaptation schemes for error-correction to align the phase and frequency of a dedicated circuit; others involve learning and dynamically adjusting expectations to make rhythm tracking predictions. Each of the approaches, while initially designed to answer specific questions, offers the possibility of being integrated into a larger framework that provides insights into our ability to perceive and generate rhythmic patterns.
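A minimal sketch of one model family surveyed here: event-based phase and period correction of an internal beat toward a periodic stimulus. The coupling constants and starting tempo are illustrative assumptions, not taken from any specific model in the review.

import numpy as np

def entrain(stimulus_onsets, period0=0.6, alpha=0.5, beta=0.1):
    """Error correction: adjust the phase and period of an internal beat toward stimulus onsets."""
    period = period0
    beat = stimulus_onsets[0]            # align the first internal beat with the first onset
    asynchronies = []
    for onset in stimulus_onsets[1:]:
        beat = beat + period             # predict the next internal beat
        asyn = beat - onset              # positive = internal beat came late
        asynchronies.append(asyn)
        beat -= alpha * asyn             # phase correction (partial realignment)
        period -= beta * asyn            # period correction (tempo adaptation)
    return np.array(asynchronies)

# Isochronous stimulus at a 500 ms inter-onset interval; the internal period starts at 600 ms.
onsets = np.arange(20) * 0.5
print(entrain(onsets).round(3))   # asynchronies should shrink toward zero as the beat entrains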
Affiliation(s)
- Edward W. Large
- Department of Psychological Sciences, University of Connecticut, Mansfield, CT, United States
- Department of Physics, University of Connecticut, Mansfield, CT, United States
- Iran Roman
- Music and Audio Research Laboratory, New York University, New York, NY, United States
- Ji Chul Kim
- Department of Psychological Sciences, University of Connecticut, Mansfield, CT, United States
- Jonathan Cannon
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Jesse K. Pazdera
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Laurel J. Trainor
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- John Rinzel
- Center for Neural Science, New York University, New York, NY, United States
- Courant Institute of Mathematical Sciences, New York University, New York, NY, United States
- Amitabha Bose
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, NJ, United States
8
Senn O. A predictive coding approach to modelling the perceived complexity of popular music drum patterns. Heliyon 2023; 9:e15199. PMID: 37123947; PMCID: PMC10130781; DOI: 10.1016/j.heliyon.2023.e15199.
Abstract
This study presents a method to estimate the complexity of popular music drum patterns based on a core idea from predictive coding. Specifically, it postulates that the complexity of a drum pattern depends on the quantity of surprisal it causes in the listener. Surprisal, according to predictive coding theory, is a numerical measure that takes large values when the perceiver's internal model of the surrounding world fails to predict the actual stream of sensory data (i.e., when the perception surprises the perceiver), and low values if model predictions and sensory data agree. The proposed new method first approximates a listener's internal model of a popular music drum pattern (using ideas on enculturation and a Bayesian learning process). It then quantifies the listener's surprisal by evaluating the discrepancies between the predictions of the internal model and the actual drum pattern. It finally estimates drum pattern complexity from surprisal. The method was optimised and tested using a set of forty popular music drum patterns, for which empirical perceived complexity measurements are available. The new method provided complexity estimates that had a good fit with the empirical measurements (R² = .852). The method was implemented as an R script that can be used to estimate the complexity of popular music drum patterns in the future. Simulations indicate that we can expect the method to predict perceived complexity with a good fit (R² ≥ .709) in 99% of drum pattern sets randomly drawn from the Western popular music repertoire. These results suggest that surprisal indeed captures essential aspects of complexity, and that it may serve as a basis for a general theory of perceived complexity.
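A stripped-down illustration of the surprisal idea (not the published R script): per-position onset probabilities learned from a tiny made-up "enculturation" corpus stand in for the listener's internal model, and a pattern's complexity is estimated as its average surprisal, -log p, under that model.

import numpy as np

# Tiny hypothetical 'enculturation' corpus: onsets per sixteenth-note position in one bar of 4/4.
corpus = np.array([
    [1,0,0,0, 0,0,0,0, 1,0,0,0, 0,0,0,0],
    [1,0,0,0, 0,0,1,0, 1,0,0,0, 0,0,0,0],
    [1,0,0,0, 0,0,0,0, 1,0,1,0, 0,0,0,0],
])

# Internal model: per-position onset probability with Laplace smoothing (a crude Bayesian estimate).
p_onset = (corpus.sum(axis=0) + 1) / (corpus.shape[0] + 2)

def complexity(pattern):
    """Mean surprisal (bits) of a drum pattern under the per-position onset model."""
    pattern = np.asarray(pattern)
    p = np.where(pattern == 1, p_onset, 1 - p_onset)   # probability of what actually happened
    return float(np.mean(-np.log2(p)))

typical    = [1,0,0,0, 0,0,0,0, 1,0,0,0, 0,0,0,0]
surprising = [0,1,0,1, 0,0,0,1, 0,0,1,0, 1,0,0,1]
print(complexity(typical), complexity(surprising))     # the atypical pattern should score higher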