1
Fiveash A, Ferreri L, Bouwer FL, Kösem A, Moghimi S, Ravignani A, Keller PE, Tillmann B. Can rhythm-mediated reward boost learning, memory, and social connection? Perspectives for future research. Neurosci Biobehav Rev 2023; 149:105153. [PMID: 37019245 DOI: 10.1016/j.neubiorev.2023.105153]
Abstract
Studies of rhythm processing and of reward have progressed separately, with little connection between the two. However, consistent links between rhythm and reward are beginning to surface, with research suggesting that synchronizing to rhythm is rewarding and that this reward may in turn strengthen synchronization. This mini-review shows that studying rhythm and reward together can clarify their independent and combined roles in two central aspects of cognition that have so far been studied largely independently: 1) learning and memory, and 2) social connection and interpersonal synchronization. On this basis, we discuss how connections between rhythm and reward can be applied to learning and memory and to social connection across different populations, taking into account individual differences, clinical populations, human development, and animal research. Future research will need to consider that rhythm is rewarding and that rhythm can in turn boost reward, potentially enhancing other cognitive and social processes.
Affiliation(s)
- A Fiveash
- Lyon Neuroscience Research Center, CRNL, CNRS, UMR 5292, INSERM U1028, F-69000 Lyon, France; University of Lyon 1, Lyon, France; The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia.
- L Ferreri
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy; Laboratoire d'Étude des Mécanismes Cognitifs, Université Lumière Lyon 2, Lyon, France
- F L Bouwer
- Department of Psychology, Brain and Cognition, University of Amsterdam, Amsterdam, the Netherlands
- A Kösem
- Lyon Neuroscience Research Center, CRNL, CNRS, UMR 5292, INSERM U1028, F-69000 Lyon, France
- S Moghimi
- Groupe de Recherches sur l'Analyse Multimodale de la Fonction Cérébrale, INSERM U1105, Amiens, France
- A Ravignani
- Comparative Bioacoustics Group, Max Planck Institute for Psycholinguistics, 6525 XD Nijmegen, the Netherlands; Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark
- P E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia; Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark
- B Tillmann
- Lyon Neuroscience Research Center, CRNL, CNRS, UMR 5292, INSERM U1028, F-69000 Lyon, France; University of Lyon 1, Lyon, France; Laboratory for Research on Learning and Development, LEAD - CNRS UMR5022, Université de Bourgogne, Dijon, France
2
Abstract
Music is ubiquitous across human cultures - as a source of affective and pleasurable experience, moving us both physically and emotionally - and learning to play music shapes both brain structure and brain function. Music processing in the brain - namely, the perception of melody, harmony and rhythm - has traditionally been studied as an auditory phenomenon using passive listening paradigms. However, when listening to music, we actively generate predictions about what is likely to happen next. This enactive aspect has led to a more comprehensive understanding of music processing involving brain structures implicated in action, emotion and learning. Here we review the cognitive neuroscience literature on music perception. We show that music perception, action, emotion and learning all rest on the human brain's fundamental capacity for prediction - as formulated by the predictive coding of music model. This Review elucidates how this formulation of music perception and expertise in individuals can be extended to account for the dynamics and underlying brain mechanisms of collective music making. This in turn has important implications for human creativity as evinced by music improvisation. These recent advances shed new light on what makes music meaningful from a neuroscientific perspective.
Affiliation(s)
- Peter Vuust
- Center for Music in the Brain, Aarhus University and The Royal Academy of Music (Det Jyske Musikkonservatorium), Aarhus, Denmark.
- Ole A Heggli
- Center for Music in the Brain, Aarhus University and The Royal Academy of Music (Det Jyske Musikkonservatorium), Aarhus, Denmark
- Karl J Friston
- Wellcome Centre for Human Neuroimaging, University College London, London, UK
- Morten L Kringelbach
- Center for Music in the Brain, Aarhus University and The Royal Academy of Music (Det Jyske Musikkonservatorium), Aarhus, Denmark; Department of Psychiatry, University of Oxford, Oxford, UK; Centre for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, Oxford, UK
3
Niarchou M, Gustavson DE, Sathirapongsasuti JF, Anglada-Tort M, Eising E, Bell E, McArthur E, Straub P, McAuley JD, Capra JA, Ullén F, Creanza N, Mosing MA, Hinds DA, Davis LK, Jacoby N, Gordon RL. Genome-wide association study of musical beat synchronization demonstrates high polygenicity. Nat Hum Behav 2022; 6:1292-1309. [PMID: 35710621 PMCID: PMC9489530 DOI: 10.1038/s41562-022-01359-x]
Abstract
Moving in synchrony to the beat is a fundamental component of musicality. Here we conducted a genome-wide association study to identify common genetic variants associated with beat synchronization in 606,825 individuals. Beat synchronization exhibited a highly polygenic architecture, with 69 loci reaching genome-wide significance (P < 5 × 10⁻⁸) and single-nucleotide-polymorphism-based heritability (on the liability scale) of 13%-16%. Heritability was enriched for genes expressed in brain tissues and for fetal and adult brain-specific gene regulatory elements, underscoring the role of central-nervous-system-expressed genes linked to the genetic basis of the trait. We performed validations of the self-report phenotype (through separate experiments) and of the genome-wide association study (polygenic scores for beat synchronization were associated with patients algorithmically classified as musicians in medical records of a separate biobank). Genetic correlations with breathing function, motor function, processing speed and chronotype suggest shared genetic architecture with beat synchronization and provide avenues for new phenotypic and genetic explorations.
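The polygenic-score validation mentioned in the abstract rests on a simple computation: an individual's score is the sum of GWAS effect sizes weighted by that person's effect-allele counts. A minimal sketch in Python, with hypothetical variant IDs and weights (not values from the study):

```python
# Polygenic score: sum over SNPs of (effect size x effect-allele count).
# Variant IDs and weights below are hypothetical, for illustration only.
gwas_weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.08}

def polygenic_score(genotype, weights):
    """genotype maps SNP id -> count of effect alleles (0, 1, or 2)."""
    return sum(w * genotype.get(snp, 0) for snp, w in weights.items())

person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(round(polygenic_score(person, gwas_weights), 2))  # 0.19
```

In the study's design, such scores computed in a separate biobank were then tested for association with an algorithmically derived musician label.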
Affiliation(s)
- Maria Niarchou
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA; Division of Genetic Medicine, Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, USA
- Daniel E. Gustavson
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA; Division of Genetic Medicine, Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, USA
- Manuel Anglada-Tort
- Computational Auditory Perception Group, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Else Eising
- Department of Language and Genetics, Max Planck Institute for Psycholinguistics, Nijmegen, Netherlands
- Eamonn Bell
- Department of Music, Columbia University, New York, NY, USA; Department of Computer Science, Durham University, Durham, UK
- Evonne McArthur
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA
- Peter Straub
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA
- J. Devin McAuley
- Department of Psychology, Michigan State University, East Lansing, MI, USA
- John A. Capra
- Bakar Computational Health Sciences Institute, University of California, San Francisco, CA, USA; Department of Epidemiology & Biostatistics, University of California, San Francisco, CA, USA
- Fredrik Ullén
- Department of Neuroscience, Karolinska Institutet, Solna, Sweden; Department of Cognitive Neuropsychology, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Nicole Creanza
- Department of Biological Sciences, Vanderbilt University, Nashville, TN, USA; Evolutionary Studies Initiative, Vanderbilt University, Nashville, TN, USA
- Miriam A. Mosing
- Department of Neuroscience, Karolinska Institutet, Solna, Sweden; Department of Cognitive Neuropsychology, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Melbourne School of Psychological Sciences, University of Melbourne, Melbourne, Victoria, Australia
- David A. Hinds
- 23andMe, Inc., Sunnyvale, CA, USA
- Lea K. Davis
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA; Division of Genetic Medicine, Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Molecular Physiology and Biophysics, Vanderbilt University, Nashville, TN, USA
- Nori Jacoby
- Computational Auditory Perception Group, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Reyna L. Gordon
- Vanderbilt Genetics Institute, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Otolaryngology—Head & Neck Surgery, Vanderbilt University Medical Center, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
4
Lenc T, Merchant H, Keller PE, Honing H, Varlet M, Nozaradan S. Mapping between sound, brain and behaviour: four-level framework for understanding rhythm processing in humans and non-human primates. Philos Trans R Soc Lond B Biol Sci 2021; 376:20200325. [PMID: 34420381 PMCID: PMC8380981 DOI: 10.1098/rstb.2020.0325]
Abstract
Humans perceive and spontaneously move to one or several levels of periodic pulses (a meter, for short) when listening to musical rhythm, even when the sensory input does not provide prominent periodic cues to their temporal location. Here, we review a multi-levelled framework for understanding how external rhythmic inputs are mapped onto internally represented metric pulses. This mapping is studied using an approach to quantify and directly compare representations of metric pulses in signals corresponding to sensory inputs, neural activity and behaviour (typically body movement). Based on this approach, recent empirical evidence can be drawn together into a conceptual framework that unpacks the phenomenon of meter into four levels. Each level highlights specific functional processes that critically enable and shape the mapping from sensory input to internal meter. We discuss the nature, constraints and neural substrates of these processes, starting with fundamental mechanisms investigated in macaque monkeys that enable basic forms of mapping between simple rhythmic stimuli and an internally represented metric pulse. We propose that human evolution has gradually built a robust and flexible system upon these fundamental processes, allowing more complex levels of mapping to emerge in musical behaviours. This approach opens promising avenues to understand the many facets of rhythmic behaviours across individuals and species. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
Affiliation(s)
- Tomas Lenc
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- Hugo Merchant
- Instituto de Neurobiologia, UNAM, Campus Juriquilla, Querétaro 76230, Mexico
- Peter E. Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Henkjan Honing
- Amsterdam Brain and Cognition (ABC), Institute for Logic, Language and Computation (ILLC), University of Amsterdam, Amsterdam 1090 GE, The Netherlands
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- School of Psychology, Western Sydney University, Penrith, New South Wales 2751, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
5
Jacoby N, Polak R, London J. Extreme precision in rhythmic interaction is enabled by role-optimized sensorimotor coupling: analysis and modelling of West African drum ensemble music. Philos Trans R Soc Lond B Biol Sci 2021; 376:20200331. [PMID: 34420391 PMCID: PMC8380984 DOI: 10.1098/rstb.2020.0331]
Abstract
Human social interactions often involve carefully synchronized behaviours. Musical performance in particular features precise timing and depends on the differentiation and coordination of musical/social roles. Here, we study the influence of musical/social roles, individual musicians and different ensembles on rhythmic synchronization in Malian drum ensemble music, which features synchronization accuracy near the limits of human performance. We analysed 72 recordings of the same piece performed by four trios, in which two drummers in each trio systematically switched roles (lead versus accompaniment). Musical role, rather than individual or group differences, is the main factor influencing synchronization accuracy. Using linear causal modelling, we found a consistent pattern of bi-directional couplings between players, in which the direction and strength of rhythmic adaptation is asymmetrically distributed across musical roles. This differs from notions of musical leadership, which assume that ensemble synchronization relies predominantly on a single dominant personality and/or musical role. We then ran simulations that varied the direction and strength of sensorimotor coupling and found that the coupling pattern used by the Malian musicians affords nearly optimal synchronization. More broadly, our study showcases the importance of ecologically valid and culturally diverse studies of human behaviour. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
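The coupling simulations described in the abstract follow the general logic of linear phase-correction models of ensemble timing: each player shifts their next onset by a weighted sum of their asynchronies with the other players. The toy two-player sketch below uses hypothetical gain values, not the couplings fitted from the Malian recordings; it merely illustrates why distributing a given per-link gain bidirectionally tends to synchronize more tightly than concentrating the same gain in one direction:

```python
import random

def simulate_ensemble(alphas, n_beats=3000, period=400.0, noise_sd=10.0, seed=1):
    """Linear phase-correction model: alphas[i][j] is the gain with which
    player i corrects toward player j. Returns the SD (ms) of the pairwise
    asynchrony between players 0 and 1 across beats."""
    rng = random.Random(seed)
    n = len(alphas)
    t = [0.0] * n                      # each player's current onset time
    asyncs = []
    for _ in range(n_beats):
        # Next onset = previous onset + period - phase correction + motor noise
        t = [t[i] + period
             - sum(alphas[i][j] * (t[i] - t[j]) for j in range(n) if j != i)
             + rng.gauss(0.0, noise_sd)
             for i in range(n)]
        asyncs.append(t[0] - t[1])
    mean = sum(asyncs) / len(asyncs)
    return (sum((a - mean) ** 2 for a in asyncs) / len(asyncs)) ** 0.5

# Same per-link gain, distributed bidirectionally vs. concentrated one-way
sym = simulate_ensemble([[0.0, 0.25], [0.25, 0.0]])
one_way = simulate_ensemble([[0.0, 0.0], [0.25, 0.0]])
print(sym < one_way)   # bidirectional coupling yields tighter synchrony
```

This mirrors the study's broader point that mutual, asymmetrically weighted adaptation can outperform a purely leader-follower arrangement.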
Affiliation(s)
- Nori Jacoby
- Research Group Computational Auditory Perception, Max Planck Institute for Empirical Aesthetics, Grueneburgweg 14, 60322 Frankfurt, Germany
- The Center for Science and Society, Columbia University, New York, NY 10027, USA
- Rainer Polak
- Music Department, Max Planck Institute for Empirical Aesthetics, Grueneburgweg 14, 60322 Frankfurt, Germany
- Justin London
- Music Department, Carleton College, 1 North College Street, Northfield, MN 55057, USA
6
Hannon EE, Crittenden AN, Snyder JS, Nave KM. An evolutionary theory of music needs to care about developmental timing. Behav Brain Sci 2021; 44:e74. [PMID: 34588027 DOI: 10.1017/S0140525X20001168]
Abstract
Both target papers cite evidence from infancy and early childhood to support the notion of human musicality as a somewhat static suite of capacities; however, in our view they do not adequately acknowledge the critical role of developmental timing, the acquisition process, or the dynamics of social learning, especially during later periods of development such as middle childhood.
7
Cannon J. Expectancy-based rhythmic entrainment as continuous Bayesian inference. PLoS Comput Biol 2021; 17:e1009025. [PMID: 34106918 PMCID: PMC8216548 DOI: 10.1371/journal.pcbi.1009025]
Abstract
When presented with complex rhythmic auditory stimuli, humans are able to track underlying temporal structure (e.g., a "beat"), both covertly and with their movements. This capacity goes far beyond that of a simple entrained oscillator, drawing on contextual and enculturated timing expectations and adjusting rapidly to perturbations in event timing, phase, and tempo. Previous modeling work has described how entrainment to rhythms may be shaped by event timing expectations, but sheds little light on any underlying computational principles that could unify the phenomenon of expectation-based entrainment with other brain processes. Inspired by the predictive processing framework, we propose that the problem of rhythm tracking is naturally characterized as a problem of continuously estimating an underlying phase and tempo based on precise event times and their correspondence to timing expectations. We present two inference problems formalizing this insight: PIPPET (Phase Inference from Point Process Event Timing) and PATIPPET (Phase and Tempo Inference). Variational solutions to these inference problems resemble previous "Dynamic Attending" models of perceptual entrainment, but introduce new terms representing the dynamics of uncertainty and the influence of expectations in the absence of sensory events. These terms allow us to model multiple characteristics of covert and motor human rhythm tracking not addressed by other models, including sensitivity of error corrections to inter-event interval and perceived tempo changes induced by event omissions. We show that positing these novel influences in human entrainment yields a range of testable behavioral predictions. Guided by recent neurophysiological observations, we attempt to align the phase inference framework with a specific brain implementation. We also explore the potential of this normative framework to guide the interpretation of experimental data and serve as building blocks for even richer predictive processing and active inference models of timing.
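The inference problem can be caricatured in a few lines: maintain a posterior over metric phase, let it drift and diffuse between events, and sharpen it at each event with a timing-expectation template. The grid-based toy below is a loose illustration of that idea, not the paper's continuous-time variational model; all parameter values are invented:

```python
import math

N = 200                       # phase grid resolution; phase lives in [0, 1)
phases = [i / N for i in range(N)]

def normalize(p):
    s = sum(p)
    return [x / s for x in p]

def advance(post, dt, tempo=1.0, diffusion=0.02):
    """Shift the posterior forward by dt * tempo in phase, then blur it,
    so that uncertainty grows between events."""
    shift = int(round(dt * tempo * N)) % N
    moved = (post[-shift:] + post[:-shift]) if shift else post[:]
    sigma = max(diffusion * dt * N, 1e-6)     # blur width in grid units
    out = [sum(moved[(i + d) % N] * math.exp(-d * d / (2 * sigma ** 2))
               for d in range(-5, 6)) for i in range(N)]
    return normalize(out)

def observe(post, width=0.03):
    """Sharpen the posterior at an event: onsets are expected near phase 0."""
    like = [math.exp(-min(p, 1 - p) ** 2 / (2 * width ** 2)) for p in phases]
    return normalize([po * li for po, li in zip(post, like)])

post = [1.0 / N] * N                          # start maximally uncertain
for interval in [1.0, 1.0, 1.02, 0.98]:       # jittered isochronous onsets
    post = observe(advance(post, interval))

best = max(range(N), key=lambda i: post[i])
print(min(phases[best], 1 - phases[best]))    # posterior peak sits near phase 0
```

The full PIPPET model replaces this grid with a Gaussian variational posterior evolving in continuous time, and PATIPPET adds tempo as a second inferred state.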
Affiliation(s)
- Jonathan Cannon
- Department of Brain and Cognitive Science, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
8
Durojaye C, Fink L, Roeske T, Wald-Fuhrmann M, Larrouy-Maestri P. Perception of Nigerian Dùndún Talking Drum Performances as Speech-Like vs. Music-Like: The Role of Familiarity and Acoustic Cues. Front Psychol 2021; 12:652673. [PMID: 34093341 PMCID: PMC8173200 DOI: 10.3389/fpsyg.2021.652673]
Abstract
It seems trivial to identify sound sequences as music or speech, particularly when the sequences come from different sound sources, such as an orchestra and a human voice. Can we also easily distinguish these categories when the sequences come from the same sound source, and on the basis of which acoustic features? We investigated these questions by examining listeners' classification of sound sequences performed on an instrument that intertwines speech and music: the dùndún talking drum. The dùndún is commonly used in south-west Nigeria as a musical instrument but is also well suited to linguistic use as one of the speech surrogates described in Africa. One hundred seven participants from diverse geographical locations (15 different mother tongues represented) took part in an online experiment. Fifty-one participants reported being familiar with the dùndún talking drum, 55% of those being speakers of Yorùbá. During the experiment, participants listened to 30 dùndún samples of about 7 s each, performed either as music or as Yorùbá speech surrogate (n = 15 each) by a professional musician, and were asked to classify each sample as music-like or speech-like. The classification task revealed the ability of listeners to identify the samples as intended by the performer, particularly when they were familiar with the dùndún, though even unfamiliar participants performed above chance. A logistic regression predicting participants' classification of the samples from several acoustic features confirmed the perceptual relevance of intensity, pitch, timbre, and timing measures and their interaction with listener familiarity. In all, this study provides empirical evidence for the discriminating role of acoustic features and the modulatory role of familiarity in teasing apart speech and music.
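The logistic-regression analysis has the familiar generic form: the probability of a "speech-like" response is a sigmoid of a weighted sum of acoustic features. The sketch below fits such a model by plain stochastic gradient descent on synthetic two-feature data; the feature names, values, and clean separability are invented for illustration and are not the study's measurements:

```python
import math, random

rng = random.Random(0)

# Synthetic two-feature samples (e.g., intensity and pitch variability);
# label 1 = "speech-like". Values are invented, not from the study.
X, y = [], []
for _ in range(200):
    if rng.random() < 0.5:                              # "music-like" class
        X.append([rng.gauss(0.3, 0.1), rng.gauss(0.3, 0.1)]); y.append(0)
    else:                                               # "speech-like" class
        X.append([rng.gauss(0.7, 0.1), rng.gauss(0.7, 0.1)]); y.append(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(300):                                    # plain SGD epochs
    for xi, yi in zip(X, y):
        err = sigmoid(w[0] * xi[0] + w[1] * xi[1] + b) - yi
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
        b -= lr * err

acc = sum((sigmoid(w[0] * a + w[1] * c + b) > 0.5) == bool(t)
          for (a, c), t in zip(X, y)) / len(y)
print(acc)   # near-perfect on these well-separated toy classes
```

The study's actual model additionally included interaction terms between each acoustic feature and listener familiarity.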
Affiliation(s)
- Cecilia Durojaye
- Department of Music, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Department of Psychology, Arizona State University, Tempe, AZ, United States
- Lauren Fink
- Department of Music, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Max Planck-NYU Center for Language, Music, and Emotion, Frankfurt am Main, Germany
- Tina Roeske
- Department of Music, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Melanie Wald-Fuhrmann
- Department of Music, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Max Planck-NYU Center for Language, Music, and Emotion, Frankfurt am Main, Germany
- Pauline Larrouy-Maestri
- Max Planck-NYU Center for Language, Music, and Emotion, Frankfurt am Main, Germany; Neuroscience Department, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
9
Benedetto A, Baud-Bovy G. Tapping Force Encodes Metrical Aspects of Rhythm. Front Hum Neurosci 2021; 15:633956. [PMID: 33986651 PMCID: PMC8111927 DOI: 10.3389/fnhum.2021.633956]
Abstract
Humans possess the ability to extract highly organized perceptual structures from sequences of temporal stimuli. For instance, we can organize specific rhythmic patterns into hierarchical, or metrical, systems. Despite evidence of a fundamental influence of the motor system on this skill, few studies have investigated the organization of our motor representation of rhythm. To this aim, we studied, in musicians and non-musicians, the ability to perceive and reproduce different rhythms. In a first experiment, participants performed a temporal order-judgment task for rhythmic sequences presented via the auditory or tactile modality. In a second experiment, they were asked to reproduce the same rhythmic sequences while their tapping force and timing were recorded. We demonstrate that tapping force encodes the metrical aspect of the rhythm, and that the strength of this coding correlates with the individual's perceptual accuracy. We suggest that the similarity between the perceptual and tapping-force organization indicates a common representation of rhythm shared between the perceptual and motor systems.
Affiliation(s)
- Gabriel Baud-Bovy
- Robotics, Brain and Cognitive Science Unit, Italian Institute of Technology, Genoa, Italy
- Faculty of Psychology, Vita-Salute San Raffaele University, Milan, Italy
10
Lenc T, Keller PE, Varlet M, Nozaradan S. Neural and Behavioral Evidence for Frequency-Selective Context Effects in Rhythm Processing in Humans. Cereb Cortex Commun 2020; 1:tgaa037. [PMID: 34296106 PMCID: PMC8152888 DOI: 10.1093/texcom/tgaa037]
Abstract
When listening to music, people often perceive and move along with a periodic meter. However, the dynamics of mapping between meter perception and the acoustic cues to meter periodicities in the sensory input remain largely unknown. To capture these dynamics, we recorded electroencephalography while nonmusician and musician participants listened to nonrepeating rhythmic sequences in which acoustic cues to meter frequencies either gradually decreased (from regular to degraded) or increased (from degraded to regular). The results revealed greater neural activity selectively elicited at meter frequencies when the sequence gradually changed from regular to degraded compared with the opposite. Importantly, this effect was unlikely to arise from overall gain or low-level auditory processing, as revealed by physiological modeling. Moreover, the context effect was more pronounced in nonmusicians, who also demonstrated facilitated sensorimotor synchronization with the meter for sequences that started as regular. In contrast, musicians showed weaker effects of recent context in their neural responses and a robust ability to move along with the meter irrespective of stimulus degradation. Together, our results demonstrate that brain activity elicited by rhythm reflects not only passive tracking of stimulus features but also continuous integration of sensory input with recent context.
Affiliation(s)
- Tomas Lenc
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Peter E Keller
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Manuel Varlet
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- School of Psychology, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Sylvie Nozaradan
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal QC H3C 3J7, Canada
11
Abstract
This paper presents a model capable of learning the rhythmic characteristics of a music signal through unsupervised learning. The model learns a multi-layer hierarchy of rhythmic patterns ranging from simple structures on lower layers to more complex patterns on higher layers. The learned hierarchy is fully transparent, which enables observation and explanation of the structure of the learned patterns. The model employs tempo-invariant encoding of patterns and can thus learn and perform inference on tempo-varying and noisy input data. We demonstrate the model’s capabilities of learning distinctive rhythmic structures of different music genres using unsupervised learning. To test its robustness, we show how the model can efficiently extract rhythmic structures in songs with changing time signatures and live recordings. Additionally, the model’s time-complexity is empirically tested to show its usability for analysis-related applications.
13
Abstract
Enculturation is known to shape the perception of meter in music but this is not explicitly accounted for by current cognitive models of meter perception. We hypothesize that the induction of meter is a result of predictive coding: interpreting onsets in a rhythm relative to a periodic meter facilitates prediction of future onsets. Such prediction, we hypothesize, is based on previous exposure to rhythms. As such, predictive coding provides a possible explanation for the way meter perception is shaped by the cultural environment. Based on this hypothesis, we present a probabilistic model of meter perception that uses statistical properties of the relation between rhythm and meter to infer meter from quantized rhythms. We show that our model can successfully predict annotated time signatures from quantized rhythmic patterns derived from folk melodies. Furthermore, we show that by inferring meter, our model improves prediction of the onsets of future events compared to a similar probabilistic model that does not infer meter. Finally, as a proof of concept, we demonstrate how our model can be used in a simulation of enculturation. From the results of this simulation, we derive a class of rhythms that are likely to be interpreted differently by enculturated listeners with different histories of exposure to rhythms.
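The core inference step the model performs can be sketched as Bayes' rule over candidate meters, with likelihoods built from position-wise onset probabilities within the metrical cycle. The probability tables below are invented for illustration; in a corpus-trained model like the paper's, they would be estimated from the listener's history of exposure:

```python
# Onset probability at each eighth-note slot of one bar, per candidate meter.
# These tables are hypothetical, standing in for exposure-derived statistics.
onset_prob = {
    "4/4": [0.9, 0.2, 0.5, 0.2, 0.7, 0.2, 0.5, 0.2],
    "3/4": [0.9, 0.2, 0.5, 0.2, 0.5, 0.2],
}
prior = {"4/4": 0.6, "3/4": 0.4}

def likelihood(rhythm, meter):
    """P(rhythm | meter): independent Bernoulli slots, cycling through the bar."""
    probs = onset_prob[meter]
    p = 1.0
    for i, onset in enumerate(rhythm):
        q = probs[i % len(probs)]
        p *= q if onset else (1.0 - q)
    return p

def posterior(rhythm):
    """P(meter | rhythm) via Bayes' rule over the candidate meters."""
    scores = {m: prior[m] * likelihood(rhythm, m) for m in prior}
    z = sum(scores.values())
    return {m: s / z for m, s in scores.items()}

waltz_like = [1, 0, 0, 1, 0, 0] * 4     # onsets every third eighth note
post = posterior(waltz_like)
print(max(post, key=post.get))          # 3/4, despite the prior favoring 4/4
```

Enculturation enters through both tables: listeners with different exposure histories would carry different onset-probability estimates and priors, and so could infer different meters from the same rhythm.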
Affiliation(s)
- Bastiaan van der Weij
- Music Cognition Group, Amsterdam Brain and Cognition, Institute for Logic, Language, and Computation, University of Amsterdam, Amsterdam, Netherlands
- Marcus T Pearce
- Music Cognition Lab, School of Electronic Engineering and Computer Science, Queen Mary University of London, London, United Kingdom
- Henkjan Honing
- Music Cognition Group, Amsterdam Brain and Cognition, Institute for Logic, Language, and Computation, University of Amsterdam, Amsterdam, Netherlands