1. Shen L, Li S, Tian Y, Wang Y, Jiang Y. Cortical tracking of hierarchical rhythms orchestrates the multisensory processing of biological motion. eLife 2025; 13:RP98701. [PMID: 39907560] [PMCID: PMC11798571] [DOI: 10.7554/elife.98701]
Abstract
When observing others' behaviors, we continuously integrate their movements with the corresponding sounds to enhance perception and develop adaptive responses. However, how the human brain integrates these complex audiovisual cues based on their natural temporal correspondence remains unclear. Using electroencephalogram (EEG), we demonstrated that rhythmic cortical activity tracked the hierarchical rhythmic structures in audiovisually congruent human walking movements and footstep sounds. Remarkably, the cortical tracking effects exhibit distinct multisensory integration modes at two temporal scales: an additive mode in a lower-order, narrower temporal integration window (step cycle) and a super-additive enhancement in a higher-order, broader temporal window (gait cycle). Furthermore, while neural responses at the lower-order timescale reflect a domain-general audiovisual integration process, cortical tracking at the higher-order timescale is exclusively engaged in the integration of biological motion cues. In addition, only this higher-order, domain-specific cortical tracking effect correlates with individuals' autistic traits, highlighting its potential as a neural marker for autism spectrum disorder. These findings unveil the multifaceted mechanism whereby rhythmic cortical activity supports the multisensory integration of human motion, shedding light on how neural coding of hierarchical temporal structures orchestrates the processing of complex, natural stimuli across multiple timescales.
Affiliation(s)
- Li Shen
- State Key Laboratory of Cognitive Science and Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Shuo Li
- State Key Laboratory of Cognitive Science and Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yuhao Tian
- State Key Laboratory of Cognitive Science and Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Ying Wang
- State Key Laboratory of Cognitive Science and Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yi Jiang
- State Key Laboratory of Cognitive Science and Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
2. Tonelli L, Tichko P, Skoe E. Revisiting the 40-Hz gamma response: Phase-locked neural activity along the human auditory pathway relates to bilingual experience. Brain Lang 2025; 260:105506. [PMID: 39673844] [DOI: 10.1016/j.bandl.2024.105506]
Abstract
Spoken language experience influences brain responses to sound, but it is unclear whether this neuroplasticity is limited to speech frequencies (>100 Hz) or also affects lower gamma ranges (∼30-60 Hz). Using the frequency-following response (FFR), a far-field phase-locked response to sound, we explore whether bilingualism influences the location of the strongest response in the gamma range. Our results indicate that the strongest gamma response for bilinguals is most often at 43 Hz, compared to 51 Hz for monolinguals. Using a computational model, we show how this group difference could result from differential subcortical activation. These results shed light on the well-known but under-explored variability observed in the gamma range and highlight that FFRs are a composite of neural activity from both subcortical and cortical sources. Additionally, our findings emphasize that individual auditory experiences can uniquely shape subcortical activation, influencing FFRs below speech frequencies.
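The band-limited peak-picking implied by this finding (locating the strongest phase-locked response within the low-gamma range of an FFR spectrum) can be sketched as follows. This is an illustrative reconstruction, not the authors' pipeline; the function name, sampling rate, and synthetic signal are all assumptions.

```python
import numpy as np

def gamma_peak_frequency(ffr, fs, band=(30.0, 60.0)):
    """Return the frequency (Hz) of the largest spectral amplitude
    inside `band` for a single averaged FFR trace."""
    n = len(ffr)
    amps = np.abs(np.fft.rfft(ffr)) / n          # one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(n, 1.0 / fs)         # frequency axis in Hz
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[in_band][np.argmax(amps[in_band])]

# Synthetic trace with a dominant 43 Hz component, the rate reported
# for bilinguals, plus a weaker 51 Hz component
fs = 1000
t = np.arange(0, 2, 1 / fs)                      # 2 s of data -> 0.5 Hz bins
ffr = np.sin(2 * np.pi * 43 * t) + 0.3 * np.sin(2 * np.pi * 51 * t)
peak = gamma_peak_frequency(ffr, fs)             # expected near 43 Hz
```

Because the synthetic components fall on exact FFT bins, the returned peak is unambiguous; with real FFR data, averaging across trials and electrodes would precede this step.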
Affiliation(s)
- Luan Tonelli
- Department of Speech, Language, and Hearing Sciences, University of Connecticut, Storrs, CT 06269, USA
- Parker Tichko
- Department of Psychological Sciences, Developmental Psychology Program, University of Connecticut, Storrs, CT 06269, USA
- Erika Skoe
- Connecticut Institute for the Brain and Cognitive Sciences, University of Connecticut, Storrs, CT 06269, USA
3. Wu Q, Sun L, Ding N, Yang Y. Musical tension is affected by metrical structure dynamically and hierarchically. Cogn Neurodyn 2024; 18:1955-1976. [PMID: 39104669] [PMCID: PMC11297889] [DOI: 10.1007/s11571-023-10058-w]
Abstract
As the basis of musical emotions, dynamic tension is experienced by listeners as music unfolds over time. The effects of harmonic and melodic structures on tension have been widely investigated; however, the potential role of metrical structure in tension perception remains largely unexplored. This experiment examined how different metrical structures affect tension experience and explored the underlying neural activities. The electroencephalogram (EEG) was recorded while participants listened to musical meter sequences and simultaneously rated subjective tension. On the large time scale of whole meter sequences, metrical structures with different periods of strong beats elicited different overall tension and different low-frequency (1-4 Hz) steady-state evoked potentials, with higher overall tension associated with shorter intervals between strong beats. On the small time scale of measures, dynamic tension fluctuations within measures were associated with periodic modulations of high-frequency (10-25 Hz) neural activity. Comparisons of the same beats within measures and across different meters, on both time scales, verified the contextual effects of meter on beat-induced tension. Our findings suggest that overall tension is determined by the temporal intervals between strong beats, and that the dynamic tension experience may arise from cognitive processing of hierarchical temporal expectation and attention, which we discuss under the theoretical frameworks of metrical hierarchy, musical expectation, and dynamic attention.
Affiliation(s)
- Qiong Wu
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, No. 16 Lincui Road, Chaoyang District, Beijing 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Lijun Sun
- College of Arts, Nanjing University of Aeronautics and Astronautics, Nanjing, China
- Nai Ding
- Key Laboratory for Biomedical Engineering of Ministry of Education, College of Biomedical Engineering and Instrument Sciences, Zhejiang University, Hangzhou, China
- Yufang Yang
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, No. 16 Lincui Road, Chaoyang District, Beijing 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
4. Jacxsens L, Biot L, Escera C, Gilles A, Cardon E, Van Rompaey V, De Hertogh W, Lammers MJW. Frequency-Following Responses in Sensorineural Hearing Loss: A Systematic Review. J Assoc Res Otolaryngol 2024; 25:131-147. [PMID: 38334887] [PMCID: PMC11018579] [DOI: 10.1007/s10162-024-00932-7]
Abstract
PURPOSE This systematic review aims to assess the impact of sensorineural hearing loss (SNHL) on various frequency-following response (FFR) parameters. METHODS Following PRISMA guidelines, a systematic review was conducted using the PubMed, Web of Science, and Scopus databases up to January 2023. Studies evaluating FFRs in patients with SNHL and normal-hearing controls were included. RESULTS Sixteen case-control studies were included, revealing variability in acquisition parameters. In the time domain, patients with SNHL exhibited prolonged latencies, although the specific waves that were prolonged differed across studies; there was no consensus regarding wave amplitude. In the frequency domain, focusing on studies that elicited FFRs with stimuli of 170 ms or longer, participants with SNHL displayed a significantly smaller response at the fundamental frequency (F0). Results regarding changes in the temporal fine structure (TFS) were inconsistent. CONCLUSION Patients with SNHL may require more time to process (speech) stimuli, reflected in prolonged latencies, although the exact timing of this delay remains unclear. Additionally, with longer stimuli (≥170 ms), patients with SNHL show difficulties tracking the F0 of (speech) stimuli. No definite conclusions could be drawn on changes in wave amplitude in the time domain or the TFS in the frequency domain. Patient characteristics, acquisition parameters, and FFR outcome parameters differed greatly across studies. Future studies should use larger and carefully matched subject groups and longer stimuli, presented at the same intensity in dB HL for both groups or at a carefully determined maximum comfortable loudness level.
Affiliation(s)
- Laura Jacxsens
- Department of Otorhinolaryngology, Head and Neck Surgery, Antwerp University Hospital (UZA), Drie Eikenstraat 655, 2650, Edegem, Belgium
- Resonant Labs Antwerp, Department of Translational Neurosciences, Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
- Department of Rehabilitation Sciences and Physiotherapy, Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
- Lana Biot
- Department of Otorhinolaryngology, Head and Neck Surgery, Antwerp University Hospital (UZA), Drie Eikenstraat 655, 2650, Edegem, Belgium
- Resonant Labs Antwerp, Department of Translational Neurosciences, Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
- Carles Escera
- Brainlab - Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, University of Barcelona, Catalonia, Spain
- Institute of Neurosciences, University of Barcelona, Catalonia, Spain
- Institut de Recerca Sant Joan de Déu, Santa Rosa 39-57, 08950, Esplugues de Llobregat, Catalonia, Spain
- Annick Gilles
- Department of Otorhinolaryngology, Head and Neck Surgery, Antwerp University Hospital (UZA), Drie Eikenstraat 655, 2650, Edegem, Belgium
- Resonant Labs Antwerp, Department of Translational Neurosciences, Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
- Department of Education, Health and Social Work, University College Ghent, Ghent, Belgium
- Emilie Cardon
- Department of Otorhinolaryngology, Head and Neck Surgery, Antwerp University Hospital (UZA), Drie Eikenstraat 655, 2650, Edegem, Belgium
- Resonant Labs Antwerp, Department of Translational Neurosciences, Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
- Vincent Van Rompaey
- Department of Otorhinolaryngology, Head and Neck Surgery, Antwerp University Hospital (UZA), Drie Eikenstraat 655, 2650, Edegem, Belgium
- Resonant Labs Antwerp, Department of Translational Neurosciences, Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
- Willem De Hertogh
- Department of Rehabilitation Sciences and Physiotherapy, Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
- Marc J W Lammers
- Department of Otorhinolaryngology, Head and Neck Surgery, Antwerp University Hospital (UZA), Drie Eikenstraat 655, 2650, Edegem, Belgium
- Resonant Labs Antwerp, Department of Translational Neurosciences, Faculty of Medicine and Health Sciences, University of Antwerp, Antwerp, Belgium
5. Aharoni M, Breska A, Müller MM, Schröger E. Mechanisms of sustained perceptual entrainment after stimulus offset. Eur J Neurosci 2024; 59:1047-1060. [PMID: 37150801] [DOI: 10.1111/ejn.16032]
Abstract
Temporal alignment of neural activity to rhythmic stimulation has been suggested to result from a resonating internal neural oscillator, but it can also be explained by interval-based temporal prediction. Here, we investigate behavioural and brain responses in the post-stimulation period to compare an oscillatory versus an interval-based account. Hickok et al.'s (2015) behavioural paradigm yielded results consistent with a neural oscillatory entrainment mechanism. We adapted that paradigm to a design suitable for event-related potentials (ERPs): a periodic sequence was followed, in half of the trials, by near-threshold targets embedded in noise, presented at various phases relative to the period of the preceding sequence. Participants had to detect whether targets were played or not while their EEG was recorded. Both the behavioural results and the P300 component of the ERP were partially consistent with an oscillatory mechanism and partially consistent with an interval-based attentional gain mechanism. The data obtained in the post-entrainment period are thus best explained by a combination of both mechanisms.
Affiliation(s)
- Moran Aharoni
- Edmund and Lilly Safra Center for Brain Science, The Hebrew University of Jerusalem, Jerusalem, Israel
- Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
- Assaf Breska
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Matthias M Müller
- Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
- Erich Schröger
- Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
6. Merchant H, Mendoza G, Pérez O, Betancourt A, García-Saldivar P, Prado L. Diverse Time Encoding Strategies Within the Medial Premotor Areas of the Primate. Adv Exp Med Biol 2024; 1455:117-140. [PMID: 38918349] [DOI: 10.1007/978-3-031-60183-5_7]
Abstract
The measurement of time on the subsecond scale is critical for many sophisticated behaviors, yet its neural underpinnings are largely unknown. Recent neurophysiological experiments from our laboratory have shown that neural activity in the medial premotor areas (MPC) of macaques can represent different aspects of temporal processing. During single-interval categorization, we found that preSMA encodes a subjective category limit by reaching a peak of activity at a time that divides the set of test intervals into short and long. We also observed neural signals associated with the category selected by the subjects and with the reward outcomes of the perceptual decision. On the other hand, we have studied the behavioral and neurophysiological basis of rhythmic timing. First, we have shown in different tapping tasks that macaques are able to produce intervals predictively and accurately, whether the intervals are cued by auditory or visual metronomes or produced internally without sensory guidance. In addition, we found that the rhythmic timing mechanism in MPC is governed by different layers of neural clocks. The instantaneous activity of single cells shows ramping activity that encodes the elapsed or remaining time for a tapping movement. We also found MPC neurons that build neural sequences, forming dynamic patterns of activation that flexibly cover the whole produced interval depending on the tapping tempo. This rhythmic neural clock resets on every interval, providing an internal representation of pulse. Furthermore, MPC cells show mixed selectivity, encoding not only elapsed time but also the tempo of the tapping and the serial-order element in the rhythmic sequence. Hence, MPC can map different task parameters, including the passage of time, using different cell populations. Finally, projecting the time-varying activity of hundreds of MPC cells into a low-dimensional state space revealed circular neural trajectories whose geometry represented the internal pulse and the tapping tempo. Overall, these findings support the notion that MPC is part of the core timing mechanism for both single-interval and rhythmic timing, using neural clocks with different encoding principles, probably to flexibly encode and mix the timing representation with other task parameters.
Affiliation(s)
- Hugo Merchant
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Querétaro, Mexico
- Germán Mendoza
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Querétaro, Mexico
- Oswaldo Pérez
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Querétaro, Mexico
- Luis Prado
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Querétaro, Mexico
7. Betancourt A, Pérez O, Gámez J, Mendoza G, Merchant H. Amodal population clock in the primate medial premotor system for rhythmic tapping. Cell Rep 2023; 42:113234. [PMID: 37838944] [DOI: 10.1016/j.celrep.2023.113234]
Abstract
The neural substrate for beat extraction and response entrainment to rhythms is not fully understood. Here we analyze the activity of medial premotor neurons in monkeys performing isochronous tapping guided by brief flashing stimuli or auditory tones. The population dynamics shared the following properties across modalities: the circular dynamics of the neural trajectories form a regenerating loop for every produced interval; the trajectories converge to a similar state-space region at tapping times, resetting the clock; and the tempo of the synchronized tapping is encoded in the trajectories by a combination of amplitude modulation and temporal scaling. Notably, the modality induces a displacement of the neural trajectories into auditory and visual subspaces without greatly altering the timekeeping mechanism. These results suggest that the interaction between the medial premotor cortex's amodal internal representation of pulse and a modality-specific external input generates a neural rhythmic clock whose dynamics govern rhythmic tapping execution across senses.
Affiliation(s)
- Abraham Betancourt
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Oswaldo Pérez
- Escuela Nacional de Estudios Superiores, Unidad Juriquilla, UNAM, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Jorge Gámez
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Germán Mendoza
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Hugo Merchant
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
8. Peter V, Goswami U, Burnham D, Kalashnikova M. Impaired neural entrainment to low frequency amplitude modulations in English-speaking children with dyslexia or dyslexia and DLD. Brain Lang 2023; 236:105217. [PMID: 36529116] [DOI: 10.1016/j.bandl.2022.105217]
Abstract
Neural synchronization to amplitude-modulated noise at three frequencies (2 Hz, 5 Hz, 8 Hz) thought to be important for syllable perception was investigated in English-speaking school-aged children. The theoretically important delta band (∼2 Hz, the stressed-syllable level) was included along with two syllable-level rates. The auditory steady-state response (ASSR) was recorded using EEG in 36 7-to-12-year-old children; half of the sample had either dyslexia or dyslexia and DLD (developmental language disorder). In comparison to typically developing children, children with dyslexia or with dyslexia and DLD showed reduced ASSRs for 2 Hz stimulation but similar ASSRs at 5 Hz and 8 Hz. These novel data for English ASSRs converge with prior data suggesting that children with dyslexia have atypical synchrony between brain oscillations and incoming auditory stimulation at ∼2 Hz, the rate of stressed-syllable production across languages. This atypical synchronization likely impairs speech processing, phonological processing, and possibly syntactic processing, as predicted by Temporal Sampling theory.
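One common way to quantify an ASSR at a given modulation rate is the spectral signal-to-noise ratio: the amplitude in the FFT bin at the stimulation frequency divided by the mean amplitude of neighboring noise bins. The sketch below is illustrative only; the function name, parameter choices, and simulated signal are assumptions, not the study's actual analysis.

```python
import numpy as np

def assr_snr(eeg, fs, stim_freq, n_neighbors=10):
    """SNR of a steady-state response: amplitude at the stimulation
    frequency relative to the mean of surrounding noise bins."""
    n = len(eeg)
    amps = np.abs(np.fft.rfft(eeg)) / n              # amplitude spectrum
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    k = int(np.argmin(np.abs(freqs - stim_freq)))    # target bin
    noise = np.r_[amps[k - n_neighbors:k],           # bins below target
                  amps[k + 1:k + 1 + n_neighbors]]   # bins above target
    return amps[k] / noise.mean()

# Simulated 30-s recording with a clear 2 Hz steady-state component in noise
rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 30, 1 / fs)
eeg = np.sin(2 * np.pi * 2.0 * t) + 0.5 * rng.standard_normal(t.size)
snr = assr_snr(eeg, fs, 2.0)                         # large when the ASSR is present
```

A reduced 2 Hz ASSR, as reported here for children with dyslexia, would show up in such a measure as a lower SNR at the 2 Hz bin relative to controls.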
Affiliation(s)
- Varghese Peter
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Australia
- School of Health and Behavioural Sciences, University of the Sunshine Coast, Australia
- Usha Goswami
- Centre for Neuroscience in Education, University of Cambridge, UK
- Denis Burnham
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Australia
- Marina Kalashnikova
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Australia
- BCBL, Basque Center on Cognition, Brain and Language, Spain
- Ikerbasque, Basque Foundation for Science, Bilbao, Spain
9. Ross JM, Balasubramaniam R. Time Perception for Musical Rhythms: Sensorimotor Perspectives on Entrainment, Simulation, and Prediction. Front Integr Neurosci 2022; 16:916220. [PMID: 35865808] [PMCID: PMC9294366] [DOI: 10.3389/fnint.2022.916220]
Abstract
Neural mechanisms supporting time perception in continuously changing sensory environments may be relevant to a broader understanding of how the human brain utilizes time in cognition and action. In this review, we focus on musical timing due to the extensive literature surrounding movement with, and perception of, musical rhythms. First, we define commonly used but ambiguous concepts, including neural entrainment, simulation, and prediction, in the context of musical timing. Next, we summarize the literature on sensorimotor timing during perception and performance and describe current theories of sensorimotor engagement in the support of subsecond timing. We then review the evidence that sensorimotor engagement is critical for accurate time perception. Finally, we highlight potential clinical implications of a sensorimotor perspective on timing.
Affiliation(s)
- Jessica M. Ross
- Veterans Affairs Palo Alto Healthcare System and the Sierra Pacific Mental Illness, Research, Education, and Clinical Center, Palo Alto, CA, United States
- Department of Psychiatry and Behavioral Sciences, Stanford University Medical Center, Stanford, CA, United States
- Berenson-Allen Center for Non-invasive Brain Stimulation, Beth Israel Deaconess Medical Center, Boston, MA, United States
- Department of Neurology, Harvard Medical School, Boston, MA, United States
- Ramesh Balasubramaniam
- Cognitive and Information Sciences, University of California, Merced, Merced, CA, United States
10. Cheng THZ, Creel SC, Iversen JR. How Do You Feel the Rhythm: Dynamic Motor-Auditory Interactions Are Involved in the Imagination of Hierarchical Timing. J Neurosci 2022; 42:500-512. [PMID: 34848500] [PMCID: PMC8802922] [DOI: 10.1523/jneurosci.1121-21.2021]
Abstract
Predicting and organizing patterns of events is important for humans to survive in a dynamically changing world. The motor system has been proposed to be actively, and necessarily, engaged not only in the production but also in the perception of rhythm, by organizing hierarchical timing that influences auditory responses. It is not yet well understood how the motor system interacts with the auditory system to perceive and maintain hierarchical structure in time. This study investigated the dynamic interaction between auditory and motor functional sources during the perception and imagination of musical meters. We pursued this using a novel method combining high-density EEG, EMG, and motion capture with independent component analysis to separate motor and auditory activity during meter imagery while robustly controlling against covert movement. We demonstrated that endogenous brain activity in both auditory and motor functional sources reflects the imagination of binary and ternary meters in the absence of corresponding acoustic cues or overt movement at the meter rate. We found clear evidence for the hypothesized motor-to-auditory information flow at the beat rate in all conditions, suggesting a top-down influence of the motor system on auditory processing of beat-based rhythms and reflecting an auditory-motor system with tight reciprocal informational coupling. These findings align with and extend a set of motor hypotheses from beat perception to hierarchical meter imagination, adding evidence for active engagement of the motor system in auditory processing, which may more broadly speak to the neural mechanisms of temporal processing in other human cognitive functions.
SIGNIFICANCE STATEMENT Humans live in a world full of hierarchically structured temporal information, the accurate perception of which is essential for understanding speech and music. Music provides a window into the brain mechanisms of time perception, enabling us to examine how the brain groups musical beats into, for example, a march or waltz. Using a novel paradigm combining measurement of electrical brain activity with data-driven analysis, this study directly investigates motor-auditory connectivity during meter imagination. The findings highlight the importance of the motor system in the active imagination of meter, demonstrating how auditory-motor interaction may support hierarchical timing, with possible clinical implications for speech and motor rehabilitation.
Affiliation(s)
- Tzu-Han Zoe Cheng
- Department of Cognitive Science, University of California-San Diego, La Jolla, California 92093
- Institute for Neural Computation and Swartz Center for Computational Neuroscience, University of California-San Diego, La Jolla, California 92093
- Sarah C Creel
- Department of Cognitive Science, University of California-San Diego, La Jolla, California 92093
- John R Iversen
- Institute for Neural Computation and Swartz Center for Computational Neuroscience, University of California-San Diego, La Jolla, California 92093
11. Sifuentes-Ortega R, Lenc T, Nozaradan S, Peigneux P. Partially Preserved Processing of Musical Rhythms in REM but Not in NREM Sleep. Cereb Cortex 2021; 32:1508-1519. [PMID: 34491309] [DOI: 10.1093/cercor/bhab303]
Abstract
The extent of high-level perceptual processing during sleep remains controversial. In wakefulness, perception of periodicities supports the emergence of high-order representations such as the pulse-like meter perceived while listening to music. Electroencephalography (EEG) frequency-tagged responses elicited at envelope frequencies of musical rhythms have been shown to provide a neural representation of rhythm processing. Specifically, responses at frequencies corresponding to the perceived meter are enhanced over responses at meter-unrelated frequencies. This selective enhancement must rely on higher-level perceptual processes, as it occurs even in irregular (i.e., syncopated) rhythms where meter frequencies are not prominent input features, thus ruling out acoustic confounds. We recorded EEG while presenting a regular (unsyncopated) and an irregular (syncopated) rhythm across sleep stages and wakefulness. Our results show that frequency-tagged responses at meter-related frequencies of the rhythms were selectively enhanced during wakefulness but attenuated across sleep states. Most importantly, this selective attenuation occurred even in response to the irregular rhythm, where meter-related frequencies were not prominent in the stimulus, thus suggesting that neural processes selectively enhancing meter-related frequencies during wakefulness are weakened during rapid eye movement (REM) and further suppressed in non-rapid eye movement (NREM) sleep. These results indicate preserved processing of low-level acoustic properties but limited higher-order processing of auditory rhythms during sleep.
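The frequency-tagging contrast described above (mean spectral amplitude at meter-related versus meter-unrelated envelope frequencies) can be sketched in a few lines of numpy. The frequencies and the synthetic "EEG" below are invented for illustration and do not reproduce the study's stimuli or pipeline.

```python
import numpy as np

def meter_contrast(eeg, fs, meter_freqs, unrelated_freqs):
    """Mean amplitude at meter-related vs meter-unrelated FFT bins."""
    n = len(eeg)
    amps = np.abs(np.fft.rfft(eeg)) / n              # amplitude spectrum
    freqs = np.fft.rfftfreq(n, 1.0 / fs)

    def amp_at(f):
        return amps[int(np.argmin(np.abs(freqs - f)))]  # nearest bin

    meter = np.mean([amp_at(f) for f in meter_freqs])
    unrelated = np.mean([amp_at(f) for f in unrelated_freqs])
    return meter, unrelated

# Simulated 60-s signal with a strong "meter-related" 1.25 Hz component
# and a weaker 0.9 Hz component (frequencies chosen to fall on exact bins)
fs = 250
t = np.arange(0, 60, 1 / fs)
eeg = 2.0 * np.sin(2 * np.pi * 1.25 * t) + 0.5 * np.sin(2 * np.pi * 0.9 * t)
meter, unrelated = meter_contrast(eeg, fs, [1.25], [0.9])
```

In the study's terms, selective enhancement during wakefulness corresponds to `meter` exceeding `unrelated` beyond what the stimulus envelope alone predicts, and the reported attenuation during sleep to that difference shrinking.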
Affiliation(s)
- Rebeca Sifuentes-Ortega
- UR2NF - Neuropsychology and Functional Neuroimaging Research Unit at CRCN - Center for Research in Cognition & Neurosciences, and UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), 1050 Brussels, Belgium
- Tomas Lenc
- Institute of Neuroscience (IONS), Université Catholique de Louvain, 1200 Brussels, Belgium
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain, 1200 Brussels, Belgium
- Philippe Peigneux
- UR2NF - Neuropsychology and Functional Neuroimaging Research Unit at CRCN - Center for Research in Cognition & Neurosciences, and UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), 1050 Brussels, Belgium
12. Regev M, Halpern AR, Owen AM, Patel AD, Zatorre RJ. Mapping Specific Mental Content during Musical Imagery. Cereb Cortex 2021; 31:3622-3640. [PMID: 33749742] [DOI: 10.1093/cercor/bhab036]
Abstract
Humans can mentally represent auditory information without an external stimulus, but the specificity of these internal representations remains unclear. Here, we asked how similar the temporally unfolding neural representations of imagined music are to those evoked during the original perceived experience. We also tested whether rhythmic motion can influence the neural representation of music during imagery, as it does during perception. Participants first memorized six 1-min-long instrumental musical pieces with high accuracy. Functional MRI data were collected during: 1) silent imagery of melodies to the beat of a visual metronome; 2) the same task while tapping to the beat; and 3) passive listening. During imagery, inter-subject correlation analysis showed that melody-specific temporal response patterns were reinstated in right associative auditory cortices. When tapping accompanied imagery, the melody-specific neural patterns were reinstated in more extensive temporal-lobe regions bilaterally. These results indicate that the specific contents of conscious experience are encoded similarly during imagery and perception in the dynamic activity of auditory cortices. Furthermore, rhythmic motion can enhance the reinstatement of neural patterns associated with the experience of complex sounds, in keeping with models of motor-to-sensory influences in auditory processing.
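Leave-one-out inter-subject correlation of the kind used here, correlating each participant's regional time course with the average of all other participants, reduces to a short computation. This sketch assumes a subjects-by-time array and is not the authors' code:

```python
import numpy as np

def leave_one_out_isc(timecourses):
    """Inter-subject correlation: correlate each subject's time course with
    the mean time course of the remaining subjects.

    timecourses: array of shape (n_subjects, n_timepoints).
    Returns one correlation value per subject."""
    n = timecourses.shape[0]
    return np.array([
        np.corrcoef(timecourses[i],
                    np.delete(timecourses, i, axis=0).mean(axis=0))[0, 1]
        for i in range(n)
    ])
```

High values indicate that the same stimulus- or imagery-locked response pattern is shared across listeners, which is the melody-specific reinstatement effect described above.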
Affiliation(s)
- Mor Regev
- Montreal Neurological Institute, McGill University, Montreal, QC H3A 2B4, Canada; International Laboratory for Brain, Music and Sound Research, Montreal, QC H2V 2J2, Canada; Centre for Research in Language, Brain, and Music, Montreal, QC H3A 1E3, Canada
- Andrea R Halpern
- Department of Psychology, Bucknell University, Lewisburg, PA 17837, USA
- Adrian M Owen
- Brain and Mind Institute, Department of Psychology and Department of Physiology and Pharmacology, Western University, London, ON N6A 5B7, Canada; Canadian Institute for Advanced Research, Brain, Mind, and Consciousness program
- Aniruddh D Patel
- Canadian Institute for Advanced Research, Brain, Mind, and Consciousness program; Department of Psychology, Tufts University, Medford, MA 02155, USA
- Robert J Zatorre
- Montreal Neurological Institute, McGill University, Montreal, QC H3A 2B4, Canada; International Laboratory for Brain, Music and Sound Research, Montreal, QC H2V 2J2, Canada; Centre for Research in Language, Brain, and Music, Montreal, QC H3A 1E3, Canada; Canadian Institute for Advanced Research, Brain, Mind, and Consciousness program

13
Alemi R, Nozaradan S, Lehmann A. Free-Field Cortical Steady-State Evoked Potentials in Cochlear Implant Users. Brain Topogr 2021; 34:664-680. [PMID: 34185222] [DOI: 10.1007/s10548-021-00860-2]
Abstract
Auditory steady-state evoked potentials (SS-EPs) are phase-locked neural responses to periodic stimuli, believed to reflect specific neural generators. As an objective measure, steady-state responses have been used in various clinical settings, including measuring hearing thresholds in normal-hearing and hearing-impaired subjects. Recent studies favor recording these responses as part of the cochlear implant (CI) device-fitting procedure. Considering these potential benefits, the goals of the present study were to assess the feasibility of recording free-field SS-EPs in CI users and to compare their characteristics between CI users and controls. Taking advantage of a recently developed dual-frequency tagging method, we attempted to record subcortical and cortical SS-EPs from adult CI users and controls, and measured reliable subcortical and cortical SS-EPs in the control group. Independent component analysis (ICA) was used to remove CI stimulation artifacts, yet the subcortical responses of several CI users were heavily contaminated by these artifacts. Consequently, only cortical SS-EPs were compared between groups, and these were found to be larger in the controls. The lower cortical SS-EP amplitude in CI users might indicate a reduction in neural synchrony evoked by the modulation rate of the auditory input across different neural assemblies in the auditory pathway. The brain topographies of cortical auditory SS-EPs, the time course of cortical responses, and the reconstructed cortical maps were highly similar between groups, confirming their neural origin and the possibility of obtaining such responses in CI recipients. As for subcortical SS-EPs, our results highlight the need for sophisticated denoising algorithms to pinpoint and remove artifactual components from the biological response.
Affiliation(s)
- Razieh Alemi
- Faculty of Medicine, Department of Otolaryngology, McGill University, Montreal, QC, Canada.
- Centre for Research On Brain, Language & Music (CRBLM), Montreal, Canada.
- International Laboratory for Brain, Music & Sound Research (BRAMS), Montreal, QC, Canada.
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Ottignies-Louvain-la-Neuve, Belgium
- Alexandre Lehmann
- Faculty of Medicine, Department of Otolaryngology, McGill University, Montreal, QC, Canada
- Centre for Research On Brain, Language & Music (CRBLM), Montreal, Canada
- International Laboratory for Brain, Music & Sound Research (BRAMS), Montreal, QC, Canada

14
Tichko P, Kim JC, Large EW. Bouncing the network: A dynamical systems model of auditory-vestibular interactions underlying infants' perception of musical rhythm. Dev Sci 2021; 24:e13103. [PMID: 33570778] [DOI: 10.1111/desc.13103]
Abstract
Previous work suggests that auditory-vestibular interactions, which emerge during bodily movement to music, can influence the perception of musical rhythm. In a seminal study on the ontogeny of musical rhythm, Phillips-Silver and Trainor (2005) found that bouncing infants to an unaccented rhythm influenced infants' perceptual preferences for accented rhythms that matched the rate of bouncing. In the current study, we ask whether nascent, diffuse coupling between auditory and motor systems is sufficient to bootstrap short-term Hebbian plasticity in the auditory system and explain infants' preferences for accented rhythms thought to arise from auditory-vestibular interactions. First, we specify a nonlinear, dynamical system in which two oscillatory neural networks, representing developmentally nascent auditory and motor systems, interact through weak, non-specific coupling. The auditory network was equipped with short-term Hebbian plasticity, allowing the auditory network to tune its intrinsic resonant properties. Next, we simulate the effect of vestibular input (e.g., infant bouncing) on infants' perceptual preferences for accented rhythms. We found that simultaneous auditory-vestibular training shaped the model's response to musical rhythm, enhancing vestibular-related frequencies in auditory-network activity. Moreover, simultaneous auditory-vestibular training, relative to auditory- or vestibular-only training, facilitated short-term auditory plasticity in the model, producing stronger oscillator connections in the auditory network. Finally, when tested on a musical rhythm, models which received simultaneous auditory-vestibular training, but not models that received auditory- or vestibular-only training, resonated strongly at frequencies related to their "bouncing," a finding qualitatively similar to infants' preferences for accented rhythms that matched the rate of infant bouncing.
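The core ingredient of such models, weak coupling that pulls two oscillatory systems toward a common phase, can be illustrated with a pair of Kuramoto-style phase oscillators. This is a toy reduction, not the paper's Hebbian network model, and all parameter values below are made up:

```python
import numpy as np

def coupled_phase_oscillators(f_auditory, f_motor, k, dt=0.001, steps=20000):
    """Integrate two weakly coupled phase oscillators (a stand-in for nascent
    auditory and motor networks) and return their final phase difference,
    wrapped to (-pi, pi]. With coupling k large enough relative to the
    frequency mismatch, the pair phase-locks."""
    phi_a, phi_m = 0.0, np.pi  # start in anti-phase
    w_a, w_m = 2 * np.pi * f_auditory, 2 * np.pi * f_motor
    for _ in range(steps):
        phi_a += dt * (w_a + k * np.sin(phi_m - phi_a))
        phi_m += dt * (w_m + k * np.sin(phi_a - phi_m))
    return (phi_m - phi_a + np.pi) % (2 * np.pi) - np.pi
```

Locking occurs when the angular-frequency mismatch is smaller than twice the coupling strength; beyond that threshold the phase difference drifts indefinitely.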
Affiliation(s)
- Parker Tichko
- Department of Music, Northeastern University, Boston, MA, USA
- Ji Chul Kim
- Department of Psychological Sciences, Perception, Action, Cognition (PAC) Division, University of Connecticut, Storrs, CT, USA
- Edward W Large
- Department of Psychological Sciences, Perception, Action, Cognition (PAC) Division, University of Connecticut, Storrs, CT, USA; Department of Psychological Sciences, Center for the Ecological Study of Perception & Action (CESPA), University of Connecticut, Storrs, CT, USA; Department of Physics, University of Connecticut, Storrs, CT, USA

15
Mathias B, Zamm A, Gianferrara PG, Ross B, Palmer C. Rhythm Complexity Modulates Behavioral and Neural Dynamics During Auditory–Motor Synchronization. J Cogn Neurosci 2020; 32:1864-1880. [DOI: 10.1162/jocn_a_01601]
Abstract
We addressed how rhythm complexity influences auditory–motor synchronization in musically trained individuals who perceived and produced complex rhythms while EEG was recorded. Participants first listened to two-part auditory sequences (Listen condition). Each part featured a single pitch presented at a fixed rate; the integer ratio formed between the two rates varied in rhythmic complexity from low (1:1) to moderate (1:2) to high (3:2). One of the two parts occurred at a constant rate across conditions. Then, participants heard the same rhythms as they synchronized their tapping at a fixed rate (Synchronize condition). Finally, they tapped at the same fixed rate (Motor condition). Auditory feedback from their taps was present in all conditions. Behavioral effects of rhythmic complexity were evidenced in all tasks; detection of missing beats (Listen) worsened in the most complex (3:2) rhythm condition, and tap durations (Synchronize) were most variable and least synchronous with stimulus onsets in the 3:2 condition. EEG power spectral density was lowest at the fixed rate during the 3:2 rhythm and greatest during the 1:1 rhythm (Listen and Synchronize). ERP amplitudes corresponding to an N1 time window were smallest for the 3:2 rhythm and greatest for the 1:1 rhythm (Listen). Finally, synchronization accuracy (Synchronize) decreased as amplitudes in the N1 time window became more positive during the high rhythmic complexity condition (3:2). Thus, measures of neural entrainment corresponded to synchronization accuracy, and rhythmic complexity modulated the behavioral and neural measures similarly.
Affiliation(s)
- Brian Mathias
- McGill University
- Max Planck Institute for Human Cognitive and Brain Sciences
- Anna Zamm
- McGill University
- Central European University, Budapest, Hungary

16
Lenc T, Keller PE, Varlet M, Nozaradan S. Neural and Behavioral Evidence for Frequency-Selective Context Effects in Rhythm Processing in Humans. Cereb Cortex Commun 2020; 1:tgaa037. [PMID: 34296106] [PMCID: PMC8152888] [DOI: 10.1093/texcom/tgaa037]
Abstract
When listening to music, people often perceive and move along with a periodic meter. However, the dynamics of the mapping between meter perception and the acoustic cues to meter periodicities in the sensory input remain largely unknown. To capture these dynamics, we recorded electroencephalographic (EEG) activity while nonmusician and musician participants listened to nonrepeating rhythmic sequences in which acoustic cues to meter frequencies either gradually decreased (from regular to degraded) or increased (from degraded to regular). The results revealed greater neural activity selectively elicited at meter frequencies when the sequence gradually changed from regular to degraded compared with the opposite direction. Importantly, this effect was unlikely to arise from overall gain or low-level auditory processing, as revealed by physiological modeling. Moreover, the context effect was more pronounced in nonmusicians, who also demonstrated facilitated sensory-motor synchronization with the meter for sequences that started as regular. In contrast, musicians showed weaker effects of recent context in their neural responses and a robust ability to move along with the meter irrespective of stimulus degradation. Together, our results demonstrate that brain activity elicited by rhythm reflects not only passive tracking of stimulus features but also continuous integration of sensory input with recent context.
Affiliation(s)
- Tomas Lenc
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Peter E Keller
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Manuel Varlet
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- School of Psychology, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Sylvie Nozaradan
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal QC H3C 3J7, Canada

17
Why do we move to the beat? A multi-scale approach, from physical principles to brain dynamics. Neurosci Biobehav Rev 2020; 112:553-584. [DOI: 10.1016/j.neubiorev.2019.12.024]

18
Binaural Beats through the Auditory Pathway: From Brainstem to Connectivity Patterns. eNeuro 2020; 7:ENEURO.0232-19.2020. [PMID: 32066611] [PMCID: PMC7082494] [DOI: 10.1523/eneuro.0232-19.2020]
Abstract
Binaural beating is a perceptual auditory illusion that occurs when two neighboring frequencies are presented separately, one to each ear. Several controversial claims have been attributed to binaural beats regarding their ability to entrain human brain activity and mood, in both the scientific literature and the marketing realm. Here, we sought to address those questions in a robust fashion using a single-blind, active-controlled protocol. To do so, we compared the effects of binaural beats with a control beat stimulation (monaural beats, known to entrain brain activity but not mood) across four distinct levels of the human auditory pathway: subcortical and cortical entrainment, scalp-level functional connectivity, and self-reports. Both stimuli elicited standard subcortical responses at the pure-tone frequencies of the stimulus [i.e., the frequency-following response (FFR)] and entrained the cortex at the beat frequency [i.e., the auditory steady-state response (ASSR)]. Furthermore, functional connectivity patterns were modulated differentially by the two kinds of stimuli, with binaural beats being the only stimulus to elicit cross-frequency activity. Despite this, we did not find any mood modulation related to our experimental manipulation. Our results provide evidence that binaural beats elicit cross-frequency connectivity patterns but only weakly entrain the cortex compared with monaural beat stimuli. Whether binaural beats have an impact on cognitive performance or other mood measures remains to be seen and can be further investigated within the proposed methodological framework.
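The two stimulus classes compared here differ only in where the beat arises: for binaural beats each ear receives a single pure tone and the beat exists only after binaural combination, whereas for monaural beats both tones are mixed before reaching the ear, producing a physical amplitude modulation. A minimal stimulus generator (parameter values are illustrative, not the study's):

```python
import numpy as np

def binaural_beat(carrier=400.0, beat=7.0, dur=2.0, fs=44100):
    """Stereo array (n_samples, 2): a pure tone in the left ear and a tone
    offset by the beat frequency in the right ear. Neither channel is
    modulated; the beat is constructed centrally by the listener's brain."""
    t = np.arange(int(dur * fs)) / fs
    return np.column_stack([np.sin(2 * np.pi * carrier * t),
                            np.sin(2 * np.pi * (carrier + beat) * t)])

def monaural_beat(carrier=400.0, beat=7.0, dur=2.0, fs=44100):
    """Both tones summed and presented identically to both ears, so each ear
    receives a waveform physically modulated at the beat frequency."""
    t = np.arange(int(dur * fs)) / fs
    mix = 0.5 * (np.sin(2 * np.pi * carrier * t)
                 + np.sin(2 * np.pi * (carrier + beat) * t))
    return np.column_stack([mix, mix])
```

This asymmetry is why monaural beats drive stronger cortical entrainment: the beat-rate modulation is already present in the acoustic input rather than inferred from binaural interaction.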

19
Coffey EBJ, Nicol T, White-Schwoch T, Chandrasekaran B, Krizman J, Skoe E, Zatorre RJ, Kraus N. Evolving perspectives on the sources of the frequency-following response. Nat Commun 2019; 10:5036. [PMID: 31695046] [PMCID: PMC6834633] [DOI: 10.1038/s41467-019-13003-w]
Abstract
The auditory frequency-following response (FFR) is a non-invasive index of the fidelity of sound encoding in the brain, and is used to study the integrity, plasticity, and behavioral relevance of the neural encoding of sound. In this Perspective, we review recent evidence suggesting that, in humans, the FFR arises from multiple cortical and subcortical sources, not just subcortically as previously believed, and we illustrate how the FFR to complex sounds can enhance the wider field of auditory neuroscience. Far from being of use only to study basic auditory processes, the FFR is an uncommonly multifaceted response yielding a wealth of information, with much yet to be tapped.
Affiliation(s)
- Emily B J Coffey
- Department of Psychology, Concordia University, 1455 Boulevard de Maisonneuve Ouest, Montréal, QC, H3G 1M8, Canada.
- International Laboratory for Brain, Music, and Sound Research (BRAMS), Montréal, QC, Canada.
- Centre for Research on Brain, Language and Music (CRBLM), McGill University, 3640 de la Montagne, Montréal, QC, H3G 2A8, Canada.
- Trent Nicol
- Auditory Neuroscience Laboratory, Department of Communication Sciences, Northwestern University, 2240 Campus Dr., Evanston, IL, 60208, USA
- Travis White-Schwoch
- Auditory Neuroscience Laboratory, Department of Communication Sciences, Northwestern University, 2240 Campus Dr., Evanston, IL, 60208, USA
- Bharath Chandrasekaran
- Communication Sciences and Disorders, School of Health and Rehabilitation Sciences, University of Pittsburgh, Forbes Tower, 3600 Atwood St, Pittsburgh, PA, 15260, USA
- Jennifer Krizman
- Auditory Neuroscience Laboratory, Department of Communication Sciences, Northwestern University, 2240 Campus Dr., Evanston, IL, 60208, USA
- Erika Skoe
- Department of Speech, Language, and Hearing Sciences, The Connecticut Institute for the Brain and Cognitive Sciences, University of Connecticut, 2 Alethia Drive, Unit 1085, Storrs, CT, 06269, USA
- Robert J Zatorre
- International Laboratory for Brain, Music, and Sound Research (BRAMS), Montréal, QC, Canada
- Centre for Research on Brain, Language and Music (CRBLM), McGill University, 3640 de la Montagne, Montréal, QC, H3G 2A8, Canada
- Montreal Neurological Institute, McGill University, 3801 rue Université, Montréal, QC, H3A 2B4, Canada
- Nina Kraus
- Auditory Neuroscience Laboratory, Department of Communication Sciences, Northwestern University, 2240 Campus Dr., Evanston, IL, 60208, USA
- Department of Neurobiology, Northwestern University, 2205 Tech Dr., Evanston, IL, 60208, USA
- Department of Otolaryngology, Northwestern University, 420 E Superior St., Chicago, IL, 60611, USA

20
Lanzilotti C, Dumas R, Grassi M, Schön D. Prolonged exposure to highly rhythmic music affects brain dynamics and perception. Neuropsychologia 2019; 129:191-199. [PMID: 31015025] [DOI: 10.1016/j.neuropsychologia.2019.04.011]
Abstract
Rhythmic stimulation is a powerful tool for improving temporal prediction and parsing of the auditory signal. However, over long durations of stimulation, the rhythmic and repetitive aspects of music have often been associated with a trance state. In this study we designed an auditory monitoring task that allows tracking changes in psychophysical auditory thresholds. Participants performed the task while listening to rhythmically regular and irregular (scrambled but spectrally identical) music presented in either an intermittent (short) or a continuous (long) mode of stimulation. Results show that psychophysical auditory thresholds increase following continuous versus intermittent stimulation, accompanied by a reduction in the amplitude of two event-related potentials to target stimuli. These effects are larger with regular music and thus do not simply derive from the duration of stimulation. Interestingly, they seem to be related to frequency-selective neural coupling as well as an increase in network connectivity in the alpha band between frontal and central regions. Our study suggests that the idea that rhythmic presentation of sensory stimuli facilitates perception might be limited to short streams, while long, highly regular, repetitive, and strongly engaging streams may have the opposite perceptual impact.
Affiliation(s)
- Cosima Lanzilotti
- Aix Marseille Univ, Inserm, INS, Inst Neurosci Syst, Marseille, France
- Massimo Grassi
- Università di Padova, Dipartimento di Psicologia Generale, Padova, Italy
- Daniele Schön
- Aix Marseille Univ, Inserm, INS, Inst Neurosci Syst, Marseille, France

21
Reply to Novembre and Iannetti: Conceptual and methodological issues. Proc Natl Acad Sci U S A 2018; 115:E11004. [PMID: 30425177] [DOI: 10.1073/pnas.1815750115]

22
Caron-Desrochers L, Schönwiesner M, Focke K, Lehmann A. Assessing visual modulation along the human subcortical auditory pathway. Neurosci Lett 2018; 685:12-17. [PMID: 30009874] [DOI: 10.1016/j.neulet.2018.07.020]
Abstract
Experience of the world is inherently multisensory. It has been suggested that audiovisual modulation occurs as early as subcortical auditory stages. However, this suggestion was based on the frequency-following response, a measure recently found to receive significant contributions from cortical sources. It therefore remains unclear whether subcortical auditory processing can indeed be modulated by visual information. We aimed to trace visual modulation along the auditory pathway by comparing the auditory brainstem response (ABR) and middle-latency response (MLR) between unimodal auditory and multimodal audiovisual conditions. EEG activity was recorded while participants attended auditory clicks and visual flashes, either synchronous or asynchronous. No differences between auditory and audiovisual responses were observed at the ABR or MLR level, suggesting that ascending auditory processing is not modulated by visual cues at subcortical levels, at least for rudimentary stimuli. Multimodal modulation of the auditory brainstem observed in previous studies might therefore originate from cortical sources and top-down processes. More studies are needed to further disentangle subcortical and cortical influences on audiovisual modulation along the auditory pathway.
Affiliation(s)
- Laura Caron-Desrochers
- International Laboratory for Brain, Music and Sound Research, Department of Psychology, University of Montreal, Montreal, Canada; Center for Research on Brain, Language and Music, Montreal, Canada.
- Marc Schönwiesner
- International Laboratory for Brain, Music and Sound Research, Department of Psychology, University of Montreal, Montreal, Canada; Center for Research on Brain, Language and Music, Montreal, Canada; Department of Biology, University of Leipzig, Leipzig, Germany
- Kristin Focke
- International Laboratory for Brain, Music and Sound Research, Department of Psychology, University of Montreal, Montreal, Canada; Center for Research on Brain, Language and Music, Montreal, Canada
- Alexandre Lehmann
- International Laboratory for Brain, Music and Sound Research, Department of Psychology, University of Montreal, Montreal, Canada; Center for Research on Brain, Language and Music, Montreal, Canada; Department of Otolaryngology Head and Neck Surgery, McGill University, Canada

23
Chemin B, Huang G, Mulders D, Mouraux A. EEG time-warping to study non-strictly-periodic EEG signals related to the production of rhythmic movements. J Neurosci Methods 2018; 308:106-115. [PMID: 30053483] [DOI: 10.1016/j.jneumeth.2018.07.016]
Abstract
BACKGROUND Many sensorimotor functions are intrinsically rhythmic and are underpinned by neural processes that are functionally distinct from neural responses related to the processing of transient events. EEG frequency tagging is a technique increasingly used in neuroscience to study these processes. It relies on the fact that perceiving and/or producing rhythms generates periodic neural activity that translates into periodic variations of the EEG signal. In the EEG spectrum, these variations appear as peaks localized at the frequency of the rhythm and its harmonics. NEW METHOD Many natural rhythms, such as music or dance, are not strictly periodic and instead show fluctuations of their period over time. Here, we introduce a time-warping method to identify non-strictly-periodic EEG activities in the frequency domain. RESULTS EEG time-warping can be used to characterize the sensorimotor activity related to the performance of self-paced rhythmic finger movements. Furthermore, the method can disentangle the auditory- and movement-related EEG activity produced when participants perform rhythmic movements synchronized to an acoustic rhythm. This is possible because the movement-related activity has different period fluctuations than the auditory-related activity. COMPARISON WITH EXISTING METHODS With the classic frequency-tagging approach, rhythm fluctuations spread the peaks to neighboring frequencies, to the point that they cannot be distinguished from background noise. CONCLUSIONS The proposed time-warping procedure is a simple and effective means to study natural, non-strictly-periodic rhythmic neural processes such as rhythmic movement production, acoustic rhythm perception, and sensorimotor synchronization.
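The core of the NEW METHOD is to resample each cycle, delimited by event markers such as taps or beat positions, onto a fixed number of samples so that a quasi-periodic signal becomes strictly periodic before spectral analysis. A bare-bones sketch (function and argument names are ours, not the paper's implementation):

```python
import numpy as np

def time_warp_epochs(eeg, event_samples, n_per_cycle=100):
    """Resample each inter-event cycle of a 1-D signal onto a fixed number of
    samples, so that a non-strictly-periodic rhythm becomes strictly periodic
    and yields sharp frequency-tagged peaks."""
    warped = []
    for start, stop in zip(event_samples[:-1], event_samples[1:]):
        cycle = eeg[start:stop]
        old_grid = np.linspace(0.0, 1.0, len(cycle))
        new_grid = np.linspace(0.0, 1.0, n_per_cycle)
        warped.append(np.interp(new_grid, old_grid, cycle))
    return np.concatenate(warped)
```

After warping, the spectrum of the concatenated signal concentrates the activity at one cycle per `n_per_cycle` samples and its harmonics, instead of smearing it across neighboring bins.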
Affiliation(s)
- B Chemin
- Institute of NeuroScience (IoNS), System and Cognition Department, Université catholique de Louvain, Belgium; International Laboratory for Brain, Music and Sound Research (BRAMS), Université de Montréal, Canada.
- G Huang
- Institute of NeuroScience (IoNS), System and Cognition Department, Université catholique de Louvain, Belgium; School of Mobile Information Engineering, Sun Yat-Sen University, China
- D Mulders
- Institute of NeuroScience (IoNS), System and Cognition Department, Université catholique de Louvain, Belgium
- A Mouraux
- Institute of NeuroScience (IoNS), System and Cognition Department, Université catholique de Louvain, Belgium

24
Zuk NJ, Carney LH, Lalor EC. Preferred Tempo and Low-Audio-Frequency Bias Emerge From Simulated Sub-cortical Processing of Sounds With a Musical Beat. Front Neurosci 2018; 12:349. [PMID: 29896080] [PMCID: PMC5987030] [DOI: 10.3389/fnins.2018.00349]
Abstract
Prior research has shown that musical beats are salient at the level of the cortex in humans. Yet below the cortex there is considerable sub-cortical processing that could influence beat perception. Some biases, such as a tempo preference and an audio-frequency bias for beat timing, could result from sub-cortical processing. Here, we used models of the auditory nerve and of midbrain-level amplitude-modulation filtering to simulate sub-cortical neural activity in response to various beat-inducing stimuli, and we used the simulated activity to determine the tempo or beat frequency of the music. First, irrespective of the stimulus presented, the preferred tempo was around 100 beats per minute, which is within the range of tempi where tempo discrimination and tapping accuracy are optimal. Second, sub-cortical processing predicted a stronger influence of lower audio frequencies on beat perception. However, the tempo identification algorithm that was optimized for simple stimuli often failed for recordings of music; for music, the most highly synchronized model activity occurred at a multiple of the beat frequency. Bottom-up processes alone are thus insufficient to produce beat-locked activity. Instead, a learned and possibly top-down mechanism that scales the synchronization frequency to derive the beat frequency greatly improves the performance of tempo identification.
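A drastically simplified version of the final step, selecting a tempo within a plausible range from the periodicity of an activity envelope, can be written as an autocorrelation-based tempo picker. Everything below is a toy stand-in for the auditory-nerve and midbrain models used in the paper:

```python
import numpy as np

def estimate_tempo(envelope, fs, bpm_range=(60, 180)):
    """Pick the tempo (in BPM) whose period maximizes the autocorrelation of
    a neural-activity envelope, restricted to a plausible tempo range."""
    env = envelope - envelope.mean()
    ac = np.correlate(env, env, mode='full')[len(env) - 1:]
    lo = int(fs * 60.0 / bpm_range[1])  # shortest candidate period (samples)
    hi = int(fs * 60.0 / bpm_range[0])  # longest candidate period (samples)
    best_lag = lo + int(np.argmax(ac[lo:hi + 1]))
    return 60.0 * fs / best_lag
```

Restricting the search range plays the role of the tempo prior discussed above: without it, the strongest autocorrelation peak can land on a multiple of the beat period, the failure mode the authors report for real music.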
Affiliation(s)
- Nathaniel J. Zuk
- Department of Biomedical Engineering, University of Rochester, Rochester, NY, United States
- Laurel H. Carney
- Department of Biomedical Engineering, University of Rochester, Rochester, NY, United States
- Department of Neuroscience, University of Rochester Medical Center, Rochester, NY, United States
- Edmund C. Lalor
- Department of Biomedical Engineering, University of Rochester, Rochester, NY, United States
- Department of Neuroscience, University of Rochester Medical Center, Rochester, NY, United States
- Del Monte Institute for Neuroscience, University of Rochester Medical Center, Rochester, NY, United States
- Trinity Centre for Bioengineering, Trinity College Dublin, Dublin, Ireland

25
Nozaradan S, Schönwiesner M, Keller PE, Lenc T, Lehmann A. Neural bases of rhythmic entrainment in humans: critical transformation between cortical and lower-level representations of auditory rhythm. Eur J Neurosci 2018; 47:321-332. [PMID: 29356161] [DOI: 10.1111/ejn.13826]
Abstract
The spontaneous ability to entrain to meter periodicities is central to music perception and production across cultures. There is increasing evidence that this ability involves selective neural responses to meter-related frequencies. This phenomenon has been observed in the human auditory cortex, yet it could be the product of evolutionarily older, lower-level properties of brainstem auditory neurons, as suggested by recent recordings from the rodent midbrain. We addressed this question by taking advantage of a new method to simultaneously record human EEG activity originating from cortical and lower-level sources, in the form of slow (< 20 Hz) and fast (> 150 Hz) responses to auditory rhythms. Cortical responses showed increased amplitudes at meter-related frequencies compared to meter-unrelated frequencies, regardless of the prominence of the meter-related frequencies in the modulation spectrum of the rhythmic inputs. In contrast, frequency-following responses showed increased amplitudes at meter-related frequencies only for rhythms with prominent meter-related frequencies in the input, but not for a more complex rhythm requiring more endogenous generation of the meter. This interaction with rhythm complexity suggests that the selective enhancement of meter-related frequencies does not fully rely on subcortical auditory properties but is critically shaped at the cortical level, possibly through functional connections between the auditory cortex and other, movement-related, brain structures. This process of temporal selection would thus enable endogenous and motor entrainment to emerge with substantial flexibility and invariance with respect to the rhythmic input in humans, in contrast with non-human animals.
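The dual-band trick, treating the slow (< 20 Hz) and fast (> 150 Hz) portions of the same recording as putatively cortical and lower-level responses, amounts at its simplest to two zero-phase filters. The cutoffs come from the abstract; the brick-wall FFT filtering below is our own simplification of whatever filter design the authors used:

```python
import numpy as np

def split_slow_fast(eeg, fs, slow_cut=20.0, fast_cut=150.0):
    """Split one EEG channel into a slow (< 20 Hz, cortical-range) component
    and a fast (> 150 Hz, frequency-following-range) component via zero-phase
    spectral masking."""
    spec = np.fft.rfft(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    slow = np.fft.irfft(np.where(freqs < slow_cut, spec, 0), n=len(eeg))
    fast = np.fft.irfft(np.where(freqs > fast_cut, spec, 0), n=len(eeg))
    return slow, fast
```

Each component can then be analyzed separately for meter-related enhancement, which is how the cortical/subcortical dissociation described above is quantified.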
Collapse
Affiliation(s)
- Sylvie Nozaradan
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, Sydney, NSW, 2751, Australia; Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Louvain, Belgium; International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada
| | - Marc Schönwiesner
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; Center for Research on Brain, Language and Music (CRBLM), Montreal, QC, Canada; Faculty of Psychology, Université de Montréal, Montreal, QC, Canada
| | - Peter E Keller
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, Sydney, NSW, 2751, Australia
| | - Tomas Lenc
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, Sydney, NSW, 2751, Australia
| | - Alexandre Lehmann
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; Center for Research on Brain, Language and Music (CRBLM), Montreal, QC, Canada; Faculty of Psychology, Université de Montréal, Montreal, QC, Canada; Otolaryngology Department, Faculty of Medicine, McGill University Hospital, Montreal, QC, Canada
| |
Collapse
|
26
|
Haegens S, Zion Golumbic E. Rhythmic facilitation of sensory processing: A critical review. Neurosci Biobehav Rev 2017; 86:150-165. [PMID: 29223770 DOI: 10.1016/j.neubiorev.2017.12.002] [Citation(s) in RCA: 182] [Impact Index Per Article: 22.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2017] [Revised: 11/02/2017] [Accepted: 12/03/2017] [Indexed: 11/17/2022]
Abstract
Here we review the role of brain oscillations in sensory processing. We examine the idea that neural entrainment of intrinsic oscillations underlies the processing of rhythmic stimuli in the context of simple isochronous rhythms as well as in music and speech. This has been a topic of growing interest over recent years; however, many issues remain highly controversial: how do fluctuations of intrinsic neural oscillations-both spontaneous and entrained to external stimuli-affect perception, and does this occur automatically or can it be actively controlled by top-down factors? Some of the controversy in the literature stems from confounding use of terminology. Moreover, it is not straightforward how theories and findings regarding isochronous rhythms generalize to more complex, naturalistic stimuli, such as speech and music. Here we aim to clarify terminology, and distinguish between different phenomena that are often lumped together as reflecting "neural entrainment" but may actually vary in their mechanistic underpinnings. Furthermore, we discuss specific caveats and confounds related to making inferences about oscillatory mechanisms from human electrophysiological data.
Collapse
Affiliation(s)
- Saskia Haegens
- Department of Neurological Surgery, Columbia University College of Physicians and Surgeons, New York, NY 10032, USA; Centre for Cognitive Neuroimaging, Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, 6500 HB Nijmegen, The Netherlands
| | | |
Collapse
|
27
|
Nozaradan S, Keller PE, Rossion B, Mouraux A. EEG Frequency-Tagging and Input-Output Comparison in Rhythm Perception. Brain Topogr 2017; 31:153-160. [PMID: 29127530 DOI: 10.1007/s10548-017-0605-8] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/30/2017] [Accepted: 10/27/2017] [Indexed: 01/23/2023]
Abstract
The combination of frequency-tagging with electroencephalography (EEG) has recently proved fruitful for understanding the perception of beat and meter in musical rhythm, a common behavior shared by humans of all cultures. EEG frequency-tagging allows the objective measurement of input-output transforms to investigate beat perception, its modulation by exogenous and endogenous factors, development, and neural basis. Recent doubt has been raised about the validity of comparing frequency-domain representations of auditory rhythmic stimuli and corresponding EEG responses, assuming that it implies a one-to-one mapping between the envelope of the rhythmic input and the neural output, and that it neglects the sensitivity of frequency-domain representations to acoustic features making up the rhythms. Here we argue that these elements actually reinforce the strengths of the approach. The obvious fact that acoustic features influence the frequency spectrum of the sound envelope precisely justifies taking into consideration the sounds used to generate a beat percept for interpreting neural responses to auditory rhythms. Most importantly, the many-to-one relationship between rhythmic input and perceived beat actually validates an approach that objectively measures the input-output transforms underlying the perceptual categorization of rhythmic inputs. Hence, provided that a number of potential pitfalls and fallacies are avoided, EEG frequency-tagging to study input-output relationships appears valuable for understanding rhythm perception.
Collapse
Affiliation(s)
- Sylvie Nozaradan
- The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia; Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium; International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia
| | - Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia
| | - Bruno Rossion
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium; Neurology Unit, Centre Hospitalier Régional Universitaire (CHRU) de Nancy, Nancy, France
| | - André Mouraux
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
| |
Collapse
|
28
|
Nozaradan S, Schwartze M, Obermeier C, Kotz SA. Specific contributions of basal ganglia and cerebellum to the neural tracking of rhythm. Cortex 2017; 95:156-168. [DOI: 10.1016/j.cortex.2017.08.015] [Citation(s) in RCA: 60] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2017] [Revised: 07/16/2017] [Accepted: 08/07/2017] [Indexed: 11/29/2022]
|
29
|
Nozaradan S, Mouraux A, Cousineau M. Frequency tagging to track the neural processing of contrast in fast, continuous sound sequences. J Neurophysiol 2017; 118:243-253. [PMID: 28381494 DOI: 10.1152/jn.00971.2016] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/27/2016] [Revised: 03/31/2017] [Accepted: 03/31/2017] [Indexed: 01/23/2023] Open
Abstract
The human auditory system presents a remarkable ability to detect rapid changes in fast, continuous acoustic sequences, as best illustrated in speech and music. However, the neural processing of rapid auditory contrast remains largely unclear, probably due to the lack of methods to objectively dissociate the response components specifically related to the contrast from the other components in response to the sequence of fast continuous sounds. To overcome this issue, we tested a novel use of the frequency-tagging approach allowing contrast-specific neural responses to be tracked based on their expected frequencies. The EEG was recorded while participants listened to 40-s sequences of sounds presented at 8 Hz. A tone or interaural time contrast was embedded every fifth sound (AAAAB), such that a response observed in the EEG at exactly 8 Hz/5 (1.6 Hz) or harmonics should be the signature of contrast processing by neural populations. Contrast-related responses were successfully identified, even in the case of very fine contrasts. Moreover, analysis of the time course of the responses revealed a stable amplitude over repetitions of the AAAAB patterns in the sequence, except for the response to perceptually salient contrasts, which showed a buildup and decay across repetitions of the sounds. Overall, this new combination of frequency-tagging with an oddball design provides a valuable complement to the classic, transient, evoked potentials approach, especially in the context of rapid auditory information. Specifically, we provide objective evidence on the neural processing of contrast embedded in fast, continuous sound sequences.
NEW & NOTEWORTHY Recent theories suggest that the basis of neurodevelopmental auditory disorders such as dyslexia might be an impaired processing of fast auditory changes, highlighting how the encoding of rapid acoustic information is critical for auditory communication. Here, we present a novel electrophysiological approach to capture neural markers of contrasts in fast, continuous tone sequences in humans. Contrast-specific responses were successfully identified, even for very fine contrasts, providing direct insight into the encoding of rapid auditory information.
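The tagging arithmetic of this design (tones at 8 Hz, a deviant every fifth tone, hence a contrast-specific response expected at 8/5 = 1.6 Hz and harmonics) can be illustrated with a toy simulation. This is a minimal sketch with made-up response amplitudes, sampling rate, and duration, not the paper's analysis pipeline:

```python
import cmath

# All numeric values below are illustrative assumptions.
FS = 200           # simulated sampling rate (Hz)
DUR = 10           # seconds of simulated response
SOUND_RATE = 8     # tones presented at 8 Hz, as in the paradigm
ODDBALL_EVERY = 5  # AAAAB: every fifth tone deviant -> 8 / 5 = 1.6 Hz

def simulated_response(contrast):
    """Toy neural response: one unit impulse per tone, extra gain on deviants."""
    x = [0.0] * (FS * DUR)
    for k in range(SOUND_RATE * DUR):
        i = k * FS // SOUND_RATE
        deviant = (k % ODDBALL_EVERY == ODDBALL_EVERY - 1)
        x[i] = 1.0 + (contrast if deviant else 0.0)
    return x

def amplitude_at(x, freq):
    """Single-bin DFT amplitude at `freq` (Hz)."""
    s = sum(v * cmath.exp(-2j * cmath.pi * freq * t / FS)
            for t, v in enumerate(x))
    return abs(s) / len(x)

# A spectral peak at 1.6 Hz appears only when the AAAAB contrast is present;
# the 8 Hz response to the tone stream itself is there either way.
with_contrast = simulated_response(0.5)
no_contrast = simulated_response(0.0)
```

In a real analysis the same logic is applied to the Fourier transform of the recorded EEG, with the amplitude at the 1.6 Hz bin (and its harmonics) taken as the signature of contrast processing.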
Collapse
Affiliation(s)
- Sylvie Nozaradan
- Institute of Neuroscience, Université Catholique de Louvain, Brussels, Belgium; MARCS Institute for Brain, Behavior, and Development, Sydney, Australia; International Laboratory for Brain, Music, and Sound Research (BRAMS), Montreal, Quebec, Canada
| | - André Mouraux
- Institute of Neuroscience, Université Catholique de Louvain, Brussels, Belgium
| | - Marion Cousineau
- International Laboratory for Brain, Music, and Sound Research (Brams), Montreal, Quebec, Canada
| |
Collapse
|
30
|
Tierney A, White-Schwoch T, MacLean J, Kraus N. Individual Differences in Rhythm Skills: Links with Neural Consistency and Linguistic Ability. J Cogn Neurosci 2017; 29:855-868. [PMID: 28129066 DOI: 10.1162/jocn_a_01092] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Durational patterns provide cues to linguistic structure, so variations in rhythm skills may have consequences for language development. Understanding individual differences in rhythm skills, therefore, could help explain variability in language abilities across the population. We investigated the neural foundations of rhythmic proficiency and its relation to language skills in young adults. We hypothesized that rhythmic abilities can be characterized by at least two constructs, which are tied to independent language abilities and neural profiles. Specifically, we hypothesized that rhythm skills that require integration of information across time rely upon the consistency of slow, low-frequency auditory processing, which we measured using the evoked cortical response. On the other hand, we hypothesized that rhythm skills that require fine temporal precision rely upon the consistency of fast, higher-frequency auditory processing, which we measured using the frequency-following response. Performance on rhythm tests aligned with two constructs: rhythm sequencing and synchronization. Rhythm sequencing and synchronization were linked to the consistency of slow cortical and fast frequency-following responses, respectively. Furthermore, whereas rhythm sequencing ability was linked to verbal memory and reading, synchronization ability was linked only to nonverbal auditory temporal processing. Thus, rhythm perception at different time scales reflects distinct abilities, which rely on distinct auditory neural resources. In young adults, slow rhythmic processing makes the more extensive contribution to language skills.
Collapse
|
31
|
Abstract
There is growing interest in whether the motor system plays an essential role in rhythm perception. The motor system is active during the perception of rhythms, but is such motor activity merely a sign of unexecuted motor planning, or does it play a causal role in shaping the perception of rhythm? We present evidence for a causal role of motor planning and simulation, and review theories of internal simulation for beat-based timing prediction. Brain stimulation studies have the potential to conclusively test if the motor system plays a causal role in beat perception and ground theories to their neural underpinnings.
Collapse
Affiliation(s)
- Jessica M Ross
- Cognitive and Information Sciences, University of California, Merced, CA, USA
| | - John R Iversen
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, CA, USA
| | | |
Collapse
|