1. Rathcke T, Smit E, Zheng Y, Canzi M. Perception of temporal structure in speech is influenced by body movement and individual beat perception ability. Atten Percept Psychophys 2024. PMID: 38769276; DOI: 10.3758/s13414-024-02893-8.
Abstract
The subjective experience of time flow in speech deviates from the sound acoustics in substantial ways. The present study focuses on the perceptual tendency to regularize time intervals found in speech but not in other types of sounds with a similar temporal structure. We investigate to what extent individual beat perception ability is responsible for perceptual regularization and if the effect can be eliminated through the involvement of body movement during listening. Participants performed a musical beat perception task and compared spoken sentences to their drumbeat-based versions either after passive listening or after listening and moving along with the beat of the sentences. The results show that the interval regularization prevails in listeners with a low beat perception ability performing a passive listening task and is eliminated in an active listening task involving body movement. Body movement also helped to promote a veridical percept of temporal structure in speech at the group level. We suggest that body movement engages an internal timekeeping mechanism, promoting the fidelity of auditory encoding even in sounds of high temporal complexity and irregularity such as natural speech.
Affiliation(s)
- Tamara Rathcke
- Department of Linguistics, University of Konstanz, Konstanz, 78464, Baden-Württemberg, Germany.
- Eline Smit
- Department of Linguistics, University of Konstanz, Konstanz, 78464, Baden-Württemberg, Germany
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, 2751, NSW, Australia
- Yue Zheng
- Department of Psychology, University of York, York, YO10 5DD, UK
- Department of Hearing Sciences, University of Nottingham, Nottingham, NG7 2RD, UK
- Massimiliano Canzi
- Department of Linguistics, University of Konstanz, Konstanz, 78464, Baden-Württemberg, Germany
2. Paromov D, Moïn-Darbari K, Cedras AM, Maheu M, Bacon BA, Champoux F. Body representation drives auditory spatial perception. iScience 2024; 27:109196. PMID: 38433911; PMCID: PMC10906536; DOI: 10.1016/j.isci.2024.109196.
Abstract
In contrast to the large body of findings confirming the influence of auditory cues on body perception and movement-related activity, the influence of body representation on spatial hearing remains essentially unexplored. Here, we use a disorientation task to assess whether a change in the body's orientation in space could lead to an illusory shift in the localization of a sound source. While most of the participants were initially able to locate the sound source with great precision, they all made substantial errors in judging the position of the same sound source following the body orientation-altering task. These results demonstrate that a change in body orientation can have a significant impact on the auditory processes underlying sound localization. The illusory errors not only confirm the strong connection between the auditory system and the representation of the body in space but also raise questions about the importance of hearing in determining spatial position.
Affiliation(s)
- Daniel Paromov
- Université de Montréal, Montréal, QC, Canada
- Centre de recherche de l’Institut Universitaire de Gériatrie de Montréal, Montréal, QC, Canada
- Karina Moïn-Darbari
- Université de Montréal, Montréal, QC, Canada
- Centre de recherche de l’Institut Universitaire de Gériatrie de Montréal, Montréal, QC, Canada
- Benoit-Antoine Bacon
- Department of Psychology, The University of British Columbia, Vancouver, BC, Canada
- François Champoux
- Université de Montréal, Montréal, QC, Canada
- Centre de recherche de l’Institut Universitaire de Gériatrie de Montréal, Montréal, QC, Canada
3. Lenc T, Peter V, Hooper C, Keller PE, Burnham D, Nozaradan S. Infants show enhanced neural responses to musical meter frequencies beyond low-level features. Dev Sci 2023; 26:e13353. PMID: 36415027; DOI: 10.1111/desc.13353.
Abstract
Music listening often entails spontaneous perception and body movement to a periodic pulse-like meter. There is increasing evidence that this cross-cultural ability relates to neural processes that selectively enhance metric periodicities, even when these periodicities are not prominent in the acoustic stimulus. However, whether these neural processes emerge early in development remains largely unknown. Here, we recorded the electroencephalogram (EEG) of 20 healthy 5- to 6-month-old infants, while they were exposed to two rhythms known to induce the perception of meter consistently across Western adults. One rhythm contained prominent acoustic periodicities corresponding to the meter, whereas the other rhythm did not. Infants showed significantly enhanced representations of meter periodicities in their EEG responses to both rhythms. This effect is unlikely to reflect the tracking of salient acoustic features in the stimulus, as it was observed irrespective of the prominence of meter periodicities in the audio signals. Moreover, as previously observed in adults, the neural enhancement of meter was greater when the rhythm was delivered by low-pitched sounds. Together, these findings indicate that the endogenous enhancement of metric periodicities beyond low-level acoustic features is a neural property that is already present soon after birth. These high-level neural processes could set the stage for internal representations of musical meter that are critical for human movement coordination during rhythmic musical behavior.
RESEARCH HIGHLIGHTS:
- 5- to 6-month-old infants were presented with auditory rhythms that induce the perception of a periodic pulse-like meter in adults.
- Infants showed selective enhancement of EEG activity at meter-related frequencies irrespective of the prominence of these frequencies in the stimulus.
- Responses at meter-related frequencies were boosted when the rhythm was conveyed by bass sounds.
- High-level neural processes that transform rhythmic auditory stimuli into internal meter templates emerge early after birth.
Affiliation(s)
- Tomas Lenc
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Varghese Peter
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- School of Health and Behavioural Sciences, University of the Sunshine Coast, Queensland, Australia
- Caitlin Hooper
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Peter E Keller
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Center for Music in the Brain & Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
- Denis Burnham
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
4. Rosso M, Moens B, Leman M, Moumdjian L. Neural entrainment underpins sensorimotor synchronization to dynamic rhythmic stimuli. Neuroimage 2023; 277:120226. PMID: 37321359; DOI: 10.1016/j.neuroimage.2023.120226.
Abstract
Neural entrainment, defined as the unidirectional synchronization of neural oscillations to an external rhythmic stimulus, is a topic of major interest in neuroscience. Despite broad scientific consensus on its existence, on its pivotal role in sensory and motor processes, and on its fundamental definition, empirical research struggles to quantify it with non-invasive electrophysiology. To date, broadly adopted state-of-the-art methods still fail to capture the dynamics underlying the phenomenon. Here, we present event-related frequency adjustment (ERFA) as a methodological framework to induce and measure neural entrainment in human participants, optimized for multivariate EEG datasets. By applying dynamic phase and tempo perturbations to isochronous auditory metronomes during a finger-tapping task, we analyzed adaptive changes in the instantaneous frequency of entrained oscillatory components during error correction. Spatial filter design allowed us to untangle, from the multivariate EEG signal, perceptual and sensorimotor oscillatory components attuned to the stimulation frequency. Both components dynamically adjusted their frequency in response to perturbations, tracking the stimulus dynamics by slowing down and speeding up the oscillation over time. Source separation revealed that sensorimotor processing enhanced the entrained response, supporting the notion that active engagement of the motor system plays a critical role in processing rhythmic stimuli. In the case of phase shifts, motor engagement was a necessary condition for observing any response, whereas sustained tempo changes induced frequency adjustment even in the perceptual oscillatory component. Although the magnitude of the perturbations was controlled across positive and negative directions, we observed a general bias in the frequency adjustments towards positive changes, which points to intrinsic dynamics constraining neural entrainment. We conclude that our findings provide compelling evidence for neural entrainment as a mechanism underlying overt sensorimotor synchronization, and that our methodology offers a paradigm and a measure for quantifying its oscillatory dynamics by means of non-invasive electrophysiology, rigorously informed by the fundamental definition of entrainment.
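The central quantity in this abstract is the instantaneous frequency of an entrained oscillatory component, tracked as it adjusts to a perturbation. As a much-simplified sketch (not the ERFA pipeline itself; the signal, rates, and parameter values below are invented for illustration), instantaneous frequency can be estimated from the phase of the analytic signal:

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_frequency(x, fs):
    # Instantaneous frequency (Hz) = derivative of the analytic signal's phase.
    phase = np.unwrap(np.angle(hilbert(x)))
    return np.diff(phase) * fs / (2 * np.pi)

# An oscillation entrained at 2 Hz that speeds up to 2.2 Hz half-way through,
# mimicking frequency adjustment after a sustained tempo change.
fs = 200
t = np.arange(0, 5, 1 / fs)
seg1 = np.sin(2 * np.pi * 2.0 * t)
seg2 = np.sin(2 * np.pi * 2.2 * t + 2 * np.pi * 2.0 * t[-1])  # keep phase roughly continuous
x = np.concatenate([seg1, seg2])
f_inst = instantaneous_frequency(x, fs)
# Away from the edges, f_inst hovers near 2.0 Hz before the change and near 2.2 Hz after it.
```

In real multichannel EEG, this readout would be applied after spatial filtering isolates the entrained component, as the abstract describes.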
Affiliation(s)
- Mattia Rosso
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium; Université de Lille, ULR 4072 - PSITEC - Psychologie: Interactions, Temps, Emotions, Cognition, Lille, France.
- Bart Moens
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium
- Marc Leman
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium
- Lousin Moumdjian
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium; REVAL Rehabilitation Research Center, Faculty of Rehabilitation Sciences, Hasselt University, Hasselt, Belgium; UMSC Hasselt, Pelt, Belgium
5. Foster Vander Elst O, Foster NHD, Vuust P, Keller PE, Kringelbach ML. The Neuroscience of Dance: A Conceptual Framework and Systematic Review. Neurosci Biobehav Rev 2023; 150:105197. PMID: 37100162; DOI: 10.1016/j.neubiorev.2023.105197.
Abstract
Ancient and culturally universal, dance pervades many areas of life and has multiple benefits. In this article, we provide a conceptual framework and systematic review, as a guide for researching the neuroscience of dance. We identified relevant articles following PRISMA guidelines, and summarised and evaluated all original results. We identified avenues for future research in: the interactive and collective aspects of dance; groove; dance performance; dance observation; and dance therapy. Furthermore, the interactive and collective aspects of dance constitute a vital part of the field but have received almost no attention from a neuroscientific perspective so far. Dance and music engage overlapping brain networks, including common regions involved in perception, action, and emotion. In music and dance, rhythm, melody, and harmony are processed in an active, sustained pleasure cycle giving rise to action, emotion, and learning, led by activity in specific hedonic brain networks. The neuroscience of dance is an exciting field, which may yield information concerning links between psychological processes and behaviour, human flourishing, and the concept of eudaimonia.
Affiliation(s)
- Olivia Foster Vander Elst
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark; Centre for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, UK.
- Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark
- Peter E Keller
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark; The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Australia
- Morten L Kringelbach
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark; Centre for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, UK; Department of Psychiatry, University of Oxford, UK
6. Slis A, Savariaux C, Perrier P, Garnier M. Rhythmic tapping difficulties in adults who stutter: A deficit in beat perception, motor execution, or sensorimotor integration? PLoS One 2023; 18:e0276691. PMID: 36735662; DOI: 10.1371/journal.pone.0276691.
Abstract
OBJECTIVES: The study aims to better understand the rhythmic abilities of people who stutter and to identify which processes are potentially impaired in this population: (1) beat perception and reproduction; (2) the execution of movements, in particular their initiation; (3) sensorimotor integration.
MATERIAL AND METHOD: The finger tapping behavior of 16 adults who stutter (PWS) was compared with that of 16 matched controls (PNS) in five rhythmic tasks of varying complexity: three synchronization tasks (a simple 1:1 isochronous pattern, a complex non-isochronous pattern, and a 4 tap:1 beat isochronous pattern), a reaction task to an aperiodic and unpredictable pattern, and a reproduction task of an isochronous pattern after passive listening.
RESULTS: PWS were able to reproduce an isochronous pattern on their own, without external auditory stimuli, with accuracy similar to that of PNS, but with increased variability. This group difference in variability was observed immediately after passive listening, without prior motor engagement, and was neither enhanced nor reduced after several seconds of tapping. Although PWS showed increased tapping variability in the reproduction task as well as in the synchronization tasks, this timing variability did not correlate significantly with the variability in reaction times or tapping force. Compared to PNS, PWS exhibited larger negative mean asynchronies and increased synchronization variability in the synchronization tasks. These group differences were not affected by beat hierarchy (i.e., "strong" vs. "weak" beats), pattern complexity (non-isochronous vs. isochronous), or the presence versus absence of an external auditory stimulus (1:1 vs. 1:4 isochronous pattern). Differences between PWS and PNS were neither enhanced nor reduced with sensorimotor learning over the first taps of a synchronization task.
CONCLUSION: Our observations support the hypothesis of a deficit in the coupling of neuronal oscillators in the production, but not the perception, of rhythmic patterns, and a longer delay in multi-modal feedback processing for PWS.
7. O’Connell SR, Nave-Blodgett JE, Wilson GE, Hannon EE, Snyder JS. Elements of musical and dance sophistication predict musical groove perception. Front Psychol 2022; 13:998321. PMID: 36467160; PMCID: PMC9712211; DOI: 10.3389/fpsyg.2022.998321.
Abstract
Listening to groovy music is an enjoyable experience and a common human behavior in some cultures. Specifically, many listeners agree that songs they find more familiar and pleasurable are more likely to induce the experience of musical groove. While the pleasurable and dance-inducing effects of musical groove are omnipresent, we know less about how subjective feelings toward music, individual musical or dance experiences, or more objective musical perception abilities are correlated with the way we experience groove. Therefore, the present study aimed to evaluate how musical and dance sophistication relates to musical groove perception. One hundred and twenty-four participants completed an online study during which they rated 20 songs, considered high- or low-groove, and completed the Goldsmiths Musical Sophistication Index, the Goldsmiths Dance Sophistication Index, the Beat and Meter Sensitivity Task, and a modified short version of the Profile for Music Perception Skills. Our results reveal that measures of perceptual abilities, musical training, and social dancing predicted the difference in groove ratings between high- and low-groove music. Overall, these findings support the notion that listeners' individual experiences and predispositions may shape their perception of musical groove, although other causal directions are also possible. This research helps elucidate the correlates and possible causes of musical groove perception in a wide range of listeners.
Affiliation(s)
- Samantha R. O’Connell
- Caruso Department of Otolaryngology, Head and Neck Surgery, Keck School of Medicine of USC, University of Southern California, Los Angeles, CA, United States
- Grace E. Wilson
- Department of Psychology, University of Nevada, Las Vegas, NV, United States
- Erin E. Hannon
- Department of Psychology, University of Nevada, Las Vegas, NV, United States
- Joel S. Snyder
- Department of Psychology, University of Nevada, Las Vegas, NV, United States
8. Peter V, van Ommen S, Kalashnikova M, Mazuka R, Nazzi T, Burnham D. Language specificity in cortical tracking of speech rhythm at the mora, syllable, and foot levels. Sci Rep 2022; 12:13477. PMID: 35931787; PMCID: PMC9356059; DOI: 10.1038/s41598-022-17401-x.
Abstract
Recent research shows that adults' neural oscillations track the rhythm of the speech signal. However, the extent to which this tracking is driven by the acoustics of the signal or by language-specific processing remains unknown. Here, adult native listeners of three rhythmically different languages (English, French, Japanese) were compared on their cortical tracking of speech envelopes synthesized in their three native languages, which allowed for coding at each language's dominant rhythmic unit: respectively the foot (2.5 Hz), syllable (5 Hz), or mora (10 Hz). The three language groups were also tested with a sequence in a non-native language, Polish, and a non-speech vocoded equivalent, to investigate possible differential speech/non-speech processing. The results first showed that cortical tracking was most prominent at 5 Hz (the syllable rate) for all three groups, with the French listeners showing enhanced tracking at 5 Hz compared to the English and Japanese groups. Second, across groups, there were no differences in responses for speech versus non-speech at 5 Hz (the syllable rate), but there was better tracking for speech than for non-speech at 10 Hz (not the syllable rate). Together, these results provide evidence for both language-general and language-specific influences on cortical tracking.
Affiliation(s)
- Varghese Peter
- MARCS Institute for Brain Behaviour and Development, Western Sydney University, Penrith, NSW, Australia
- School of Health and Behavioural Sciences, University of the Sunshine Coast, Sippy Downs, Australia
- Sandrien van Ommen
- Integrative Neuroscience and Cognition Center, CNRS-Université Paris Cité, Paris, France
- Neurosciences Fondamentales, University of Geneva, Geneva, Switzerland
- Marina Kalashnikova
- MARCS Institute for Brain Behaviour and Development, Western Sydney University, Penrith, NSW, Australia
- BCBL, Basque Center on Cognition, Brain and Language, San Sebastian, Guipuzcoa, Spain
- IKERBASQUE, Basque Foundation for Science, Bilbao, Bizcaya, Spain
- Reiko Mazuka
- Laboratory for Language Development, RIKEN Center for Brain Science, Saitama, Japan
- Department of Psychology and Neuroscience, Duke University, Durham, NC, USA
- Thierry Nazzi
- Integrative Neuroscience and Cognition Center, CNRS-Université Paris Cité, Paris, France
- Denis Burnham
- MARCS Institute for Brain Behaviour and Development, Western Sydney University, Penrith, NSW, Australia
9. Kabdebon C, Fló A, de Heering A, Aslin R. The power of rhythms: how steady-state evoked responses reveal early neurocognitive development. Neuroimage 2022; 254:119150. PMID: 35351649; PMCID: PMC9294992; DOI: 10.1016/j.neuroimage.2022.119150.
Abstract
Electroencephalography (EEG) is a non-invasive and painless technique for recording cerebral activity, particularly well suited to studying young infants, and it allows cerebral responses to be inspected in a constellation of different ways. Of particular interest to developmental cognitive neuroscientists is the use of rhythmic stimulation and the analysis of steady-state evoked potentials (SS-EPs) - an approach also known as frequency tagging. In this paper we draw on the existing SS-EP early developmental literature to illustrate the important advantages of SS-EPs for studying the developing brain. We argue that (1) the technique is both objective and predictive: the response is expected at the stimulation frequency (and/or higher harmonics); (2) its high spectral specificity makes the computed responses particularly robust to artifacts; and (3) the technique allows for short and efficient recordings, compatible with infants' limited attentional spans. We additionally provide an overview of some recent, inspiring uses of the SS-EP technique in adult research, in order to argue that (4) the SS-EP approach can be implemented creatively to target a wide range of cognitive and neural processes. For all these reasons, we expect SS-EPs to play an increasing role in the understanding of early cognitive processes. Finally, we provide practical guidelines for implementing and analyzing SS-EP studies.
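To make the frequency-tagging logic concrete (the response is expected at the stimulation frequency and its harmonics, with high spectral specificity), here is a minimal, self-contained sketch of how an SS-EP amplitude is commonly contrasted against neighbouring noise bins. The simulated signal, sampling rate, and the `frequency_tagging_snr` helper are illustrative assumptions, not code from the paper.

```python
import numpy as np

def frequency_tagging_snr(signal, fs, f_tag, n_neighbors=10):
    # Amplitude spectrum via FFT; SS-EP responses are read out at the
    # stimulation frequency (and harmonics) relative to surrounding bins.
    n = len(signal)
    amps = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, 1 / fs)
    idx = int(np.argmin(np.abs(freqs - f_tag)))
    # Noise estimate: mean of neighbouring bins, skipping the bins
    # immediately adjacent to the target to avoid spectral leakage.
    noise = np.r_[amps[idx - n_neighbors - 2:idx - 1],
                  amps[idx + 2:idx + n_neighbors + 3]]
    return amps[idx] / noise.mean()

# Simulated recording: a weak 5 Hz steady-state response buried in noise.
fs, f_tag = 250, 5.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * f_tag * t) + rng.normal(0, 1, t.size)
snr = frequency_tagging_snr(eeg, fs, f_tag)  # well above 1 at the tagged rate
```

The same readout applied at harmonics (2·f_tag, 3·f_tag, ...) is what makes the approach objective and predictive, as the abstract argues.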
Affiliation(s)
- Claire Kabdebon
- Laboratoire de Sciences Cognitives et Psycholinguistique, Département d'études cognitives, ENS, EHESS, CNRS, PSL University, Paris, France; Haskins Laboratories, New Haven, CT, USA.
- Ana Fló
- Cognitive Neuroimaging Unit, CNRS ERL 9003, INSERM U992, CEA, Université Paris-Saclay, NeuroSpin Center, Gif/Yvette, France
- Adélaïde de Heering
- Center for Research in Cognition & Neuroscience (CRCN), Université libre de Bruxelles (ULB), Brussels, Belgium
- Richard Aslin
- Haskins Laboratories, New Haven, CT, USA; Department of Psychology, Yale University, New Haven, CT, USA
10. Flaten E, Marshall SA, Dittrich A, Trainor L. Evidence for Top-down Meter Perception in Infancy as Shown by Primed Neural Responses to an Ambiguous Rhythm. Eur J Neurosci 2022; 55:2003-2023. PMID: 35445451; DOI: 10.1111/ejn.15671.
Abstract
From auditory rhythm patterns, listeners extract the underlying steady beat and perceptually group beats to form meters. While previous studies show that infants discriminate different auditory meters, it remains unknown whether they can maintain (imagine) a metrical interpretation of an ambiguous rhythm through top-down processes. We investigated this via electroencephalographic mismatch responses. We primed 6-month-old infants (N = 24) to hear a 6-beat ambiguous rhythm either in duple meter (n = 13) or in triple meter (n = 11) through loudness accents on every second or every third beat. Periods of priming were inserted before sequences of the ambiguous unaccented rhythm. To elicit mismatch responses, occasional pitch deviants occurred on either beat 4 (strong in triple meter; weak in duple) or beat 5 (strong in duple; weak in triple) of the unaccented trials. At frontal left sites, we found a significant interaction between beat and priming group in the predicted direction. Post-hoc analyses showed that mismatch response amplitudes were significantly larger for beat 5 in the duple- than in the triple-primed group (p = .047) and non-significantly larger for beat 4 in the triple- than in the duple-primed group. Further, amplitudes were generally larger in infants with musically experienced parents. At frontal right sites, mismatch responses were generally larger for the duple than for the triple group, which may reflect a processing advantage for duple meter. These results indicate that infants can impose a top-down, internally generated meter on ambiguous auditory rhythms, an ability that would aid early language and music learning.
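For readers unfamiliar with the dependent measure, a mismatch response is quantified as a deviant-minus-standard difference wave, averaged over a latency window. The toy data below sketch that computation; the epoch counts, latency window, and simulated deflection are all hypothetical, not values from this study.

```python
import numpy as np

def mismatch_response(epochs, labels, t, window=(0.1, 0.25)):
    # Deviant-minus-standard difference wave, plus its mean amplitude
    # over a latency window (a schematic mismatch-response readout).
    std = epochs[labels == 0].mean(axis=0)
    dev = epochs[labels == 1].mean(axis=0)
    diff = dev - std
    mask = (t >= window[0]) & (t <= window[1])
    return diff, float(diff[mask].mean())

# Toy single-channel epochs: deviants carry an extra negative deflection ~175 ms.
fs = 250
t = np.arange(-0.1, 0.4, 1 / fs)
rng = np.random.default_rng(2)
n_std, n_dev = 80, 20
deflection = -2.0 * np.exp(-((t - 0.175) ** 2) / (2 * 0.03 ** 2))
epochs = np.vstack([rng.normal(0, 1, (n_std, t.size)),
                    rng.normal(0, 1, (n_dev, t.size)) + deflection])
labels = np.r_[np.zeros(n_std), np.ones(n_dev)]
diff, amp = mismatch_response(epochs, labels, t)  # amp is clearly negative here
```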
Affiliation(s)
- Erica Flaten
- Department of Psychology, Neuroscience and Behaviour, McMaster University
- Sara A Marshall
- Department of Psychology, Neuroscience and Behaviour, McMaster University
- Angela Dittrich
- Department of Psychology, Neuroscience and Behaviour, McMaster University
- Laurel Trainor
- Department of Psychology, Neuroscience and Behaviour, McMaster University
- McMaster Institute for Music and the Mind, McMaster University
- Rotman Research Institute, Baycrest Hospital, Toronto, ON, Canada
11. Lapenta OM, Keller PE, Nozaradan S, Varlet M. Lateralised dynamic modulations of corticomuscular coherence associated with bimanual learning of rhythmic patterns. Sci Rep 2022; 12:6271. PMID: 35428836; PMCID: PMC9012795; DOI: 10.1038/s41598-022-10342-5.
Abstract
Human movements are spontaneously attracted to auditory rhythms, triggering an automatic activation of the motor system, a phenomenon central to music perception and production. Cortico-muscular coherence (CMC) in the theta, alpha, beta, and gamma frequency bands has been used as an index of the synchronisation between cortical motor regions and the muscles. Here we investigated how learning to produce a bimanual rhythmic pattern composed of low- and high-pitch sounds affects CMC in the beta frequency band. Electroencephalography (EEG) and electromyography (EMG) from the left and right First Dorsal Interosseus and Flexor Digitorum Superficialis muscles were concurrently recorded while participants maintained constant pressure on a force sensor held between the thumb and index finger and listened to the rhythmic pattern, before and after a bimanual training session. During the training, participants learnt to produce the rhythmic pattern, guided by visual cues, by pressing the force sensors with their left or right hand to produce the low- and high-pitch sounds, respectively. Results revealed no changes after training in overall beta CMC or beta oscillation amplitude, nor in the correlation between the left and right sides for EEG and EMG separately. However, correlation analyses indicated that left- and right-hand beta EEG-EMG coherence were positively correlated over time before training but became uncorrelated after training. This suggests that learning to bimanually produce a rhythmic musical pattern reinforces lateralised and segregated cortico-muscular communication.
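Cortico-muscular coherence is a standard spectral quantity: the magnitude-squared coherence between an EEG and an EMG channel, inspected in a frequency band of interest. As a hedged sketch with simulated signals and illustrative parameters only (not the paper's pipeline), beta-band CMC can be estimated along these lines with `scipy.signal.coherence`:

```python
import numpy as np
from scipy.signal import coherence

# Simulated EEG and EMG sharing a 20 Hz (beta) component plus independent
# noise -- the situation in which CMC peaks in the beta band.
fs = 500
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(1)
common = np.sin(2 * np.pi * 20 * t)       # shared cortical drive
eeg = common + rng.normal(0, 1, t.size)
emg = common + rng.normal(0, 1, t.size)

# Welch-averaged magnitude-squared coherence between the two channels.
f, cxy = coherence(eeg, emg, fs=fs, nperseg=1024)
beta = (f >= 15) & (f <= 30)
peak_f = f[beta][np.argmax(cxy[beta])]    # coherence peaks near 20 Hz
```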
Affiliation(s)
- Olivia Morgan Lapenta
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Center for Investigation in Psychology, University of Minho, Braga, Portugal
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Sylvie Nozaradan
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Institute of Neuroscience, Catholic University of Louvain, Woluwe-Saint-Lambert, Belgium
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- School of Psychology, Western Sydney University, Penrith, Australia
12.
Abstract
PURPOSE Humans have a near-automatic tendency to entrain their motor actions to rhythms in the environment. Entrainment has been hypothesized to play an important role in processing naturalistic stimuli, such as speech and music, which have intrinsically rhythmic properties. Here, we studied two facets of entraining one's rhythmic motor actions to an external stimulus: (a) synchronized finger tapping to auditory rhythmic stimuli and (b) memory-paced reproduction of a previously heard rhythm. METHOD Using modifications of the Synchronization-Continuation tapping paradigm, we studied how these two rhythmic behaviors were affected by different stimulus and task features. We tested synchronization and memory-paced tapping for a broad range of rates, from stimulus onset asynchrony of subsecond to suprasecond, both for strictly isochronous tone sequences and for rhythmic speech stimuli (counting from 1 to 10), which are more ecological yet less isochronous. We also asked what role motor engagement plays in forming a stable internal representation for rhythms and guiding memory-paced tapping. RESULTS AND CONCLUSIONS Our results show that individuals can flexibly synchronize their motor actions to a very broad range of rhythms. However, this flexibility does not extend to memory-paced tapping, which is accurate only in a narrower range of rates, around 1.5 Hz. This pattern suggests that intrinsic rhythmic defaults in the auditory and/or motor system influence the internal representation of rhythms, in the absence of an external pacemaker. Interestingly, memory-paced tapping for speech rhythms and simple tone sequences shared similar "optimal rates," although with reduced accuracy, suggesting that internal constraints on rhythmic entrainment generalize to more ecological stimuli. Last, we found that actively synchronizing to tones versus passively listening to them led to more accurate memory-paced tapping performance, which emphasizes the importance of action-perception interactions in forming stable entrainment to external rhythms.
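The two behavioral measures in this paradigm, tap-stimulus asynchrony during paced tapping and the produced rate during memory-paced continuation, can be sketched in Python. This is an illustrative sketch under invented parameter values, not the authors' analysis code; the function names and the toy trial are hypothetical.

```python
import numpy as np

def paced_asynchrony(tap_times, stim_times):
    """Mean signed asynchrony in seconds: negative = taps lead the stimulus."""
    taps = np.asarray(tap_times, dtype=float)
    # Pair each stimulus onset with the nearest tap, then average the offsets.
    asyncs = [taps[np.argmin(np.abs(taps - s))] - s for s in stim_times]
    return float(np.mean(asyncs))

def memory_paced_rate(tap_times):
    """Produced rate (Hz) from the median inter-tap interval."""
    itis = np.diff(np.asarray(tap_times, dtype=float))
    return 1.0 / float(np.median(itis))

# Toy trial: pacing stimulus at 1.5 Hz (~667 ms SOA); taps lead by ~30 ms,
# then continue slightly fast once the pacing stops.
soa = 1 / 1.5
stims = np.arange(10) * soa
taps_paced = stims - 0.03
taps_continuation = taps_paced[-1] + np.arange(1, 11) * (soa * 0.95)

print(round(paced_asynchrony(taps_paced, stims), 3))   # -0.03
print(round(memory_paced_rate(taps_continuation), 2))  # 1.58
```

The small negative mean asynchrony in the toy data mimics the classic anticipation tendency in paced tapping, while the continuation rate drifting above the stimulus rate is the kind of deviation the memory-paced condition is designed to measure.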
Affiliation(s)
- Anat Kliger Amrani
- The Leslie and Susan Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat Gan, Israel
- Elana Zion Golumbic
- The Leslie and Susan Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat Gan, Israel
13
Fiveash A, Burger B, Canette LH, Bedoin N, Tillmann B. When Visual Cues Do Not Help the Beat: Evidence for a Detrimental Effect of Moving Point-Light Figures on Rhythmic Priming. Front Psychol 2022; 13:807987. [PMID: 35185727] [PMCID: PMC8855071] [DOI: 10.3389/fpsyg.2022.807987]
Abstract
Rhythm perception involves strong auditory-motor connections that can be enhanced with movement. However, it is unclear whether just seeing someone moving to a rhythm can enhance auditory-motor coupling, resulting in stronger entrainment. Rhythmic priming studies show that presenting regular rhythms before naturally spoken sentences can enhance grammaticality judgments compared to irregular rhythms or other baseline conditions. The current study investigated whether introducing a point-light figure moving in time with regular rhythms could enhance the rhythmic priming effect. Three experiments revealed that the addition of a visual cue did not benefit rhythmic priming in comparison to auditory conditions with a static image. In Experiment 1 (27 7–8-year-old children), grammaticality judgments were poorer after audio-visual regular rhythms (with a bouncing point-light figure) compared to auditory-only regular rhythms. In Experiments 2 (31 adults) and 3 (31 different adults), there was no difference in grammaticality judgments after audio-visual regular rhythms compared to auditory-only irregular rhythms for either a bouncing point-light figure (Experiment 2) or a swaying point-light figure (Experiment 3). Comparison of the observed performance with previous data suggested that the audio-visual component removed the regular prime benefit. These findings suggest that the visual cues used in this study do not enhance rhythmic priming and could hinder the effect by potentially creating a dual-task situation. In addition, individual differences in sensory-motor and social scales of music reward influenced the effect of the visual cue. Implications for future audio-visual experiments aiming to enhance beat processing, and the importance of individual differences, are discussed.
Affiliation(s)
- Anna Fiveash
- Lyon Neuroscience Research Center, CNRS, UMR 5292, INSERM, U1028, Lyon, France
- University of Lyon 1, Lyon, France
- Birgitta Burger
- Institute for Systematic Musicology, University of Hamburg, Hamburg, Germany
- Laure-Hélène Canette
- Lyon Neuroscience Research Center, CNRS, UMR 5292, INSERM, U1028, Lyon, France
- University of Lyon 1, Lyon, France
- LEAD-CNRS UMR 5022, University of Burgundy, F-21000 Dijon, France
- Nathalie Bedoin
- Lyon Neuroscience Research Center, CNRS, UMR 5292, INSERM, U1028, Lyon, France
- University of Lyon 1, Lyon, France
- University of Lyon 2, Lyon, France
- Barbara Tillmann
- Lyon Neuroscience Research Center, CNRS, UMR 5292, INSERM, U1028, Lyon, France
- University of Lyon 1, Lyon, France
14
Cracco E, Lee H, van Belle G, Quenon L, Haggard P, Rossion B, Orgs G. EEG Frequency Tagging Reveals the Integration of Form and Motion Cues into the Perception of Group Movement. Cereb Cortex 2021; 32:2843-2857. [PMID: 34734972] [PMCID: PMC9247417] [DOI: 10.1093/cercor/bhab385]
Abstract
The human brain has dedicated mechanisms for processing other people’s movements. Previous research has revealed how these mechanisms contribute to perceiving the movements of individuals but has left open how we perceive groups of people moving together. Across three experiments, we test whether movement perception depends on the spatiotemporal relationships among the movements of multiple agents. In Experiment 1, we combine EEG frequency tagging with apparent human motion and show that posture and movement perception can be dissociated at harmonically related frequencies of stimulus presentation. We then show that movement but not posture processing is enhanced when observing multiple agents move in synchrony. Movement processing was strongest for fluently moving synchronous groups (Experiment 2) and was perturbed by inversion (Experiment 3). Our findings suggest that processing group movement relies on binding body postures into movements and individual movements into groups. Enhanced perceptual processing of movement synchrony may form the basis for higher order social phenomena such as group alignment and its social consequences.
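The frequency-tagging logic used here, in which periodic stimulation elicits spectral peaks at the presentation frequency and its harmonics, is commonly quantified by subtracting from each frequency bin the mean amplitude of its neighboring bins, so stimulus-locked peaks stand out from broadband noise. The sketch below illustrates that general analysis idea on synthetic data; the bin counts, the 1.2 Hz tag, and all names are assumptions, not values from this study.

```python
import numpy as np

def noise_subtracted_spectrum(signal, fs, n_neighbors=5, gap=1):
    """Amplitude spectrum with the local noise floor (neighboring bins) removed."""
    amp = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    corrected = np.empty_like(amp)
    for i in range(len(amp)):
        # Neighboring bins on each side, skipping `gap` bins adjacent to i.
        lo = np.arange(max(i - gap - n_neighbors, 0), max(i - gap, 0))
        hi = np.arange(i + gap + 1, min(i + gap + 1 + n_neighbors, len(amp)))
        neighbors = np.concatenate([amp[lo], amp[hi]])
        corrected[i] = amp[i] - (neighbors.mean() if neighbors.size else 0.0)
    return freqs, corrected

# Toy signal: a 1.2 Hz "tagged" response plus its 2nd harmonic, buried in noise.
rng = np.random.default_rng(0)
fs, dur = 250, 40
t = np.arange(fs * dur) / fs
eeg = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 2.4 * t)
eeg += rng.normal(0.0, 1.0, eeg.size)

freqs, spec = noise_subtracted_spectrum(eeg, fs)
print(freqs[np.argmax(spec)])  # peak recovered at 1.2 Hz
```

With a 40 s epoch the frequency resolution is 0.025 Hz, so the tag and its harmonic fall exactly on spectral bins; that choice of epoch length is part of what makes frequency tagging robust.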
Affiliation(s)
- Emiel Cracco
- Department of Experimental Psychology, Ghent University, 9000 Ghent, Belgium
- Haeeun Lee
- Department of Psychology, Goldsmiths, University of London, SE14 6NW London, UK
- Goedele van Belle
- Psychological Sciences Research Institute, Université Catholique de Louvain, 1340 Ottignies-Louvain-la-Neuve, Belgium
- Lisa Quenon
- Institute of Neuroscience, Université Catholique de Louvain, 1000 Brussels, Belgium
- Patrick Haggard
- Institute of Cognitive Neuroscience, University College London, WC1N 3AZ London, UK
- Bruno Rossion
- Université de Lorraine, CNRS, CRAN, F-54000 Nancy, France
- CHRU-Nancy, Service de Neurologie, F-54000 Nancy, France
- Guido Orgs
- Department of Psychology, Goldsmiths, University of London, SE14 6NW London, UK
15
Lenc T, Merchant H, Keller PE, Honing H, Varlet M, Nozaradan S. Mapping between sound, brain and behaviour: four-level framework for understanding rhythm processing in humans and non-human primates. Philos Trans R Soc Lond B Biol Sci 2021; 376:20200325. [PMID: 34420381] [PMCID: PMC8380981] [DOI: 10.1098/rstb.2020.0325]
Abstract
Humans perceive and spontaneously move to one or several levels of periodic pulses (a meter, for short) when listening to musical rhythm, even when the sensory input does not provide prominent periodic cues to their temporal location. Here, we review a multi-levelled framework for understanding how external rhythmic inputs are mapped onto internally represented metric pulses. This mapping is studied using an approach to quantify and directly compare representations of metric pulses in signals corresponding to sensory inputs, neural activity and behaviour (typically body movement). Based on this approach, recent empirical evidence can be drawn together into a conceptual framework that unpacks the phenomenon of meter into four levels. Each level highlights specific functional processes that critically enable and shape the mapping from sensory input to internal meter. We discuss the nature, constraints and neural substrates of these processes, starting with fundamental mechanisms investigated in macaque monkeys that enable basic forms of mapping between simple rhythmic stimuli and internally represented metric pulse. We propose that human evolution has gradually built a robust and flexible system upon these fundamental processes, allowing more complex levels of mapping to emerge in musical behaviours. This approach opens promising avenues to understand the many facets of rhythmic behaviours across individuals and species. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
Affiliation(s)
- Tomas Lenc
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- Hugo Merchant
- Instituto de Neurobiologia, UNAM, Campus Juriquilla, Querétaro 76230, Mexico
- Peter E. Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Henkjan Honing
- Amsterdam Brain and Cognition (ABC), Institute for Logic, Language and Computation (ILLC), University of Amsterdam, Amsterdam 1090 GE, The Netherlands
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- School of Psychology, Western Sydney University, Penrith, New South Wales 2751, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
16
Møller C, Stupacher J, Celma-Miralles A, Vuust P. Beat perception in polyrhythms: Time is structured in binary units. PLoS One 2021; 16:e0252174. [PMID: 34415911] [PMCID: PMC8378699] [DOI: 10.1371/journal.pone.0252174]
Abstract
In everyday life, we group and subdivide time to understand the sensory environment surrounding us. Organizing time in units, such as diurnal rhythms, phrases, and beat patterns, is fundamental to behavior, speech, and music. When listening to music, our perceptual system extracts and nests rhythmic regularities to create a hierarchical metrical structure that enables us to predict the timing of the next events. Foot tapping and head bobbing to musical rhythms are observable evidence of this process. In the special case of polyrhythms, at least two metrical structures compete to become the reference for these temporal regularities, rendering several possible beats with which we can synchronize our movements. While there is general agreement that tempo, pitch, and loudness influence beat perception in polyrhythms, we focused on the yet neglected influence of beat subdivisions, i.e., the least common denominator of a polyrhythm ratio. In three online experiments, 300 participants listened to a range of polyrhythms and tapped their index fingers in time with the perceived beat. The polyrhythms consisted of two simultaneously presented isochronous pulse trains with different ratios (2:3, 2:5, 3:4, 3:5, 4:5, 5:6) and different tempi. For ratios 2:3 and 3:4, we additionally manipulated the pitch of the pulse trains. Results showed a highly robust influence of subdivision grouping on beat perception. This was manifested as a propensity towards beats that are subdivided into two or four equally spaced units, as opposed to beats with three or more complex groupings of subdivisions. Additionally, lower pitched pulse trains were more often perceived as the beat. Our findings suggest that subdivisions, not beats, are the basic unit of beat perception, and that the principle underlying the binary grouping of subdivisions reflects a propensity towards simplicity. This preference for simple grouping is widely applicable to human perception and cognition of time.
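The stimulus construction described above can be made concrete: a p:q polyrhythm is two isochronous pulse trains sharing a common cycle, and the subdivision grid the authors focus on has lcm(p, q) equally spaced slots per cycle (6 for 2:3, 12 for 3:4, and so on). A minimal sketch; the function names and the 1.2 s cycle duration are invented for illustration, not taken from the study.

```python
from math import lcm

def polyrhythm_onsets(p, q, cycle_s):
    """Onset times (s) of both pulse trains within one cycle of a p:q polyrhythm."""
    train_a = [i * cycle_s / p for i in range(p)]
    train_b = [i * cycle_s / q for i in range(q)]
    return train_a, train_b

def subdivision_count(p, q):
    """Equally spaced subdivision slots per cycle (least common multiple)."""
    return lcm(p, q)

a, b = polyrhythm_onsets(3, 2, cycle_s=1.2)
print([round(x, 3) for x in a])   # [0.0, 0.4, 0.8]
print([round(x, 3) for x in b])   # [0.0, 0.6]
print(subdivision_count(3, 2))    # 6
```

Note how only the first slot of each cycle is shared by both trains; a perceived beat amounts to selecting a subset of the lcm(p, q) subdivision slots, which is where the binary-grouping preference reported above comes in.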
Affiliation(s)
- Cecilie Møller
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
- Jan Stupacher
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
- Alexandre Celma-Miralles
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
- Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
17
Tichko P, Kim JC, Large EW. Bouncing the network: A dynamical systems model of auditory-vestibular interactions underlying infants' perception of musical rhythm. Dev Sci 2021; 24:e13103. [PMID: 33570778] [DOI: 10.1111/desc.13103]
Abstract
Previous work suggests that auditory-vestibular interactions, which emerge during bodily movement to music, can influence the perception of musical rhythm. In a seminal study on the ontogeny of musical rhythm, Phillips-Silver and Trainor (2005) found that bouncing infants to an unaccented rhythm influenced infants' perceptual preferences for accented rhythms that matched the rate of bouncing. In the current study, we ask whether nascent, diffuse coupling between auditory and motor systems is sufficient to bootstrap short-term Hebbian plasticity in the auditory system and explain infants' preferences for accented rhythms thought to arise from auditory-vestibular interactions. First, we specify a nonlinear, dynamical system in which two oscillatory neural networks, representing developmentally nascent auditory and motor systems, interact through weak, non-specific coupling. The auditory network was equipped with short-term Hebbian plasticity, allowing the auditory network to tune its intrinsic resonant properties. Next, we simulate the effect of vestibular input (e.g., infant bouncing) on infants' perceptual preferences for accented rhythms. We found that simultaneous auditory-vestibular training shaped the model's response to musical rhythm, enhancing vestibular-related frequencies in auditory-network activity. Moreover, simultaneous auditory-vestibular training, relative to auditory- or vestibular-only training, facilitated short-term auditory plasticity in the model, producing stronger oscillator connections in the auditory network. Finally, when tested on a musical rhythm, models which received simultaneous auditory-vestibular training, but not models that received auditory- or vestibular-only training, resonated strongly at frequencies related to their "bouncing," a finding qualitatively similar to infants' preferences for accented rhythms that matched the rate of infant bouncing.
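The core dynamical idea here, weak coupling pulling an oscillator's frequency toward a rhythmic input, can be illustrated far more simply than the paper's plastic oscillator networks: a single Kuramoto-style phase oscillator driven by a "vestibular" forcing rhythm. This is a deliberately reduced sketch, not the authors' model; the coupling strength and all names are invented.

```python
import numpy as np

def entrain(f_natural, f_force, eps=0.8, dt=0.001, dur=60.0):
    """Simulate a forced phase oscillator; return its mean frequency over the last 10 s."""
    n = int(dur / dt)
    theta = 0.0
    phases = np.empty(n)
    for i in range(n):
        force_phase = 2 * np.pi * f_force * i * dt
        # Kuramoto-style dynamics: intrinsic frequency plus sinusoidal coupling
        # to the forcing phase (Euler integration).
        theta += dt * (2 * np.pi * f_natural + eps * np.sin(force_phase - theta))
        phases[i] = theta
    tail = phases[-int(10.0 / dt):]
    return (tail[-1] - tail[0]) / (2 * np.pi * 10.0)

# An oscillator with a 2.0 Hz natural frequency, weakly forced at 2.1 Hz,
# phase-locks and adopts the forcing frequency.
print(round(entrain(f_natural=2.0, f_force=2.1), 2))  # 2.1
```

Phase-locking occurs here because the coupling strength exceeds the frequency detuning (eps = 0.8 > 2π · 0.1 ≈ 0.63 rad/s); with weaker coupling or larger detuning the oscillator would drift instead, which is the qualitative boundary such models use to explain when movement does or does not reshape auditory responses.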
Affiliation(s)
- Parker Tichko
- Department of Music, Northeastern University, Boston, MA, USA
- Ji Chul Kim
- Department of Psychological Sciences, Perception, Action, Cognition (PAC) Division, University of Connecticut, Storrs, CT, USA
- Edward W Large
- Department of Psychological Sciences, Perception, Action, Cognition (PAC) Division, University of Connecticut, Storrs, CT, USA
- Department of Psychological Sciences, Center for the Ecological Study of Perception & Action (CESPA), University of Connecticut, Storrs, CT, USA
- Department of Physics, University of Connecticut, Storrs, CT, USA
18
Bouvet CJ, Bardy BG, Keller PE, Dalla Bella S, Nozaradan S, Varlet M. Accent-induced Modulation of Neural and Movement Patterns during Spontaneous Synchronization to Auditory Rhythms. J Cogn Neurosci 2020; 32:2260-2271. [DOI: 10.1162/jocn_a_01605]
Abstract
Human rhythmic movements spontaneously synchronize with auditory rhythms at various frequency ratios. The emergence of more complex relationships—for instance, frequency ratios of 1:2 and 1:3—is enhanced by adding a congruent accentuation pattern (binary for 1:2 and ternary for 1:3), resulting in a 1:1 movement–accentuation relationship. However, this benefit of accentuation on movement synchronization appears to be stronger for the ternary pattern than for the binary pattern. Here, we investigated whether this difference in accent-induced movement synchronization may be related to a difference in the neural tracking of these accentuation profiles. Accented and control unaccented auditory sequences were presented to participants who concurrently produced finger taps at their preferred frequency, and spontaneous movement synchronization was measured. EEG was recorded during passive listening to each auditory sequence. The results revealed that enhanced movement synchronization with ternary accentuation was accompanied by enhanced neural tracking of this pattern. Larger EEG responses at the accentuation frequency were found for the ternary pattern compared with the binary pattern. Moreover, the amplitude of accent-induced EEG responses was positively correlated with the magnitude of accent-induced movement synchronization across participants. Altogether, these findings show that the dynamics of spontaneous auditory–motor synchronization is strongly driven by the multi-time-scale sensory processing of auditory rhythms, highlighting the importance of considering neural responses to rhythmic sequences for understanding and enhancing synchronization performance.
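The accentuation patterns described above are straightforward to express: in an isochronous tone sequence, the first tone of every group of two (binary) or three (ternary) receives an intensity accent, so moving once per accent yields a 1:1 movement-accent relationship at the 1:2 or 1:3 movement-tone ratio. A hypothetical sketch; the accent gain and function name are invented values, not stimulus parameters from the study.

```python
import numpy as np

def accent_profile(n_tones, group, accent_gain=1.5):
    """Per-tone amplitude: the first tone of every group of `group` tones is accented."""
    amps = np.ones(n_tones)
    amps[::group] *= accent_gain  # accent falls on tones 0, group, 2*group, ...
    return amps

print(accent_profile(6, 2).tolist())  # [1.5, 1.0, 1.5, 1.0, 1.5, 1.0] (binary)
print(accent_profile(6, 3).tolist())  # [1.5, 1.0, 1.0, 1.5, 1.0, 1.0] (ternary)
```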
Affiliation(s)
- Simone Dalla Bella
- Université Montpellier
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
- University of Montreal
- University of Economics and Human Sciences in Warsaw
- Sylvie Nozaradan
- Western Sydney University
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
- Université Catholique de Louvain
19
Mathias B, Zamm A, Gianferrara PG, Ross B, Palmer C. Rhythm Complexity Modulates Behavioral and Neural Dynamics During Auditory–Motor Synchronization. J Cogn Neurosci 2020; 32:1864-1880. [DOI: 10.1162/jocn_a_01601]
Abstract
We addressed how rhythm complexity influences auditory–motor synchronization in musically trained individuals who perceived and produced complex rhythms while EEG was recorded. Participants first listened to two-part auditory sequences (Listen condition). Each part featured a single pitch presented at a fixed rate; the integer ratio formed between the two rates varied in rhythmic complexity from low (1:1) to moderate (1:2) to high (3:2). One of the two parts occurred at a constant rate across conditions. Then, participants heard the same rhythms as they synchronized their tapping at a fixed rate (Synchronize condition). Finally, they tapped at the same fixed rate (Motor condition). Auditory feedback from their taps was present in all conditions. Behavioral effects of rhythmic complexity were evidenced in all tasks; detection of missing beats (Listen) worsened in the most complex (3:2) rhythm condition, and tap durations (Synchronize) were most variable and least synchronous with stimulus onsets in the 3:2 condition. EEG power spectral density was lowest at the fixed rate during the 3:2 rhythm and greatest during the 1:1 rhythm (Listen and Synchronize). ERP amplitudes corresponding to an N1 time window were smallest for the 3:2 rhythm and greatest for the 1:1 rhythm (Listen). Finally, synchronization accuracy (Synchronize) decreased as amplitudes in the N1 time window became more positive during the high rhythmic complexity condition (3:2). Thus, measures of neural entrainment corresponded to synchronization accuracy, and rhythmic complexity modulated the behavioral and neural measures similarly.
Affiliation(s)
- Brian Mathias
- McGill University
- Max Planck Institute for Human Cognitive and Brain Sciences
- Anna Zamm
- McGill University
- Central European University, Budapest, Hungary
20
Lenc T, Keller PE, Varlet M, Nozaradan S. Neural and Behavioral Evidence for Frequency-Selective Context Effects in Rhythm Processing in Humans. Cereb Cortex Commun 2020; 1:tgaa037. [PMID: 34296106] [PMCID: PMC8152888] [DOI: 10.1093/texcom/tgaa037]
Abstract
When listening to music, people often perceive and move along with a periodic meter. However, the dynamics of mapping between meter perception and the acoustic cues to meter periodicities in the sensory input remain largely unknown. To capture these dynamics, we recorded electroencephalography while nonmusician and musician participants listened to nonrepeating rhythmic sequences, where acoustic cues to meter frequencies either gradually decreased (from regular to degraded) or increased (from degraded to regular). The results revealed greater neural activity selectively elicited at meter frequencies when the sequence gradually changed from regular to degraded compared with the opposite. Importantly, this effect was unlikely to arise from overall gain, or low-level auditory processing, as revealed by physiological modeling. Moreover, the context effect was more pronounced in nonmusicians, who also demonstrated facilitated sensory-motor synchronization with the meter for sequences that started as regular. In contrast, musicians showed weaker effects of recent context in their neural responses and robust ability to move along with the meter irrespective of stimulus degradation. Together, our results demonstrate that brain activity elicited by rhythm does not only reflect passive tracking of stimulus features, but represents continuous integration of sensory input with recent context.
Affiliation(s)
- Tomas Lenc
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Peter E Keller
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Manuel Varlet
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- School of Psychology, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Sylvie Nozaradan
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal QC H3C 3J7, Canada
21
Damm L, Varoqui D, De Cock VC, Dalla Bella S, Bardy B. Why do we move to the beat? A multi-scale approach, from physical principles to brain dynamics. Neurosci Biobehav Rev 2020; 112:553-84. [DOI: 10.1016/j.neubiorev.2019.12.024]
22
23
Abstract
For both musicians and music psychologists, beat rate (BPM) has often been regarded as a transparent measure of musical speed or tempo, yet recent research has shown that tempo is more than just BPM. In a previous study, London, Burger, Thompson, and Toiviainen (Acta Psychologica, 164, 70-80, 2016) presented participants with original as well as "time-stretched" versions of classic R&B songs; time stretching slows down or speeds up a recording without changing its pitch or timbre. In that study we discovered a tempo anchoring effect (TAE): Although relative tempo judgments (original vs. time-stretched versions of the same song) were correct, they were at odds with BPM rates of each stimulus. As previous studies have shown that synchronous movement enhances rhythm perception, we hypothesized that tapping along to the beat of these songs would reduce or eliminate the TAE and increase the salience of the beat rate of each stimulus. In the current study participants were presented with the London et al. (Acta Psychologica, 164, 70-80, 2016) stimuli in nonmovement and movement conditions. We found that although participants were able to make BPM-based tempo judgments of generic drumming patterns, and were able to tap along to the R&B stimuli at the correct beat rates, the TAE persisted in both movement and nonmovement conditions. Thus, contrary to our hypothesis that movement would reduce or eliminate the TAE, we found a disjunction between correctly synchronized motor behavior and tempo judgment. The implications of the tapping-TAE dissociation in the broader context of tempo and rhythm perception are discussed, and further approaches to studying the TAE-tapping dissociation are suggested.
Affiliation(s)
- Justin London
- Department of Music, Carleton College, Northfield, MN, 55057, USA
- Molly Hildreth
- Department of Music, Carleton College, Northfield, MN, 55057, USA
24
Schmidt-Kassow M, Thöne K, Kaiser J. Auditory-motor coupling affects phonetic encoding. Brain Res 2019; 1716:39-49. [PMID: 29191770] [DOI: 10.1016/j.brainres.2017.11.022]
Abstract
Recent studies have shown that moving in synchrony with auditory stimuli boosts attention allocation and verbal learning. Furthermore, rhythmic tones are processed more efficiently than temporally random tones ('timing effect'), and this effect is increased when participants actively synchronize their motor performance with the rhythm of the tones, resulting in auditory-motor synchronization. Here, we investigated whether this applies also to sequences of linguistic stimuli (syllables). We compared temporally irregular syllable sequences with two temporally regular conditions where either the interval between syllable onsets (stimulus onset asynchrony, SOA) or the interval between the syllables' vowel onsets was kept constant. Entrainment to the stimulus presentation frequency (1 Hz) and event-related potentials were assessed in 24 adults who were instructed to detect pre-defined deviant syllables while they either pedaled or sat still on a stationary exercise bike. We found larger 1 Hz entrainment and P300 amplitudes for the SOA presentation during motor activity. Furthermore, the magnitude of the P300 component correlated with the motor variability in the SOA condition and 1 Hz entrainment, while in turn 1 Hz entrainment correlated with auditory-motor synchronization performance. These findings demonstrate that acute auditory-motor coupling facilitates phonetic encoding.
Affiliation(s)
- Katharina Thöne
- Institute of Medical Psychology, Goethe University, Frankfurt, Germany
- Jochen Kaiser
- Institute of Medical Psychology, Goethe University, Frankfurt, Germany
25
Abstract
Infant development has rarely been informed by the behavior of infants with sensory differences despite increasing recognition that infant behavior itself creates sensory learning opportunities. The purpose of this study of object exploration was to compare the behavior of hearing and deaf infants, with and without cochlear implants, in order to identify the effects of profound sensorineural hearing loss on infant exploration before cochlear implantation, the behavioral effects of access to auditory feedback after cochlear implantation, and the sensory motivation for exploration behaviors performed by hearing infants as well. The results showed that 9-month-old deaf infants explored objects as often as hearing infants but they used systematically different approaches and less variation before compared to after cochlear implantation. Potential associations between these early experiences and later learning are discussed in the context of embodied developmental theory, comparative studies, and research with adults. The data call for increased recognition of the active sensorimotor nature of infant learning and future research that investigates differences in sensorimotor experience as potential mechanisms in later learning and sequential memory development.
Affiliation(s)
- Mary K Fagan
- Department of Communication Sciences and Disorders, Chapman University
26
Chemin B, Huang G, Mulders D, Mouraux A. EEG time-warping to study non-strictly-periodic EEG signals related to the production of rhythmic movements. J Neurosci Methods 2018; 308:106-115. [PMID: 30053483] [DOI: 10.1016/j.jneumeth.2018.07.016]
Abstract
BACKGROUND Many sensorimotor functions are intrinsically rhythmic, and are underlined by neural processes that are functionally distinct from neural responses related to the processing of transient events. EEG frequency tagging is a technique that is increasingly used in neuroscience to study these processes. It relies on the fact that perceiving and/or producing rhythms generates periodic neural activity that translates into periodic variations of the EEG signal. In the EEG spectrum, those variations appear as peaks localized at the frequency of the rhythm and its harmonics. NEW METHOD Many natural rhythms, such as music or dance, are not strictly periodic and, instead, show fluctuations of their period over time. Here, we introduce a time-warping method to identify non-strictly-periodic EEG activities in the frequency domain. RESULTS EEG time-warping can be used to characterize the sensorimotor activity related to the performance of self-paced rhythmic finger movements. Furthermore, the EEG time-warping method can disentangle auditory- and movement-related EEG activity produced when participants perform rhythmic movements synchronized to an acoustic rhythm. This is possible because the movement-related activity has different period fluctuations than the auditory-related activity. COMPARISON WITH EXISTING METHODS With the classic frequency-tagging approach, rhythm fluctuations result in a spreading of the peaks to neighboring frequencies, to the point that they cannot be distinguished from background noise. CONCLUSIONS The proposed time-warping procedure is a simple and effective means to study natural non-strictly-periodic rhythmic neural processes such as rhythmic movement production, acoustic rhythm perception and sensorimotor synchronization.
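The NEW METHOD paragraph can be illustrated schematically: given the event onsets that delimit each cycle, every inter-onset segment is resampled to a common length, so the warped signal becomes strictly periodic and its spectral energy concentrates at the (now constant) cycle frequency instead of spreading to neighboring bins. The sketch below is a conceptual reconstruction from the abstract, not the published procedure; all names and the toy signal are invented.

```python
import numpy as np

def time_warp(signal, onsets_samples, cycle_len):
    """Resample each inter-onset segment to cycle_len samples and concatenate."""
    warped = []
    for a, b in zip(onsets_samples[:-1], onsets_samples[1:]):
        seg = signal[a:b]
        x_old = np.linspace(0.0, 1.0, len(seg))
        x_new = np.linspace(0.0, 1.0, cycle_len)
        warped.append(np.interp(x_new, x_old, seg))  # linear resampling
    return np.concatenate(warped)

# Toy "movement" signal: one bump per cycle, with jittered cycle durations
# (period fluctuations of up to +/-15% around 1 s).
rng = np.random.default_rng(1)
fs = 200
durations = (fs * (1.0 + rng.uniform(-0.15, 0.15, 40))).astype(int)
onsets = np.concatenate([[0], np.cumsum(durations)])
signal = np.concatenate([np.sin(np.pi * np.arange(d) / d) for d in durations])

# After warping, every cycle spans exactly fs samples (1 s), so the spectrum
# shows a sharp peak at 1 Hz (the DC bin is skipped when locating the peak).
warped = time_warp(signal, onsets, cycle_len=fs)
amp = np.abs(np.fft.rfft(warped)) / len(warped)
freqs = np.fft.rfftfreq(len(warped), d=1.0 / fs)
peak = freqs[np.argmax(amp[1:]) + 1]
print(round(peak, 3))  # 1.0
```

Running the same spectral analysis on the unwarped signal would smear the energy across bins around 1 Hz, which is exactly the limitation of classic frequency tagging that the warping step addresses.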
Affiliation(s)
- B Chemin: Institute of NeuroScience (IoNS), System and Cognition Department, Université catholique de Louvain, Belgium; International Laboratory for Brain, Music and Sound Research (BRAMS), Université de Montréal, Canada
- G Huang: Institute of NeuroScience (IoNS), System and Cognition Department, Université catholique de Louvain, Belgium; School of Mobile Information Engineering, Sun Yat-Sen University, China
- D Mulders: Institute of NeuroScience (IoNS), System and Cognition Department, Université catholique de Louvain, Belgium
- A Mouraux: Institute of NeuroScience (IoNS), System and Cognition Department, Université catholique de Louvain, Belgium

27
Abstract
Bass sounds play a special role in conveying the rhythm and stimulating motor entrainment to the beat of music. However, the biological roots of this culturally widespread musical practice remain mysterious, despite its fundamental relevance in the sciences and arts, and also for music-assisted clinical rehabilitation of motor disorders. Here, we show that this musical convention may exploit a neurophysiological mechanism whereby low-frequency sounds shape neural representations of rhythmic input at the cortical level by boosting selective neural locking to the beat, thus explaining the privileged role of bass sounds in driving people to move along with the musical beat.
Music makes us move, and using bass instruments to build the rhythmic foundations of music is especially effective at inducing people to dance to periodic pulse-like beats. Here, we show that this culturally widespread practice may exploit a neurophysiological mechanism whereby low-frequency sounds shape the neural representations of rhythmic input by boosting selective locking to the beat. Cortical activity was captured using electroencephalography (EEG) while participants listened to a regular rhythm or to a relatively complex syncopated rhythm conveyed either by low tones (130 Hz) or high tones (1236.8 Hz). We found that cortical activity at the frequency of the perceived beat is selectively enhanced compared with other frequencies in the EEG spectrum when rhythms are conveyed by bass sounds. This effect is unlikely to arise from early cochlear processes, as revealed by auditory physiological modeling, and was particularly pronounced for the complex rhythm requiring endogenous generation of the beat. The effect is likewise not attributable to differences in perceived loudness between low and high tones, as a control experiment manipulating sound intensity alone did not yield similar results.
Finally, the privileged role of bass sounds is contingent on allocation of attentional resources to the temporal properties of the stimulus, as revealed by a further control experiment examining the role of a behavioral task. Together, our results provide a neurobiological basis for the convention of using bass instruments to carry the rhythmic foundations of music and to drive people to move to the beat.
28
Abstract
Studies in the literature have provided conflicting evidence about the effects of background noise or music on concurrent cognitive tasks. Some studies have shown a detrimental effect, while others have shown a beneficial effect of background auditory stimuli. The aim of this study was to investigate the influence of agitating, happy or touching music, as opposed to environmental sounds or silence, on the ability of non-musician subjects to perform arithmetic operations. Fifty university students (25 women and 25 men, 25 introverts and 25 extroverts) volunteered for the study. The participants were administered 180 easy or difficult arithmetic operations (division, multiplication, subtraction and addition) while listening to heavy rain sounds, silence or classical music. Silence was detrimental when participants were faced with difficult arithmetic operations, as it was associated with significantly worse accuracy and slower response times (RTs) than music or rain sound conditions. This finding suggests that the benefit of background stimulation was not music-specific but possibly due to an enhanced cerebral alertness level induced by the auditory stimulation. Introverts were always faster than extroverts in solving mathematical problems, except when the latter performed calculations accompanied by the sound of heavy rain, a condition that made them as fast as introverts. While the background auditory stimuli had no effect on the arithmetic ability of either group in the easy condition, it strongly affected extroverts in the difficult condition, with RTs being faster during agitating or joyful music as well as rain sounds, compared to the silent condition. For introverts, agitating music was associated with faster response times than the silent condition. This group difference may be explained on the basis of the notion that introverts have a generally higher arousal level compared to extroverts and would therefore benefit less from the background auditory stimuli.
Affiliation(s)
- Alice Mado Proverbio: Neuro-Mi Center for Neuroscience, Dept. of Psychology, University of Milano-Bicocca, Milan, Italy
- Francesco De Benedetto: Neuro-Mi Center for Neuroscience, Dept. of Psychology, University of Milano-Bicocca, Milan, Italy
- Maria Vittoria Ferrari: Neuro-Mi Center for Neuroscience, Dept. of Psychology, University of Milano-Bicocca, Milan, Italy
- Giorgia Ferrarini: Neuro-Mi Center for Neuroscience, Dept. of Psychology, University of Milano-Bicocca, Milan, Italy

29
Nozaradan S, Schönwiesner M, Keller PE, Lenc T, Lehmann A. Neural bases of rhythmic entrainment in humans: critical transformation between cortical and lower-level representations of auditory rhythm. Eur J Neurosci 2018; 47:321-332. [PMID: 29356161] [DOI: 10.1111/ejn.13826]
Abstract
The spontaneous ability to entrain to meter periodicities is central to music perception and production across cultures. There is increasing evidence that this ability involves selective neural responses to meter-related frequencies. This phenomenon has been observed in the human auditory cortex, yet it could be the product of evolutionarily older lower-level properties of brainstem auditory neurons, as suggested by recent recordings from rodent midbrain. We addressed this question by taking advantage of a new method to simultaneously record human EEG activity originating from cortical and lower-level sources, in the form of slow (< 20 Hz) and fast (> 150 Hz) responses to auditory rhythms. Cortical responses showed increased amplitudes at meter-related frequencies compared to meter-unrelated frequencies, regardless of the prominence of the meter-related frequencies in the modulation spectrum of the rhythmic inputs. In contrast, frequency-following responses showed increased amplitudes at meter-related frequencies only in rhythms with prominent meter-related frequencies in the input but not for a more complex rhythm requiring more endogenous generation of the meter. This interaction with rhythm complexity suggests that the selective enhancement of meter-related frequencies does not fully rely on subcortical auditory properties, but is critically shaped at the cortical level, possibly through functional connections between the auditory cortex and other, movement-related, brain structures. This process of temporal selection would thus enable endogenous and motor entrainment to emerge in humans with substantial flexibility and invariance with respect to the rhythmic input, in contrast with non-human animals.
Affiliation(s)
- Sylvie Nozaradan: MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, Sydney, NSW 2751, Australia; Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), Louvain, Belgium; International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada
- Marc Schönwiesner: International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; Center for Research on Brain, Language and Music (CRBLM), Montreal, QC, Canada; Faculty of Psychology, Université de Montréal, Montreal, QC, Canada
- Peter E Keller: MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, Sydney, NSW 2751, Australia
- Tomas Lenc: MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, Sydney, NSW 2751, Australia
- Alexandre Lehmann: International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; Center for Research on Brain, Language and Music (CRBLM), Montreal, QC, Canada; Faculty of Psychology, Université de Montréal, Montreal, QC, Canada; Otolaryngology Department, Faculty of Medicine, McGill University Hospital, Montreal, QC, Canada

30
Affiliation(s)
- Daniel J. Levitin: Department of Psychology, McGill University, Montreal, QC H3A 1G1, Canada
- Jessica A. Grahn: Department of Psychology and Brain and Mind Institute, Western University, London, Ontario N6A 5B7, Canada
- Justin London: Departments of Music and Cognitive Science, Carleton College, Northfield, Minnesota 55057

31
Nozaradan S, Keller PE, Rossion B, Mouraux A. EEG Frequency-Tagging and Input-Output Comparison in Rhythm Perception. Brain Topogr 2017; 31:153-160. [PMID: 29127530] [DOI: 10.1007/s10548-017-0605-8]
Abstract
The combination of frequency-tagging with electroencephalography (EEG) has recently proved fruitful for understanding the perception of beat and meter in musical rhythm, a common behavior shared by humans of all cultures. EEG frequency-tagging allows the objective measurement of input-output transforms to investigate beat perception, its modulation by exogenous and endogenous factors, development, and neural basis. Recent doubt has been raised about the validity of comparing frequency-domain representations of auditory rhythmic stimuli and corresponding EEG responses, on the grounds that the comparison assumes a one-to-one mapping between the envelope of the rhythmic input and the neural output, and that it neglects the sensitivity of frequency-domain representations to the acoustic features making up the rhythms. Here we argue that these elements actually reinforce the strengths of the approach. The obvious fact that acoustic features influence the frequency spectrum of the sound envelope precisely justifies taking into consideration the sounds used to generate a beat percept for interpreting neural responses to auditory rhythms. Most importantly, the many-to-one relationship between rhythmic input and perceived beat actually validates an approach that objectively measures the input-output transforms underlying the perceptual categorization of rhythmic inputs. Hence, provided that a number of potential pitfalls and fallacies are avoided, EEG frequency-tagging to study input-output relationships appears valuable for understanding rhythm perception.
Affiliation(s)
- Sylvie Nozaradan: MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, NSW 2751, Australia; Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), Brussels, Belgium; International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada
- Peter E Keller: MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, NSW, Australia
- Bruno Rossion: Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), Brussels, Belgium; Neurology Unit, Centre Hospitalier Régional Universitaire (CHRU) de Nancy, Nancy, France
- André Mouraux: Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), Brussels, Belgium

32
Nozaradan S, Schwartze M, Obermeier C, Kotz SA. Specific contributions of basal ganglia and cerebellum to the neural tracking of rhythm. Cortex 2017; 95:156-168. [DOI: 10.1016/j.cortex.2017.08.015]
33
Falk S, Volpi-Moncorger C, Dalla Bella S. Auditory-Motor Rhythms and Speech Processing in French and German Listeners. Front Psychol 2017; 8:395. [PMID: 28443036] [PMCID: PMC5387104] [DOI: 10.3389/fpsyg.2017.00395]
Abstract
Moving to a speech rhythm can enhance verbal processing in the listener by increasing temporal expectancies (Falk and Dalla Bella, 2016). Here we tested whether this hypothesis holds for prosodically diverse languages such as German (a lexical stress-language) and French (a non-stress language). Moreover, we examined the relation between motor performance and the benefits for verbal processing as a function of language. Sixty-four participants, 32 German and 32 French native speakers, detected subtle word changes in accented positions in metrically structured sentences to which they previously tapped with their index finger. Before each sentence, they were cued by a metronome to tap either congruently (i.e., to accented syllables) or incongruently (i.e., to non-accented parts) to the following speech stimulus. Both French and German speakers detected word changes better when cued to tap congruently than when cued to tap incongruently. Detection performance was predicted by participants' motor performance in the non-verbal cueing phase. Moreover, tapping rate while participants tapped to speech predicted detection differently for the two language groups, in particular in the incongruent tapping condition. We discuss our findings in light of the rhythmic differences of both languages and with respect to recent theories of expectancy-driven and multisensory speech processing.
Affiliation(s)
- Simone Falk: Institut für Deutsche Philologie, Ludwig-Maximilians-University, Munich, Germany; Laboratoire Parole et Langage, UMR 7309, CNRS, Aix-Marseille University, Aix-en-Provence, France; Laboratoire Phonétique et Phonologie, UMR 7018, CNRS, Université Sorbonne Nouvelle Paris-3, Paris, France
- Chloé Volpi-Moncorger: Laboratoire Parole et Langage, UMR 7309, CNRS, Aix-Marseille University, Aix-en-Provence, France
- Simone Dalla Bella: EuroMov, University of Montpellier, Montpellier, France; Institut Universitaire de France, Paris, France; International Laboratory for Brain, Music, and Sound Research, Montreal, QC, Canada; Department of Cognitive Psychology, Wyższa Szkoła Finansów i Zarządzania w Warszawie (WSFiZ), Warsaw, Poland

34
Abstract
Musical rhythm positively impacts on subsequent speech processing. However, the neural mechanisms underlying this phenomenon are so far unclear. We investigated whether carryover effects from a preceding musical cue to a speech stimulus result from a continuation of neural phase entrainment to periodicities that are present in both music and speech. Participants listened and memorized French metrical sentences that contained (quasi-)periodic recurrences of accents and syllables. Speech stimuli were preceded by a rhythmically regular or irregular musical cue. Our results show that the presence of a regular cue modulates neural response as estimated by EEG power spectral density, intertrial coherence, and source analyses at critical frequencies during speech processing compared with the irregular condition. Importantly, intertrial coherences for regular cues were indicative of the participants' success in memorizing the subsequent speech stimuli. These findings underscore the highly adaptive nature of neural phase entrainment across fundamentally different auditory stimuli. They also support current models of neural phase entrainment as a tool of predictive timing and attentional selection across cognitive domains.
Affiliation(s)
- Simone Falk: Aix-Marseille Univ, LPL, UMR 7309, CNRS, Aix-en-Provence, France; Université Sorbonne Nouvelle Paris-3, LPP, UMR 7018, CNRS, Paris, France; Ludwig-Maximilians-University, Munich, Germany

35
Nozaradan S, Mouraux A, Cousineau M. Frequency tagging to track the neural processing of contrast in fast, continuous sound sequences. J Neurophysiol 2017; 118:243-253. [PMID: 28381494] [DOI: 10.1152/jn.00971.2016]
Abstract
The human auditory system presents a remarkable ability to detect rapid changes in fast, continuous acoustic sequences, as best illustrated in speech and music. However, the neural processing of rapid auditory contrast remains largely unclear, probably due to the lack of methods to objectively dissociate the response components specifically related to the contrast from the other components in response to the sequence of fast continuous sounds. To overcome this issue, we tested a novel use of the frequency-tagging approach allowing contrast-specific neural responses to be tracked based on their expected frequencies. The EEG was recorded while participants listened to 40-s sequences of sounds presented at 8 Hz. A tone or interaural time contrast was embedded every fifth sound (AAAAB), such that a response observed in the EEG at exactly 8 Hz/5 (i.e., 1.6 Hz) or its harmonics should be the signature of contrast processing by neural populations. Contrast-related responses were successfully identified, even in the case of very fine contrasts. Moreover, analysis of the time course of the responses revealed a stable amplitude over repetitions of the AAAAB patterns in the sequence, except for the response to perceptually salient contrasts, which showed a buildup and decay across repetitions of the sounds. Overall, this new combination of frequency-tagging with an oddball design provides a valuable complement to the classic transient evoked-potentials approach, especially in the context of rapid auditory information. Specifically, we provide objective evidence on the neural processing of contrast embedded in fast, continuous sound sequences.NEW & NOTEWORTHY Recent theories suggest that the basis of neurodevelopmental auditory disorders such as dyslexia might be an impaired processing of fast auditory changes, highlighting how the encoding of rapid acoustic information is critical for auditory communication.
Here, we present a novel electrophysiological approach to capture neural markers of contrast in fast, continuous tone sequences in humans. Contrast-specific responses were successfully identified, even for very fine contrasts, providing direct insight into the encoding of rapid auditory information.
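The arithmetic of this oddball frequency-tagging design can be sketched as follows. This is a hypothetical simulation (not the study's analysis code, and the response amplitudes are invented) showing why a contrast on every fifth sound of an 8 Hz sequence is expected to surface at 8/5 = 1.6 Hz:

```python
import numpy as np

# Hypothetical sketch of the AAAAB oddball logic (amplitudes assumed):
# sounds at 8 Hz, with an extra contrast-specific response to every
# fifth sound (B), should add energy at 8 Hz / 5 = 1.6 Hz.
fs = 1000.0
dur = 40.0                                   # one 40-s sequence
n = int(fs * dur)

response = np.zeros(n)
base_idx = (np.arange(0, dur, 1 / 8.0) * fs).astype(int)   # every sound, 8 Hz
odd_idx = (np.arange(0, dur, 5 / 8.0) * fs).astype(int)    # every fifth sound (B)
response[base_idx] += 1.0          # response common to all sounds
response[odd_idx] += 0.5           # extra, contrast-specific response to B

freqs = np.fft.rfftfreq(n, 1 / fs)
amp = np.abs(np.fft.rfft(response)) / n

def amp_at(f):
    """Amplitude at the FFT bin closest to frequency f (Hz)."""
    return amp[np.argmin(np.abs(freqs - f))]

# Contrast-related energy appears at 1.6 Hz; a frequency unrelated to
# either periodicity (e.g., 1.2 Hz) stays at the noise floor.
print(amp_at(1.6) > 10 * amp_at(1.2))        # → True
```

Removing the `odd_idx` line leaves energy only at 8 Hz and its harmonics, which is the control comparison the frequency-tagging readout relies on.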
Affiliation(s)
- Sylvie Nozaradan: Institute of Neuroscience, Université Catholique de Louvain, Brussels, Belgium; MARCS Institute for Brain, Behavior, and Development, Sydney, Australia; International Laboratory for Brain, Music, and Sound Research (BRAMS), Montreal, Quebec, Canada
- André Mouraux: Institute of Neuroscience, Université Catholique de Louvain, Brussels, Belgium
- Marion Cousineau: International Laboratory for Brain, Music, and Sound Research (BRAMS), Montreal, Quebec, Canada

36
Henry MJ, Herrmann B, Grahn JA. What can we learn about beat perception by comparing brain signals and stimulus envelopes? PLoS One 2017; 12:e0172454. [PMID: 28225796] [PMCID: PMC5321456] [DOI: 10.1371/journal.pone.0172454]
Abstract
Entrainment of neural oscillations on multiple time scales is important for the perception of speech. The perception of musical rhythms, and in particular of a regular beat within them, is also likely to rely on entrainment of neural oscillations. One recently proposed approach to studying beat perception in the context of neural entrainment and resonance (the "frequency-tagging" approach) has received an enthusiastic response from the scientific community. A specific version of the approach involves comparing frequency-domain representations of acoustic rhythm stimuli to the frequency-domain representations of neural responses to those rhythms (measured by electroencephalography, EEG). The relative amplitudes at specific EEG frequencies are compared to the relative amplitudes at the same stimulus frequencies, and enhancements at beat-related frequencies in the EEG signal are interpreted as reflecting an internal representation of the beat. Here, we show that frequency-domain representations of rhythms are sensitive to the acoustic features of the tones making up the rhythms (tone duration, onset/offset ramp duration); in fact, relative amplitudes at beat-related frequencies can be completely reversed by manipulating tone acoustics. Crucially, we show that changes to these acoustic tone features, and in turn changes to the frequency-domain representations of rhythms, do not affect beat perception. Instead, beat perception depends on the pattern of onsets (i.e., whether a rhythm has a simple or complex metrical structure). Moreover, we show that beat perception can differ for rhythms that have numerically identical frequency-domain representations. Thus, frequency-domain representations of rhythms are dissociable from beat perception. For this reason, we suggest caution in interpreting direct comparisons of rhythms and brain signals in the frequency domain.
Instead, we suggest that combining EEG measurements of neural signals with creative behavioral paradigms is of more benefit to our understanding of beat perception.
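The paper's central point, that the envelope spectrum of a rhythm depends on tone acoustics, can be illustrated with a toy simulation. The onset pattern and tone durations below are hypothetical choices for illustration, not the rhythms used in the study:

```python
import numpy as np

# Hypothetical illustration (not the authors' analysis): the same
# onset pattern rendered with short vs. long tones yields different
# relative amplitudes at the very same envelope frequencies.
fs = 1000.0
cycle = 1.6                                  # rhythm cycle length (s), assumed
n_cycles = 8
onsets = np.array([0.0, 0.4, 0.6, 1.2])      # assumed onset pattern within a cycle (s)
n = int(fs * cycle * n_cycles)

def envelope(tone_dur):
    """Rectangular amplitude envelope for the looped onset pattern."""
    env = np.zeros(n)
    for c in range(n_cycles):
        for on in onsets:
            start = int(round((c * cycle + on) * fs))
            env[start:start + int(round(tone_dur * fs))] = 1.0
    return env

freqs = np.fft.rfftfreq(n, 1 / fs)
spec_short = np.abs(np.fft.rfft(envelope(0.1))) / n   # 100-ms tones
spec_long = np.abs(np.fft.rfft(envelope(0.3))) / n    # 300-ms tones

i1 = np.argmin(np.abs(freqs - 1 / cycle))    # cycle frequency, 0.625 Hz
i3 = np.argmin(np.abs(freqs - 3 / cycle))    # its third harmonic, 1.875 Hz
ratio_short = spec_short[i1] / spec_short[i3]
ratio_long = spec_long[i1] / spec_long[i3]
print(ratio_short, ratio_long)               # the two ratios differ noticeably
```

Since the onset pattern, and hence the perceived beat, is identical in both renderings, the change in relative spectral amplitudes mirrors the paper's argument that envelope spectra and beat percepts can dissociate.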
Affiliation(s)
- Molly J. Henry: Brain and Mind Institute, Department of Psychology, The University of Western Ontario, London, ON, Canada
- Björn Herrmann: Brain and Mind Institute, Department of Psychology, The University of Western Ontario, London, ON, Canada
- Jessica A. Grahn: Brain and Mind Institute, Department of Psychology, The University of Western Ontario, London, ON, Canada

37
Nozaradan S, Mouraux A, Jonas J, Colnat-Coulbois S, Rossion B, Maillard L. Intracerebral evidence of rhythm transform in the human auditory cortex. Brain Struct Funct 2016; 222:2389-2404. [PMID: 27990557] [DOI: 10.1007/s00429-016-1348-0]
Abstract
Musical entrainment is shared by all human cultures and the perception of a periodic beat is a cornerstone of this entrainment behavior. Here, we investigated whether beat perception might have its roots in the earliest stages of auditory cortical processing. Local field potentials were recorded from 8 patients implanted with depth electrodes in Heschl's gyrus and the planum temporale (55 recording sites in total), usually considered the human primary and secondary auditory cortices. Using a frequency-tagging approach, we show that both low-frequency (<30 Hz) and high-frequency (>30 Hz) neural activities in these structures faithfully track auditory rhythms through frequency-locking to the rhythm envelope. A selective gain in amplitude of the response frequency-locked to the beat frequency was observed for the low-frequency activities but not for the high-frequency activities, and was sharper in the planum temporale, especially for the more challenging syncopated rhythm. Hence, this gain process is not systematic in all activities produced in these areas and depends on the complexity of the rhythmic input. Moreover, this gain was disrupted when the rhythm was presented at a fast speed, revealing low-pass response properties which could account for the propensity to perceive a beat only within the musical tempo range. Together, these observations show that, even though part of these neural transforms of rhythms could already take place in subcortical auditory processes, the earliest auditory cortical processes shape the neural representation of rhythmic inputs in favor of the emergence of a periodic beat.
Affiliation(s)
- Sylvie Nozaradan: Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), 53 Avenue Mounier, 1200 Brussels, Belgium; The MARCS Institute, Western Sydney University, Sydney, NSW, Australia; International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
- André Mouraux: Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), 53 Avenue Mounier, 1200 Brussels, Belgium
- Jacques Jonas: Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), Brussels, Belgium; Service de Neurologie, Centre Hospitalier Universitaire de Nancy, Nancy, France; CRAN UMR 7039, CNRS, Université de Lorraine, Nancy, France
- Sophie Colnat-Coulbois: Neurosurgery Department, Centre Hospitalier Universitaire de Nancy, Nancy, France
- Bruno Rossion: Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), Brussels, Belgium; Service de Neurologie, Centre Hospitalier Universitaire de Nancy, Nancy, France; Psychological Sciences Research Institute, Université catholique de Louvain (UCL), Louvain-la-Neuve, Belgium
- Louis Maillard: Service de Neurologie, Centre Hospitalier Universitaire de Nancy, Nancy, France; CRAN UMR 7039, CNRS, Université de Lorraine, Nancy, France

38
Manning FC, Harris J, Schutz M. Temporal prediction abilities are mediated by motor effector and rhythmic expertise. Exp Brain Res 2016; 235:861-871. [DOI: 10.1007/s00221-016-4845-8]
39
Schmidt-Kassow M, Wilkinson D, Denby E, Ferguson H. Synchronised vestibular signals increase the P300 event-related potential elicited by auditory oddballs. Brain Res 2016; 1648:224-231. [DOI: 10.1016/j.brainres.2016.07.019]
40
Stupacher J, Witte M, Hove MJ, Wood G. Neural Entrainment in Drum Rhythms with Silent Breaks: Evidence from Steady-state Evoked and Event-related Potentials. J Cogn Neurosci 2016; 28:1865-1877. [PMID: 27458750] [DOI: 10.1162/jocn_a_01013]
Abstract
The fusion of rhythm, beat perception, and movement is often summarized under the term "entrainment" and becomes obvious when we effortlessly tap our feet or snap our fingers to the pulse of music. Entrainment to music involves a large network of brain structures, and neural oscillations at beat-related frequencies can help elucidate how this network is connected. Here, we used EEG to investigate steady-state evoked potentials (SSEPs) and event-related potentials (ERPs) during listening and tapping to drum clips with different rhythmic structures that were interrupted by silent breaks of 2-6 sec. This design allowed us to address the question of whether neural entrainment processes persist after the physical presence of musical rhythms and to link neural oscillations and event-related neural responses. During stimulus presentation, SSEPs were elicited in both tasks (listening and tapping). During silent breaks, SSEPs were only present in the tapping task. Notably, the amplitude of the N1 ERP component was more negative after longer silent breaks, and both N1 and SSEP results indicate that neural entrainment was increased when listening to drum rhythms compared with an isochronous metronome. Taken together, this suggests that neural entrainment to music is not solely driven by the physical input but involves endogenous timing processes. Our findings break ground for a tighter linkage between steady-state and transient evoked neural responses in rhythm processing. Beyond music perception, they further support the crucial role of entrained oscillatory activity in shaping sensory, motor, and cognitive processes in general.
41
Nozaradan S, Schönwiesner M, Caron-Desrochers L, Lehmann A. Enhanced brainstem and cortical encoding of sound during synchronized movement. Neuroimage 2016; 142:231-240. [PMID: 27397623] [DOI: 10.1016/j.neuroimage.2016.07.015]
Abstract
Movement to a steady beat has been widely studied as a model of alignment of motor outputs on sensory inputs. However, how the encoding of sensory inputs is shaped during synchronized movements along the sensory pathway remains unknown. To investigate this, we simultaneously recorded brainstem and cortical electroencephalographic activity while participants listened to periodic amplitude-modulated tones. Participants listened either without moving or while tapping in sync on every second beat. Cortical responses were identified at the envelope modulation rate (beat frequency), whereas brainstem responses were identified at the partials frequencies of the chord and at their modulation by the beat frequency (sidebands). During sensorimotor synchronization, cortical responses at the beat frequency were larger than during passive listening. Importantly, brainstem responses were also enhanced, with a selective amplification of the sidebands, in particular at the lower-pitched tone of the chord, and no significant correlation with electromyographic measures at the tapping frequency. These findings provide the first evidence of an online gain in the cortical and subcortical encoding of sounds during synchronized movement, selective to behavior-relevant sound features. Moreover, the frequency-tagging method to isolate concurrent brainstem and cortical activities even during actual movements appears promising for revealing coordinated processes along the human auditory pathway.
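The sideband logic behind the brainstem readout can be sketched in a few lines. This is a hypothetical illustration, not the study's stimuli or code; the carrier and modulation frequencies below are assumed values:

```python
import numpy as np

# Hypothetical sketch (values assumed): amplitude-modulating a partial
# at the beat frequency creates spectral components at partial ± beat
# frequency, the "sidebands" at which the brainstem response was read out.
fs = 5000.0
t = np.arange(int(10 * fs)) / fs             # 10 s of signal
f_partial, f_beat = 130.0, 2.0               # assumed partial and beat frequency
tone = (1 + np.cos(2 * np.pi * f_beat * t)) * np.sin(2 * np.pi * f_partial * t)

freqs = np.fft.rfftfreq(len(t), 1 / fs)
amp = np.abs(np.fft.rfft(tone)) / len(t)

def amp_at(f):
    """Amplitude at the FFT bin closest to frequency f (Hz)."""
    return amp[np.argmin(np.abs(freqs - f))]

# Energy at the partial (130 Hz) and its sidebands (128 and 132 Hz),
# but none at a frequency unrelated to the modulation (e.g., 127 Hz).
print(amp_at(130.0), amp_at(128.0), amp_at(132.0), amp_at(127.0))
# → ~0.5, ~0.25, ~0.25, ~0.0
```

The identity (1 + cos a)·sin b = sin b + ½[sin(b+a) + sin(b−a)] makes the sideband placement exact, which is why a selective amplification of the sidebands can be attributed to the modulated (beat-related) part of the input.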
42
Bedoin N, Brisseau L, Molinier P, Roch D, Tillmann B. Temporally Regular Musical Primes Facilitate Subsequent Syntax Processing in Children with Specific Language Impairment. Front Neurosci 2016; 10:245. [PMID: 27378833] [PMCID: PMC4913515] [DOI: 10.3389/fnins.2016.00245]
Abstract
Children with developmental language disorders have been shown to be impaired not only in language but also in rhythm and meter perception. Temporal processing and its link to language processing can be understood within the dynamic attending theory: an external stimulus can entrain internal oscillators, which orient attention over time and drive speech-signal segmentation, providing benefits for syntax processing, which is impaired in various patient populations. For children with Specific Language Impairment (SLI) and dyslexia, previous research has shown the influence of external rhythmic stimulation on subsequent language processing by comparing the influence of a temporally regular musical prime to that of a temporally irregular prime. Here we tested whether the observed rhythmic stimulation effect is indeed due to a benefit provided by the regular musical prime (rather than a cost incurred by the temporally irregular prime). Sixteen children with SLI and 16 age-matched controls listened to either a regular musical prime sequence or an environmental sound scene (without temporal regularities in event occurrence; referred to as the "baseline condition"), followed by grammatically correct and incorrect sentences, and performed a grammaticality judgment for each auditorily presented sentence. Results revealed that performance on the grammaticality judgments was better after the regular prime sequences than after the baseline sequences. Our findings are interpreted in the theoretical framework of the dynamic attending theory (Jones, 1976) and the temporal sampling (oscillatory) framework for developmental language disorders (Goswami, 2011). Furthermore, they encourage the use of rhythmic structures (even in non-verbal materials) to boost linguistic structure processing and outline perspectives for rehabilitation.
Affiliation(s)
- Nathalie Bedoin
- Dynamique du Langage Laboratory, Centre National de la Recherche Scientifique UMR 5596 and University Lyon 2, Lyon, France
- Lucie Brisseau
- Institut Médico-Educatif Franchemont, Franchemont, France
- Didier Roch
- Institut Médico-Educatif Franchemont, Franchemont, France
- Barbara Tillmann
- Lyon Neuroscience Research Center, Auditory Cognition and Psychoacoustics Team, Centre National de la Recherche Scientifique UMR 5292, INSERM U 1082, University Lyon 1, Lyon, France
43
Vongpaisal T, Caruso D, Yuan Z. Dance Movements Enhance Song Learning in Deaf Children with Cochlear Implants. Front Psychol 2016; 7:835. [PMID: 27378964] [PMCID: PMC4908111] [DOI: 10.3389/fpsyg.2016.00835]
Abstract
Music perception in cochlear implant (CI) users is constrained by the absence of the salient musical pitch cues crucial for melody identification, but is made possible by timing cues that are largely preserved by current devices. While musical timing cues, including beats and rhythms, are a potential route to music learning, it is not known to what extent they are perceptible to CI users in complex sound scenes, especially when co-occurring pitch and timbral features can obscure them. The task at hand, then, becomes one of optimizing the available timing cues for young CI users by exploring ways in which they might be perceived and encoded simultaneously across multiple modalities. Accordingly, we examined whether training tasks that engage active music listening through dance might enhance the song identification skills of deaf children with CIs. Nine children with CIs learned new songs in two training conditions: (a) listening only (auditory learning), and (b) listening and dancing (auditory-motor learning). We examined the children's ability to identify original song excerpts, as well as mistuned and piano versions, in a closed-set task. While the CI children were less accurate than their normal-hearing peers, following learning that engaged active listening with dance they showed greater song identification accuracy for versions that preserved the original instrumental beats. The observed performance advantage is further qualified by a medium effect size, indicating that the gains afforded by auditory-motor learning are practically meaningful. Furthermore, kinematic analyses of body movements showed that the CI children synchronized to temporal structures in music in a manner comparable to that of normal-hearing age-matched peers. Our findings are the first to indicate that input from CI devices enables good auditory-motor integration of timing cues in child CI users for the purposes of listening and dancing to music. Beyond the heightened arousal from active engagement with music, our findings indicate that multimodal processing made possible a more robust representation, or memory, of musical timing features. Methods that encourage children with CIs to entrain, or track musical timing with body movements, may be more effective in consolidating musical knowledge than methods that engage listening only.
Affiliation(s)
- Tara Vongpaisal
- Department of Psychology, MacEwan University, Edmonton, AB, Canada
- Daniela Caruso
- Department of Psychology, MacEwan University, Edmonton, AB, Canada
- Zhicheng Yuan
- Department of Psychology, MacEwan University, Edmonton, AB, Canada
44
Cirelli LK, Spinelli C, Nozaradan S, Trainor LJ. Measuring Neural Entrainment to Beat and Meter in Infants: Effects of Music Background. Front Neurosci 2016; 10:229. [PMID: 27252619] [PMCID: PMC4877507] [DOI: 10.3389/fnins.2016.00229]
Abstract
Caregivers often engage in musical interactions with their infants. For example, parents across cultures sing lullabies and playsongs to their infants from birth. Behavioral studies indicate that infants not only extract beat information, but also group these beats into metrical hierarchies as early as 6 months of age. However, it is not known how this is accomplished in the infant brain. An EEG frequency-tagging approach has been used successfully with adults to measure neural entrainment to auditory rhythms. The current study is the first to use this technique with infants to investigate how infants' brains encode rhythms. Furthermore, we examine how infant and parent music background is associated with individual differences in rhythm encoding. In Experiment 1, EEG was recorded while 7-month-old infants listened to an ambiguous rhythmic pattern that could be perceived as being in two different meters. In Experiment 2, EEG was recorded while 15-month-old infants listened to a rhythmic pattern with an unambiguous meter. In both age groups, information about music background (parent music training, infant music classes, hours of music listening) was collected. Both age groups showed clear EEG responses frequency-locked to the rhythms, at frequencies corresponding to both beat and meter. For the younger infants (Experiment 1), amplitudes at duple-meter frequencies were selectively enhanced for infants enrolled in music classes compared to those who were not. For the older infants (Experiment 2), amplitudes at beat and meter frequencies were larger for infants with musically trained parents than for those with musically untrained parents. These results suggest that the frequency-tagging method is sensitive to individual differences in beat and meter processing in infancy and could be used to track developmental changes.
Affiliation(s)
- Laura K Cirelli
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Christina Spinelli
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Sylvie Nozaradan
- MARCS Institute, Western Sydney University, Milperra, NSW, Australia; Institute of Neuroscience, Université Catholique de Louvain, Louvain-la-Neuve, Belgium; BRAMS, Université de Montréal, Outremont, QC, Canada
- Laurel J Trainor
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada; Rotman Research Institute, Baycrest Hospital, Toronto, ON, Canada
45
Celma-Miralles A, de Menezes RF, Toro JM. Look at the Beat, Feel the Meter: Top-Down Effects of Meter Induction on Auditory and Visual Modalities. Front Hum Neurosci 2016; 10:108. [PMID: 27047358] [PMCID: PMC4803728] [DOI: 10.3389/fnhum.2016.00108]
Abstract
Recent research has demonstrated top-down effects on meter induction in the auditory modality. However, little is known about these effects in the visual domain, especially without the involvement of motor acts such as tapping. In the present study, we aim to assess whether the projection of meter onto auditory beats is also present in the visual domain. We asked 16 musicians to internally project binary (i.e., a strong-weak pattern) and ternary (i.e., a strong-weak-weak pattern) meter onto separate but analogous visual and auditory isochronous stimuli. Participants were presented with sequences of tones or blinking circular shapes (i.e., flashes) at 2.4 Hz while their electrophysiological responses were recorded. A frequency analysis of the elicited steady-state evoked potentials allowed us to compare the frequencies of the beat (2.4 Hz), its first harmonic (4.8 Hz), the binary subharmonic (1.2 Hz), and the ternary subharmonic (0.8 Hz) within and across modalities. In the amplitude spectra, we observed an enhancement of the amplitude at 0.8 Hz in the ternary condition for both modalities, suggesting meter induction across modalities, as well as an interaction between modality and voltage at 2.4 and 4.8 Hz. In the power spectra, we also observed significant differences from zero at 1.2 Hz in the auditory, but not the visual, binary condition. These findings suggest that meter processing is modulated by top-down mechanisms that interact with our perception of rhythmic events and that such modulation can also be found in the visual domain. The reported cross-modal effects of meter may shed light on the origins of our timing mechanisms, partially developed in primates and allowing humans to synchronize accurately across modalities.
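The frequency analysis described in this abstract can be illustrated with a minimal sketch. This is not the authors' pipeline; the sampling rate, duration, and simulated signal amplitudes are assumptions chosen only to show how amplitudes at tagged frequencies (beat, harmonic, subharmonics) are read off an FFT amplitude spectrum.

```python
# Illustrative sketch (not the authors' code): frequency-tagging analysis
# of a simulated steady-state EEG trace. Amplitudes at the beat frequency
# (2.4 Hz), its first harmonic (4.8 Hz), and the binary/ternary
# subharmonics (1.2 / 0.8 Hz) are read off the FFT amplitude spectrum.
import numpy as np

fs = 256.0   # sampling rate in Hz (assumed)
dur = 40.0   # seconds of signal -> 0.025 Hz frequency resolution
t = np.arange(0, dur, 1 / fs)

rng = np.random.default_rng(0)
# Simulated response: beat-locked activity plus a ternary-meter
# subharmonic, buried in broadband noise.
eeg = (1.0 * np.sin(2 * np.pi * 2.4 * t)
       + 0.5 * np.sin(2 * np.pi * 0.8 * t)
       + 0.8 * rng.standard_normal(t.size))

# Amplitude spectrum, scaled so a unit sinusoid has amplitude ~1.
spectrum = np.abs(np.fft.rfft(eeg)) * 2 / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def amp_at(f_hz):
    """Amplitude at the FFT bin closest to f_hz."""
    return spectrum[np.argmin(np.abs(freqs - f_hz))]

for f in (0.8, 1.2, 2.4, 4.8):
    print(f"{f:.1f} Hz: amplitude {amp_at(f):.2f}")
```

Because the recording duration is an exact multiple of the tagged periods, each frequency of interest falls on an FFT bin, so the tagged amplitudes stand out cleanly above the noise floor.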
Affiliation(s)
- Alexandre Celma-Miralles
- Information and Communication Technologies Engineering (ETIC), Language and Comparative Cognition Group - Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Robert F de Menezes
- Information and Communication Technologies Engineering (ETIC), Language and Comparative Cognition Group - Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Juan M Toro
- Information and Communication Technologies Engineering (ETIC), Language and Comparative Cognition Group - Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats, Barcelona, Spain
46
Nozaradan S, Peretz I, Keller PE. Individual Differences in Rhythmic Cortical Entrainment Correlate with Predictive Behavior in Sensorimotor Synchronization. Sci Rep 2016; 6:20612. [PMID: 26847160] [PMCID: PMC4742877] [DOI: 10.1038/srep20612]
Abstract
The current study aims at characterizing the mechanisms that allow humans to entrain the mind and body to incoming rhythmic sensory inputs in real time. We addressed this unresolved issue by examining the relationship between covert neural processes and overt behavior in the context of musical rhythm. We measured temporal prediction abilities, sensorimotor synchronization accuracy and neural entrainment to auditory rhythms as captured using an EEG frequency-tagging approach. Importantly, movement synchronization accuracy with a rhythmic beat could be explained by the amplitude of neural activity selectively locked with the beat period when listening to the rhythmic inputs. Furthermore, stronger endogenous neural entrainment at the beat frequency was associated with superior temporal prediction abilities. Together, these results reveal a direct link between cortical and behavioral measures of rhythmic entrainment, thus providing evidence that frequency-tagged brain activity has functional relevance for beat perception and synchronization.
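The individual-differences logic in this abstract, relating per-participant neural entrainment strength to synchronization accuracy, can be sketched in a few lines. The data below are simulated and the effect size is an assumption for illustration only; this is not the authors' dataset or analysis code.

```python
# Hedged sketch of the kind of analysis the abstract describes:
# correlating per-participant neural entrainment (amplitude at the beat
# frequency) with sensorimotor synchronization accuracy (mean absolute
# tap-beat asynchrony). All numbers here are simulated assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 30  # hypothetical number of participants

# Simulated entrainment amplitudes; asynchrony shrinks as entrainment
# grows, plus per-participant noise.
beat_amp = rng.uniform(0.2, 1.0, n)                    # arbitrary units
asynchrony_ms = 60 - 35 * beat_amp + rng.normal(0, 5, n)

r = np.corrcoef(beat_amp, asynchrony_ms)[0, 1]
print(f"Pearson r between entrainment and asynchrony: {r:.2f}")
```

A negative correlation here mirrors the reported pattern: listeners with stronger beat-locked neural activity synchronize more accurately (smaller asynchronies).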
Affiliation(s)
- Sylvie Nozaradan
- Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), Belgium
- International Laboratory for Brain, Music and Sound Research (BRAMS), Université de Montréal, Canada
- Isabelle Peretz
- International Laboratory for Brain, Music and Sound Research (BRAMS), Université de Montréal, Canada
- Peter E. Keller
- The MARCS Institute, Western Sydney University, Sydney, Australia
- Music Cognition & Action Group, Max Planck Institute for Human Cognitive & Brain Sciences, Leipzig, Germany
47
Lee KM, Barrett KC, Kim Y, Lim Y, Lee K. Dance and Music in "Gangnam Style": How Dance Observation Affects Meter Perception. PLoS One 2015; 10:e0134725. [PMID: 26308092] [PMCID: PMC4550453] [DOI: 10.1371/journal.pone.0134725]
Abstract
Dance and music often co-occur, as evidenced when viewing choreographed dances or singers moving while performing. This study investigated how the viewing of dance motions shapes sound perception. Previous research has shown that dance reflects the temporal structure of its accompanying music, communicating musical meter (i.e., a hierarchical organization of beats) via coordinated movement patterns that indicate where strong and weak beats occur. The experiments here investigated the effects of dance cues on meter perception, hypothesizing that dance could embody the musical meter and thereby shape participant reaction times (RTs) to sound targets occurring at different metrical positions. In Experiment 1, participants viewed a video with dance choreography indicating 4/4 meter (dance condition) or a series of color changes repeated in sequences of four to indicate 4/4 meter (picture condition). A soundtrack accompanied these videos, and participants reacted to timbre targets at different metrical positions. Participants had the slowest RTs at the strongest beats in the dance condition only. In Experiment 2, participants viewed the choreography of the horse-riding dance from Psy's "Gangnam Style" to examine how a familiar dance might affect meter perception; participants were divided into a group with experience dancing this choreography and a group without. Results again showed slower RTs at stronger metrical positions, and the experienced group demonstrated a more refined perception of the metrical hierarchy. These results likely stem from the temporally selective division of attention between the auditory and visual domains. This study has implications for understanding the impact of (1) splitting attention among different sensory modalities and (2) embodiment on the perception of musical meter. Viewing dance may interfere with sound processing, particularly at critical metrical positions, but embodied familiarity with dance choreography may facilitate meter awareness. The results shed light on the processing of multimedia environments.
Affiliation(s)
- Kyung Myun Lee
- Smart Humanity Convergence Center, Graduate School of Convergence Science and Technology, Seoul National University, Seoul, Korea
- Karen Chan Barrett
- Peabody Institute of Music at Johns Hopkins University, Baltimore, Maryland, United States of America
- Yeonhwa Kim
- Graduate School of Convergence Science and Technology, Seoul National University, Seoul, Korea
- Yeoeun Lim
- College of Music, Seoul National University, Seoul, Korea
- Kyogu Lee
- Smart Humanity Convergence Center, Graduate School of Convergence Science and Technology, Seoul National University, Seoul, Korea
- Graduate School of Convergence Science and Technology, Seoul National University, Seoul, Korea