1
Herff SA, Bonetti L, Cecchetti G, Vuust P, Kringelbach ML, Rohrmeier MA. Hierarchical syntax model of music predicts theta power during music listening. Neuropsychologia 2024;199:108905. PMID: 38740179; DOI: 10.1016/j.neuropsychologia.2024.108905.
Abstract
Linguistic research has shown that the depth of syntactic embedding is reflected in brain theta power. Here, we test whether this extends to non-linguistic stimuli, specifically music. We used a hierarchical model of musical syntax to continuously quantify two types of expert-annotated harmonic dependencies throughout a piece of Western classical music: prolongation and preparation. Prolongations can roughly be understood as a musical analogue to linguistic coordination between constituents that share the same function (e.g., 'pizza' and 'pasta' in 'I ate pizza and pasta'). Preparation refers to the dependency between two harmonies whereby the first implies a resolution towards the second (e.g., dominant towards tonic; similar to how an adjective implies the presence of a noun in 'I like spicy … '). Source-reconstructed MEG data from sixty-five participants listening to the musical piece were then analysed. We used Bayesian mixed-effects models to predict the theta envelope in the brain, using the number of open prolongation and preparation dependencies as predictors whilst controlling for the audio envelope. We observed that prolongation and preparation each carry independent and distinguishable predictive value for theta-band fluctuation in key linguistic areas such as the Angular, Superior Temporal, and Heschl's Gyri, or their right-lateralised homologues, with preparation showing additional predictive value for areas associated with the reward system and prediction. Musical expertise further mediated these effects in language-related brain areas. These results show that the predictions of a precisely formalised music-theoretical model are reflected in the brain activity of listeners, furthering our understanding of the perception and cognition of musical structure.
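The central predictor in this abstract, the number of dependencies still awaiting resolution at a given moment, can be sketched in a few lines. This is an illustrative reconstruction, not the authors' pipeline; the span data and function name are invented.

```python
# Hypothetical sketch: each annotated dependency is an (onset, resolution)
# interval in seconds; at any time t, the predictor is the number of
# dependencies that have opened but not yet resolved.

def open_dependencies(spans, times):
    """Count spans with onset <= t < resolution for each time t."""
    return [sum(1 for onset, res in spans if onset <= t < res) for t in times]

# Toy harmonic annotation (invented values)
preparation_spans = [(0.0, 2.0), (1.0, 4.0), (3.0, 5.0)]
print(open_dependencies(preparation_spans, times=[0.5, 1.5, 3.5]))  # [1, 2, 2]
```

Sampling this count at the MEG sampling times would yield the continuous regressor described above.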
Affiliation(s)
- Steffen A Herff
- Sydney Conservatorium of Music, University of Sydney, Sydney, Australia; The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia; Digital and Cognitive Musicology Lab, College of Humanities, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland.
- Leonardo Bonetti
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music, Aarhus/Aalborg, Denmark; Centre for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, Oxford, United Kingdom; Department of Psychiatry, University of Oxford, Oxford, United Kingdom
- Gabriele Cecchetti
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia; Digital and Cognitive Musicology Lab, College of Humanities, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music, Aarhus/Aalborg, Denmark
- Morten L Kringelbach
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music, Aarhus/Aalborg, Denmark; Centre for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, Oxford, United Kingdom; Department of Psychiatry, University of Oxford, Oxford, United Kingdom
- Martin A Rohrmeier
- Digital and Cognitive Musicology Lab, College of Humanities, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
2
Betancourt A, Pérez O, Gámez J, Mendoza G, Merchant H. Amodal population clock in the primate medial premotor system for rhythmic tapping. Cell Rep 2023;42:113234. PMID: 37838944; DOI: 10.1016/j.celrep.2023.113234.
Abstract
The neural substrate for beat extraction and response entrainment to rhythms is not fully understood. Here we analyze the activity of medial premotor neurons in monkeys performing isochronous tapping guided by brief flashing stimuli or auditory tones. The population dynamics shared the following properties across modalities: the circular dynamics of the neural trajectories form a regenerating loop for every produced interval; the trajectories converge to a similar region of state space at tapping times, resetting the clock; and the tempo of the synchronized tapping is encoded in the trajectories by a combination of amplitude modulation and temporal scaling. Notably, the modality displaces the neural trajectories into auditory and visual subspaces without greatly altering the time-keeping mechanism. These results suggest that the interaction between the medial premotor cortex's amodal internal representation of pulse and a modality-specific external input generates a neural rhythmic clock whose dynamics govern rhythmic tapping execution across senses.
Affiliation(s)
- Abraham Betancourt
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Oswaldo Pérez
- Escuela Nacional de Estudios Superiores, Unidad Juriquilla, UNAM, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Jorge Gámez
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Germán Mendoza
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
- Hugo Merchant
- Instituto de Neurobiología, UNAM, Campus Juriquilla, Boulevard Juriquilla No. 3001, Querétaro, Qro 76230, México
3
Jacques C, Jonas J, Colnat-Coulbois S, Maillard L, Rossion B. Low and high frequency intracranial neural signals match in the human associative cortex. eLife 2022;11:e76544. PMID: 36074548; PMCID: PMC9457683; DOI: 10.7554/elife.76544.
Abstract
In vivo intracranial recordings of neural activity offer a unique opportunity to understand human brain function. Intracranial electrophysiological (iEEG) activity related to sensory, cognitive or motor events manifests mostly in two types of signals: event-related local field potentials in lower frequency bands (<30 Hz, LF) and broadband activity at the higher end of the frequency spectrum (>30 Hz, high frequency, HF). While most current studies rely exclusively on HF, thought to be more focal and more closely related to spiking activity, the relationship between HF and LF signals is unclear, especially in human associative cortex. Here, we provide a large-scale, in-depth investigation of the spatial and functional relationship between these two signals based on intracranial recordings from 121 individual brains (8000 recording sites). We measure category-selective responses to complex, ecologically salient visual stimuli - human faces - across a wide cortical territory in the ventral occipito-temporal cortex (VOTC), with a frequency-tagging method providing a high signal-to-noise ratio (SNR) and the same objective quantification of signal and noise for the two frequency ranges. While LF face-selective activity has higher SNR across the VOTC, leading to a larger number of significant electrode contacts, especially in the anterior temporal lobe, LF and HF display highly similar spatial, functional, and timing properties. Specifically, and contrary to a widespread assumption, our results point to nearly identical spatial distribution and local spatial extent of LF and HF activity at equal SNR. These observations go a long way towards clarifying the relationship between the two main iEEG signals and reestablish the informative value of LF iEEG for understanding human brain function.
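The frequency-tagging quantification described here, the same objective measure of signal and noise applied to both frequency ranges, is commonly computed as the spectral amplitude at the tagged frequency relative to neighbouring "noise" bins. A minimal sketch with an rFFT on a synthetic signal; bin choices and names are assumptions, not the study's exact pipeline:

```python
import numpy as np

def tagged_snr(signal, fs, f_target, n_neighbors=10, skip=1):
    """Amplitude at the tagged frequency bin divided by the mean amplitude
    of n_neighbors bins on each side (skipping the immediately adjacent bins)."""
    amp = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    k = int(np.argmin(np.abs(freqs - f_target)))
    noise_idx = np.r_[k - skip - n_neighbors : k - skip,
                      k + skip + 1 : k + skip + 1 + n_neighbors]
    return amp[k] / amp[noise_idx].mean()

# Synthetic recording: a 1.2 Hz tagged response buried in white noise
fs, dur = 512, 20.0
t = np.arange(int(fs * dur)) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 1, t.size)
snr = tagged_snr(x, fs, 1.2)  # much greater than 1 at the tagged frequency
```

Because the measure is a ratio of amplitudes within the same spectrum, it can be applied identically to the LF and HF ranges.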
Affiliation(s)
- Corentin Jacques
- Université de Lorraine, CNRS, CRAN, Nancy, France; Psychological Sciences Research Institute (IPSY), Université Catholique de Louvain (UCLouvain), Louvain-la-Neuve, Belgium
- Jacques Jonas
- Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Louis Maillard
- Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Bruno Rossion
- Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
4
Sauvé SA, Bolt ELW, Nozaradan S, Zendel BR. Aging effects on neural processing of rhythm and meter. Front Aging Neurosci 2022;14:848608. PMID: 36118692; PMCID: PMC9475293; DOI: 10.3389/fnagi.2022.848608.
Abstract
When listening to musical rhythm, humans can perceive and move to beat-like metrical pulses. Recently, it has been hypothesized that meter perception is related to brain activity responding to the acoustic fluctuation of the rhythmic input, with selective enhancement of the brain response elicited at meter-related frequencies. In the current study, electroencephalography (EEG) was recorded while younger (<35) and older (>60) adults listened to rhythmic patterns presented at two different tempi while intermittently performing a tapping task. Despite significant hearing loss compared to younger adults, older adults showed preserved brain activity in response to the rhythms. However, age effects were observed in the distribution of amplitude across frequencies. Specifically, in contrast with younger adults, older adults showed relatively larger amplitude at the frequency corresponding to the rate of the individual events making up the rhythms, as compared to lower meter-related frequencies. This difference is compatible with the larger N1-P2 potentials generally observed in older adults in response to acoustic onsets, irrespective of meter perception. Such enhanced low-level responses to sounds have been linked to cortical sensory mechanisms that compensate for age-related hearing loss. Importantly, this low-level effect was accompanied by relatively reduced neural activity at the lower frequencies corresponding to higher-level metrical grouping of the acoustic events, as compared to younger adults.
5
Lenc T, Merchant H, Keller PE, Honing H, Varlet M, Nozaradan S. Mapping between sound, brain and behaviour: four-level framework for understanding rhythm processing in humans and non-human primates. Philos Trans R Soc Lond B Biol Sci 2021;376:20200325. PMID: 34420381; PMCID: PMC8380981; DOI: 10.1098/rstb.2020.0325.
Abstract
Humans perceive and spontaneously move to one or several levels of periodic pulses (a meter, for short) when listening to musical rhythm, even when the sensory input does not provide prominent periodic cues to their temporal location. Here, we review a multi-levelled framework for understanding how external rhythmic inputs are mapped onto internally represented metric pulses. This mapping is studied using an approach that quantifies and directly compares representations of metric pulses in signals corresponding to sensory inputs, neural activity and behaviour (typically body movement). Based on this approach, recent empirical evidence can be drawn together into a conceptual framework that unpacks the phenomenon of meter into four levels. Each level highlights specific functional processes that critically enable and shape the mapping from sensory input to internal meter. We discuss the nature, constraints and neural substrates of these processes, starting with fundamental mechanisms investigated in macaque monkeys that enable basic forms of mapping between simple rhythmic stimuli and an internally represented metric pulse. We propose that human evolution has gradually built a robust and flexible system upon these fundamental processes, allowing more complex levels of mapping to emerge in musical behaviours. This approach opens promising avenues to understand the many facets of rhythmic behaviours across individuals and species. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
Affiliation(s)
- Tomas Lenc
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- Hugo Merchant
- Instituto de Neurobiologia, UNAM, Campus Juriquilla, Querétaro 76230, Mexico
- Peter E. Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Henkjan Honing
- Amsterdam Brain and Cognition (ABC), Institute for Logic, Language and Computation (ILLC), University of Amsterdam, Amsterdam 1090 GE, The Netherlands
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- School of Psychology, Western Sydney University, Penrith, New South Wales 2751, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
6
Sifuentes-Ortega R, Lenc T, Nozaradan S, Peigneux P. Partially Preserved Processing of Musical Rhythms in REM but Not in NREM Sleep. Cereb Cortex 2021;32:1508-1519. PMID: 34491309; DOI: 10.1093/cercor/bhab303.
Abstract
The extent of high-level perceptual processing during sleep remains controversial. In wakefulness, perception of periodicities supports the emergence of high-order representations such as the pulse-like meter perceived while listening to music. Electroencephalography (EEG) frequency-tagged responses elicited at envelope frequencies of musical rhythms have been shown to provide a neural representation of rhythm processing. Specifically, responses at frequencies corresponding to the perceived meter are enhanced over responses at meter-unrelated frequencies. This selective enhancement must rely on higher-level perceptual processes, as it occurs even in irregular (i.e., syncopated) rhythms where meter frequencies are not prominent input features, thus ruling out acoustic confounds. We recorded EEG while presenting a regular (unsyncopated) and an irregular (syncopated) rhythm across sleep stages and wakefulness. Our results show that frequency-tagged responses at meter-related frequencies of the rhythms were selectively enhanced during wakefulness but attenuated across sleep states. Most importantly, this selective attenuation occurred even in response to the irregular rhythm, where meter-related frequencies were not prominent in the stimulus, thus suggesting that neural processes selectively enhancing meter-related frequencies during wakefulness are weakened during rapid eye movement (REM) and further suppressed in non-rapid eye movement (NREM) sleep. These results indicate preserved processing of low-level acoustic properties but limited higher-order processing of auditory rhythms during sleep.
Affiliation(s)
- Rebeca Sifuentes-Ortega
- UR2NF - Neuropsychology and Functional Neuroimaging Research Unit at CRCN - Center for Research in Cognition & Neurosciences, and UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), 1050 Brussels, Belgium
- Tomas Lenc
- Institute of Neuroscience (IONS), Université Catholique de Louvain, 1200 Brussels, Belgium
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain, 1200 Brussels, Belgium
- Philippe Peigneux
- UR2NF - Neuropsychology and Functional Neuroimaging Research Unit at CRCN - Center for Research in Cognition & Neurosciences, and UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), 1050 Brussels, Belgium
7
Zhang J, Firestone E, Elattma A. Animal Models of Tinnitus Treatment: Cochlear and Brain Stimulation. Curr Top Behav Neurosci 2021;51:83-129. PMID: 34282563; DOI: 10.1007/7854_2021_227.
Abstract
Neuromodulation, via stimulation of a variety of peripheral and central structures, is used to suppress tinnitus. However, ethical constraints on invasive investigation in humans have made it difficult to decipher the mechanisms underlying treatment-induced tinnitus relief, so a number of animal models have arisen to address these unknowns. This chapter reviews animal models of cochlear and brain stimulation and assesses their modulatory effects on behavioral evidence of tinnitus and its related neural correlates. When a structure is stimulated, localized modulation occurs within that brain area, often presenting as downregulation of spontaneous neuronal spike firing rate, bursting, and neurosynchrony. Through anatomical projections and transmitter pathways, the interventions activate both auditory and non-auditory structures, acting in bottom-up ascending and top-down descending modes to influence their target brain structures. Furthermore, the brain oscillations evoked by cochlear or brain stimulation connect the prefrontal cortex, striatal systems, and other limbic structures to refresh neural networks and relieve the auditory, attentional, conscious, and emotional-reactive aspects of tinnitus. This oscillatory neural-network connectivity is achieved via thalamocorticothalamic circuitry, including the lemniscal and non-lemniscal auditory brain structures. Beyond existing technologies, the review also identifies opportunities for developing advanced animal models using new modalities to achieve precision neuromodulation and tinnitus abatement, such as optogenetic cochlear and/or brain stimulation.
Affiliation(s)
- Jinsheng Zhang
- Department of Otolaryngology-Head and Neck Surgery, Wayne State University School of Medicine, Detroit, MI, USA; Department of Communication Sciences and Disorders, Wayne State University College of Liberal Arts and Sciences, Detroit, MI, USA
- Ethan Firestone
- Department of Otolaryngology-Head and Neck Surgery, Wayne State University School of Medicine, Detroit, MI, USA
- Ahmed Elattma
- Department of Otolaryngology-Head and Neck Surgery, Wayne State University School of Medicine, Detroit, MI, USA
8
Prefrontal High Gamma in ECoG Tags Periodicity of Musical Rhythms in Perception and Imagination. eNeuro 2020;7:ENEURO.0413-19.2020. PMID: 32586843; PMCID: PMC7405071; DOI: 10.1523/eneuro.0413-19.2020.
Abstract
Rhythmic auditory stimuli are known to elicit matching activity patterns in neural populations. Furthermore, recent research has established the particular importance of high-gamma brain activity in auditory processing by showing its involvement in auditory phrase segmentation and envelope tracking. Here, we use electrocorticographic (ECoG) recordings from eight human listeners to test whether periodicities in high-gamma activity track the periodicities in the envelope of musical rhythms during rhythm perception and imagination. Rhythm imagination was elicited by instructing participants to imagine the rhythm continuing during pauses inserted between several repetitions. To identify electrodes whose periodicities in high-gamma activity track the periodicities in the musical rhythms, we computed the correlation between the autocorrelations (ACCs) of the musical rhythms and of the neural signals. A condition in which participants listened to white noise was used to establish a baseline. High-gamma autocorrelations in auditory areas in the superior temporal gyrus and in frontal areas of both hemispheres significantly matched the autocorrelations of the musical rhythms. Overall, numerous significant electrodes were observed in the right hemisphere. Of particular interest is a large cluster of electrodes in the right prefrontal cortex that was active during both rhythm perception and imagination, indicating conscious processing of the rhythms' structure rather than a merely auditory phenomenon. The autocorrelation approach clearly shows that high-gamma activity measured from cortical electrodes tracks both attended and imagined rhythms.
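The autocorrelation-matching idea described here, correlating the autocorrelation of the stimulus envelope with that of a neural envelope, can be sketched as follows. The signals, lag range, and helper name are invented for illustration; this is not the study's code.

```python
import numpy as np

def acc(x, max_lag):
    """Normalized autocorrelation of x at lags 1..max_lag (lag 0 excluded)."""
    x = x - x.mean()
    full = np.correlate(x, x, mode="full")[len(x) - 1:]
    return full[1 : max_lag + 1] / full[0]

# Toy example: a 2 Hz pulse-train "rhythm envelope" and a noisy
# "high-gamma envelope" that tracks it
fs = 100
t = np.arange(10 * fs) / fs
rhythm = (np.sin(2 * np.pi * 2.0 * t) > 0.9).astype(float)
neural = rhythm + np.random.default_rng(1).normal(0, 0.5, t.size)

# Match score for one electrode: correlation of the two autocorrelations
match = np.corrcoef(acc(rhythm, 200), acc(neural, 200))[0, 1]
# `match` is high when the neural envelope shares the rhythm's periodicities
```

A white-noise neural envelope would yield a near-zero match, which is the role of the baseline condition mentioned above.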
9
Lenc T, Keller PE, Varlet M, Nozaradan S. Neural and Behavioral Evidence for Frequency-Selective Context Effects in Rhythm Processing in Humans. Cereb Cortex Commun 2020;1:tgaa037. PMID: 34296106; PMCID: PMC8152888; DOI: 10.1093/texcom/tgaa037.
Abstract
When listening to music, people often perceive and move along with a periodic meter. However, the dynamics of the mapping between meter perception and the acoustic cues to meter periodicities in the sensory input remain largely unknown. To capture these dynamics, we recorded electroencephalography (EEG) while nonmusician and musician participants listened to nonrepeating rhythmic sequences in which acoustic cues to meter frequencies either gradually decreased (from regular to degraded) or increased (from degraded to regular). The results revealed greater neural activity selectively elicited at meter frequencies when the sequence gradually changed from regular to degraded than in the opposite direction. Importantly, this effect was unlikely to arise from overall gain or low-level auditory processing, as revealed by physiological modeling. Moreover, the context effect was more pronounced in nonmusicians, who also demonstrated facilitated sensory-motor synchronization with the meter for sequences that started as regular. In contrast, musicians showed weaker effects of recent context in their neural responses and a robust ability to move along with the meter irrespective of stimulus degradation. Together, our results demonstrate that brain activity elicited by rhythm does not merely reflect passive tracking of stimulus features, but represents continuous integration of sensory input with recent context.
Affiliation(s)
- Tomas Lenc
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Peter E Keller
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Manuel Varlet
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- School of Psychology, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Sylvie Nozaradan
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal QC H3C 3J7, Canada
10
Liberati G, Klöcker A, Algoet M, Mulders D, Maia Safronova M, Ferrao Santos S, Ribeiro Vaz JG, Raftopoulos C, Mouraux A. Gamma-Band Oscillations Preferential for Nociception can be Recorded in the Human Insula. Cereb Cortex 2018;28:3650-3664. PMID: 29028955; PMCID: PMC6366557; DOI: 10.1093/cercor/bhx237.
Abstract
Transient nociceptive stimuli elicit robust phase-locked local field potentials (LFPs) in the human insula. However, these responses are not preferential for nociception, as they are also elicited by transient non-nociceptive vibrotactile, auditory, and visual stimuli. Here, we investigated whether another feature of insular activity, namely gamma-band oscillations (GBOs), is preferentially observed in response to nociceptive stimuli. Although nociception-evoked GBOs have never been explored in the insula, previous scalp electroencephalography and magnetoencephalography studies suggest that nociceptive stimuli elicit GBOs in other areas such as the primary somatosensory and prefrontal cortices, and that this activity could be closely related to pain perception. Furthermore, tracing studies showed that the insula is a primary target of spinothalamic input. Using depth electrodes implanted in 9 patients investigated for epilepsy, we acquired insular responses to brief thermonociceptive stimuli and similarly arousing non-nociceptive vibrotactile, auditory, and visual stimuli (59 insular sites). As compared with non-nociceptive stimuli, nociceptive stimuli elicited a markedly stronger enhancement of GBOs (150-300 ms poststimulus) at all insular sites, suggesting that this feature of insular activity is preferential for thermonociception. Although this activity was also present in temporal and frontal regions, its magnitude was significantly greater in the insula as compared with these other regions.
Affiliation(s)
- Giulia Liberati
- Institute of Neuroscience, Université catholique de Louvain, 1200 Brussels, Belgium
- Anne Klöcker
- Institute of Neuroscience, Université catholique de Louvain, 1200 Brussels, Belgium
- Maxime Algoet
- Institute of Neuroscience, Université catholique de Louvain, 1200 Brussels, Belgium
- Dounia Mulders
- Institute of Neuroscience, Université catholique de Louvain, 1200 Brussels, Belgium
- Marta Maia Safronova
- Department of Radiology, Neuroradiology Clinic, Erasme Hospital, 1070 Brussels, Belgium
- André Mouraux
- Institute of Neuroscience, Université catholique de Louvain, 1200 Brussels, Belgium
11
Abstract
Bass sounds play a special role in conveying the rhythm and stimulating motor entrainment to the beat of music. However, the biological roots of this culturally widespread musical practice remain mysterious, despite its fundamental relevance in the sciences and arts, and also for music-assisted clinical rehabilitation of motor disorders. Here, we show that this musical convention may exploit a neurophysiological mechanism whereby low-frequency sounds shape neural representations of rhythmic input at the cortical level by boosting selective neural locking to the beat, thus explaining the privileged role of bass sounds in driving people to move along with the musical beat. Music makes us move, and using bass instruments to build the rhythmic foundations of music is especially effective at inducing people to dance to periodic pulse-like beats. Here, we show that this culturally widespread practice may exploit a neurophysiological mechanism whereby low-frequency sounds shape the neural representations of rhythmic input by boosting selective locking to the beat. Cortical activity was captured using electroencephalography (EEG) while participants listened to a regular rhythm or to a relatively complex syncopated rhythm conveyed either by low tones (130 Hz) or high tones (1236.8 Hz). We found that cortical activity at the frequency of the perceived beat is selectively enhanced compared with other frequencies in the EEG spectrum when rhythms are conveyed by bass sounds. This effect is unlikely to arise from early cochlear processes, as revealed by auditory physiological modeling, and was particularly pronounced for the complex rhythm requiring endogenous generation of the beat. The effect is likewise not attributable to differences in perceived loudness between low and high tones, as a control experiment manipulating sound intensity alone did not yield similar results. Finally, the privileged role of bass sounds is contingent on allocation of attentional resources to the temporal properties of the stimulus, as revealed by a further control experiment examining the role of a behavioral task. Together, our results provide a neurobiological basis for the convention of using bass instruments to carry the rhythmic foundations of music and to drive people to move to the beat.
12
Rossion B, Jacques C, Jonas J. Mapping face categorization in the human ventral occipitotemporal cortex with direct neural intracranial recordings. Ann N Y Acad Sci 2018;1426:5-24. PMID: 29479704; DOI: 10.1111/nyas.13596.
Abstract
The neural basis of face categorization has been widely investigated with functional magnetic resonance imaging (fMRI), identifying a set of face-selective local regions in the ventral occipitotemporal cortex (VOTC). However, indirect recording of neural activity with fMRI is associated with large fluctuations of signal across regions, often underestimating face-selective responses in the anterior VOTC. While direct recording of neural activity with subdural grids of electrodes (electrocorticography, ECoG) or depth electrodes (stereotactic electroencephalography, SEEG) offers a unique opportunity to fill this gap in knowledge, such studies instead reveal widely distributed face-selective responses. Moreover, intracranial recordings are complicated by interindividual variability in neuroanatomy, ambiguity in the definition and quantification of responses of interest, and limited access to sulci with ECoG. Here, we propose to combine SEEG in large samples of individuals with fast periodic visual stimulation to objectively define, quantify, and characterize face categorization across the whole VOTC. This approach reconciles the wide distribution of neural face categorization responses with their (right) hemispheric and regional specialization, and reveals several face-selective regions in anterior VOTC sulci. We outline the challenges of this research program to understand the neural basis of face categorization and high-level visual recognition in general.
Affiliation(s)
- Bruno Rossion
- Psychological Sciences Research Institute, Institute of Neuroscience, University of Louvain (UCLouvain), Louvain-la-Neuve, Belgium
- Service de Neurologie, Centre Hospitalier Régional Universitaire (CHRU) de Nancy, Nancy, France
- CRAN, UMR 7039, CNRS et Université de Lorraine, Nancy, France
- Corentin Jacques
- Psychological Sciences Research Institute, Institute of Neuroscience, University of Louvain (UCLouvain), Louvain-la-Neuve, Belgium
- Research Group Psychiatry, Department of Neuroscience, University of Leuven, Leuven, Belgium
- Jacques Jonas
- Psychological Sciences Research Institute, Institute of Neuroscience, University of Louvain (UCLouvain), Louvain-la-Neuve, Belgium
- Service de Neurologie, Centre Hospitalier Régional Universitaire (CHRU) de Nancy, Nancy, France
- CRAN, UMR 7039, CNRS et Université de Lorraine, Nancy, France
13
Nozaradan S, Schönwiesner M, Keller PE, Lenc T, Lehmann A. Neural bases of rhythmic entrainment in humans: critical transformation between cortical and lower-level representations of auditory rhythm. Eur J Neurosci 2018; 47:321-332. [PMID: 29356161] [DOI: 10.1111/ejn.13826]
Abstract
The spontaneous ability to entrain to meter periodicities is central to music perception and production across cultures. There is increasing evidence that this ability involves selective neural responses to meter-related frequencies. This phenomenon has been observed in the human auditory cortex, yet it could be the product of evolutionarily older, lower-level properties of brainstem auditory neurons, as suggested by recent recordings from the rodent midbrain. We addressed this question by taking advantage of a new method to simultaneously record human EEG activity originating from cortical and lower-level sources, in the form of slow (< 20 Hz) and fast (> 150 Hz) responses to auditory rhythms. Cortical responses showed increased amplitudes at meter-related frequencies compared with meter-unrelated frequencies, regardless of the prominence of the meter-related frequencies in the modulation spectrum of the rhythmic inputs. In contrast, frequency-following responses showed increased amplitudes at meter-related frequencies only for rhythms in which those frequencies were prominent in the input, not for a more complex rhythm requiring more endogenous generation of the meter. This interaction with rhythm complexity suggests that the selective enhancement of meter-related frequencies does not rely fully on subcortical auditory properties but is critically shaped at the cortical level, possibly through functional connections between the auditory cortex and other, movement-related, brain structures. This process of temporal selection would thus enable endogenous and motor entrainment to emerge, in humans, with substantial flexibility and invariance with respect to the rhythmic input, in contrast with non-human animals.
Affiliation(s)
- Sylvie Nozaradan
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, Sydney, NSW, 2751, Australia; Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Louvain, Belgium; International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada
- Marc Schönwiesner
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; Center for Research on Brain, Language and Music (CRBLM), Montreal, QC, Canada; Faculty of Psychology, Université de Montréal, Montreal, QC, Canada
- Peter E Keller
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, Sydney, NSW, 2751, Australia
- Tomas Lenc
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, Sydney, NSW, 2751, Australia
- Alexandre Lehmann
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; Center for Research on Brain, Language and Music (CRBLM), Montreal, QC, Canada; Faculty of Psychology, Université de Montréal, Montreal, QC, Canada; Otolaryngology Department, Faculty of Medicine, McGill University Hospital, Montreal, QC, Canada
14
Nozaradan S, Keller PE, Rossion B, Mouraux A. EEG Frequency-Tagging and Input-Output Comparison in Rhythm Perception. Brain Topogr 2017; 31:153-160. [PMID: 29127530] [DOI: 10.1007/s10548-017-0605-8]
Abstract
The combination of frequency-tagging with electroencephalography (EEG) has recently proved fruitful for understanding the perception of beat and meter in musical rhythm, a behavior shared by humans of all cultures. EEG frequency-tagging allows the objective measurement of input-output transforms to investigate beat perception, its modulation by exogenous and endogenous factors, its development, and its neural basis. Doubt has recently been raised about the validity of comparing frequency-domain representations of auditory rhythmic stimuli with those of the corresponding EEG responses, on the grounds that the approach assumes a one-to-one mapping between the envelope of the rhythmic input and the neural output, and that it neglects the sensitivity of frequency-domain representations to the acoustic features making up the rhythms. Here we argue that these elements actually reinforce the strengths of the approach. The obvious fact that acoustic features influence the frequency spectrum of the sound envelope precisely justifies taking into consideration the sounds used to generate a beat percept when interpreting neural responses to auditory rhythms. Most importantly, the many-to-one relationship between rhythmic input and perceived beat actually validates an approach that objectively measures the input-output transforms underlying the perceptual categorization of rhythmic inputs. Hence, provided that a number of potential pitfalls and fallacies are avoided, EEG frequency-tagging to study input-output relationships appears valuable for understanding rhythm perception.
Affiliation(s)
- Sylvie Nozaradan
- The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia; Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium; International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia
- Bruno Rossion
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium; Neurology Unit, Centre Hospitalier Régional Universitaire (CHRU) de Nancy, Nancy, France
- André Mouraux
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
15
Nicolaou N, Malik A, Daly I, Weaver J, Hwang F, Kirke A, Roesch EB, Williams D, Miranda ER, Nasuto SJ. Directed Motor-Auditory EEG Connectivity Is Modulated by Music Tempo. Front Hum Neurosci 2017; 11:502. [PMID: 29093672] [PMCID: PMC5651276] [DOI: 10.3389/fnhum.2017.00502]
Abstract
Beat perception is fundamental to how we experience music, and yet the mechanism behind this spontaneous building of the internal beat representation is largely unknown. Existing findings support links between the tempo (speed) of the beat and enhancement of electroencephalogram (EEG) activity at tempo-related frequencies, but there are no studies looking at how tempo may affect the underlying long-range interactions between EEG activity at different electrodes. The present study investigates these long-range interactions using EEG activity recorded from 21 volunteers listening to music stimuli played at 4 different tempi (50, 100, 150 and 200 beats per minute). The music stimuli consisted of piano excerpts designed to convey the emotion of “peacefulness”. Noise stimuli with an identical acoustic content to the music excerpts were also presented for comparison purposes. The brain activity interactions were characterized with the imaginary part of coherence (iCOH) in the frequency range 1.5–18 Hz (δ, θ, α and lower β) between all pairs of EEG electrodes for the four tempi and the music/noise conditions, as well as a baseline resting state (RS) condition obtained at the start of the experimental task. Our findings can be summarized as follows: (a) there was an ongoing long-range interaction in the RS engaging fronto-posterior areas; (b) this interaction was maintained in both music and noise, but its strength and directionality were modulated as a result of acoustic stimulation; (c) the topological patterns of iCOH were similar for music, noise and RS, however statistically significant differences in strength and direction of iCOH were identified; and (d) tempo had an effect on the direction and strength of motor-auditory interactions. Our findings are in line with existing literature and illustrate a part of the mechanism by which musical stimuli with different tempi can entrain changes in cortical activity.
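The connectivity measure used in this study, the imaginary part of coherence (iCOH), can be sketched in a few lines of NumPy/SciPy. The simulated two-channel signal, sampling rate, lag, and noise levels below are illustrative assumptions, not the study's data; the point is only that a time-lagged coupling produces a nonzero imaginary coherency, whereas zero-lag mixing (volume conduction) alone would not.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 250.0                       # assumed sampling rate (Hz)
n = int(fs * 60)                 # one minute of simulated "EEG"
common = rng.standard_normal(n)

# Two "electrodes": y receives a delayed copy of the source in x,
# so their cross-spectrum acquires an imaginary part.
x = common + 0.5 * rng.standard_normal(n)
y = np.roll(common, 10) + 0.5 * rng.standard_normal(n)

f, pxx = signal.welch(x, fs=fs, nperseg=1024)
_, pyy = signal.welch(y, fs=fs, nperseg=1024)
_, pxy = signal.csd(x, y, fs=fs, nperseg=1024)

# Imaginary part of coherency: insensitive to instantaneous mixing,
# sensitive to lagged (directed) interactions.
icoh = np.imag(pxy / np.sqrt(pxx * pyy))
band = (f >= 1.5) & (f <= 18)    # the 1.5-18 Hz range analysed above
print(float(np.max(np.abs(icoh[band]))))
```

With the lagged coupling above, |iCOH| in the band is substantial; setting the delay to zero would leave it near chance level.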
Affiliation(s)
- Nicoletta Nicolaou
- Brain Embodiment Laboratory, Biomedical Engineering Section, School of Biological Sciences, University of Reading, Reading, United Kingdom; Department of Electrical and Electronic Engineering, Imperial College London, London, United Kingdom
- Asad Malik
- Brain Embodiment Laboratory, Biomedical Engineering Section, School of Biological Sciences, University of Reading, Reading, United Kingdom; School of Psychology, University of Reading, Reading, United Kingdom; Centre for Integrative Neuroscience and Neurodynamics, University of Reading, Reading, United Kingdom
- Ian Daly
- Brain-Computer Interfacing and Neural Engineering Laboratory, Department of Computer Science and Electronic Engineering, University of Essex, Colchester, United Kingdom
- James Weaver
- Brain Embodiment Laboratory, Biomedical Engineering Section, School of Biological Sciences, University of Reading, Reading, United Kingdom
- Faustina Hwang
- Brain Embodiment Laboratory, Biomedical Engineering Section, School of Biological Sciences, University of Reading, Reading, United Kingdom
- Alexis Kirke
- Interdisciplinary Centre for Computer Music Research, University of Plymouth, Plymouth, United Kingdom
- Etienne B Roesch
- School of Psychology, University of Reading, Reading, United Kingdom; Centre for Integrative Neuroscience and Neurodynamics, University of Reading, Reading, United Kingdom
- Duncan Williams
- Interdisciplinary Centre for Computer Music Research, University of Plymouth, Plymouth, United Kingdom
- Eduardo R Miranda
- Interdisciplinary Centre for Computer Music Research, University of Plymouth, Plymouth, United Kingdom
- Slawomir J Nasuto
- Brain Embodiment Laboratory, Biomedical Engineering Section, School of Biological Sciences, University of Reading, Reading, United Kingdom
16
Nozaradan S, Mouraux A, Cousineau M. Frequency tagging to track the neural processing of contrast in fast, continuous sound sequences. J Neurophysiol 2017; 118:243-253. [PMID: 28381494] [DOI: 10.1152/jn.00971.2016]
Abstract
The human auditory system has a remarkable ability to detect rapid changes in fast, continuous acoustic sequences, as best illustrated in speech and music. However, the neural processing of rapid auditory contrast remains largely unclear, probably due to the lack of methods to objectively dissociate the response components specifically related to the contrast from the other components of the response to a sequence of fast, continuous sounds. To overcome this issue, we tested a novel use of the frequency-tagging approach that allows contrast-specific neural responses to be tracked based on their expected frequencies. The EEG was recorded while participants listened to 40-s sequences of sounds presented at 8 Hz. A tone or interaural-time contrast was embedded in every fifth sound (AAAAB), such that a response observed in the EEG at exactly 8 Hz/5 (1.6 Hz) or its harmonics should be the signature of contrast processing by neural populations. Contrast-related responses were successfully identified, even in the case of very fine contrasts. Moreover, analysis of the time course of the responses revealed a stable amplitude over repetitions of the AAAAB patterns in the sequence, except for the response to perceptually salient contrasts, which showed a buildup and decay across repetitions of the sounds. Overall, this new combination of frequency-tagging with an oddball design provides a valuable complement to the classic transient evoked-potentials approach, especially in the context of rapid auditory information. NEW & NOTEWORTHY Recent theories suggest that the basis of neurodevelopmental auditory disorders such as dyslexia might be an impaired processing of fast auditory changes, highlighting how the encoding of rapid acoustic information is critical for auditory communication. Here, we present a novel electrophysiological approach to capture, in humans, neural markers of contrasts in fast, continuous tone sequences. Contrast-specific responses were successfully identified, even for very fine contrasts, providing direct insight into the encoding of rapid auditory information.
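The oddball frequency-tagging logic described in this abstract can be illustrated with a simulated stimulus envelope: events at 8 Hz, with every fifth event (the B in AAAAB) deviant, produce spectral energy at 8/5 = 1.6 Hz and its harmonics. The sampling rate and event amplitudes below are arbitrary assumptions for the simulation, not values from the study.

```python
import numpy as np

fs = 500.0          # assumed sampling rate (Hz)
dur = 40.0          # sequence length (s), as in the study
base_f = 8.0        # sound presentation rate (Hz)
n = int(fs * dur)
sig = np.zeros(n)

# An "event" at every 8 Hz sound onset; every fifth sound
# (the contrast, B in AAAAB) is made slightly larger.
for k in range(int(dur * base_f)):
    idx = int(k * fs / base_f)
    sig[idx] = 1.5 if k % 5 == 4 else 1.0

# Amplitude spectrum; 40 s gives a 0.025 Hz frequency resolution.
spec = np.abs(np.fft.rfft(sig)) / n
freqs = np.fft.rfftfreq(n, 1 / fs)

def amp_at(f):
    """Spectral amplitude at the bin nearest frequency f."""
    return spec[np.argmin(np.abs(freqs - f))]

# Contrast-related energy appears at 1.6 Hz (and harmonics),
# on top of the 8 Hz response to the sound sequence itself;
# a control frequency such as 1.2 Hz shows essentially nothing.
print(amp_at(1.6), amp_at(1.2), amp_at(8.0))
```

Removing the oddball (setting all events to the same amplitude) eliminates the 1.6 Hz peak while leaving the 8 Hz peak intact, which is exactly the dissociation the tagging exploits.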
Affiliation(s)
- Sylvie Nozaradan
- Institute of Neuroscience, Université Catholique de Louvain, Brussels, Belgium; MARCS Institute for Brain, Behavior, and Development, Sydney, Australia; International Laboratory for Brain, Music, and Sound Research (BRAMS), Montreal, Quebec, Canada
- André Mouraux
- Institute of Neuroscience, Université Catholique de Louvain, Brussels, Belgium
- Marion Cousineau
- International Laboratory for Brain, Music, and Sound Research (BRAMS), Montreal, Quebec, Canada
17
Henry MJ, Herrmann B, Grahn JA. What can we learn about beat perception by comparing brain signals and stimulus envelopes? PLoS One 2017; 12:e0172454. [PMID: 28225796] [PMCID: PMC5321456] [DOI: 10.1371/journal.pone.0172454]
Abstract
Entrainment of neural oscillations on multiple time scales is important for the perception of speech. The perception of musical rhythms, and in particular of a regular beat in those rhythms, is also likely to rely on entrainment of neural oscillations. One recently proposed approach to studying beat perception in the context of neural entrainment and resonance (the "frequency-tagging" approach) has received an enthusiastic response from the scientific community. A specific version of the approach compares frequency-domain representations of acoustic rhythm stimuli to frequency-domain representations of neural responses to those rhythms (measured by electroencephalography, EEG). The relative amplitudes at specific EEG frequencies are compared to the relative amplitudes at the same stimulus frequencies, and enhancements at beat-related frequencies in the EEG signal are interpreted as reflecting an internal representation of the beat. Here, we show that frequency-domain representations of rhythms are sensitive to the acoustic features of the tones making up the rhythms (tone duration, onset/offset ramp duration); in fact, relative amplitudes at beat-related frequencies can be completely reversed by manipulating tone acoustics. Crucially, we show that changes to these acoustic tone features, and in turn to the frequency-domain representations of the rhythms, do not affect beat perception. Instead, beat perception depends on the pattern of onsets (i.e., whether a rhythm has a simple or complex metrical structure). Moreover, we show that beat perception can differ for rhythms that have numerically identical frequency-domain representations. Thus, frequency-domain representations of rhythms are dissociable from beat perception. For this reason, we suggest caution in interpreting direct comparisons of rhythms and brain signals in the frequency domain, and suggest instead that combining EEG measurements of neural signals with creative behavioral paradigms is of more benefit to our understanding of beat perception.
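The core methodological point of this abstract, that tone acoustics alone reshape the envelope spectrum at beat-related frequencies even when the onset pattern (and thus the perceived beat) is unchanged, can be sketched with a toy isochronous sequence. The 2 Hz rate, rectangular envelopes, and tone durations below are illustrative assumptions, not the stimuli of the study.

```python
import numpy as np

fs = 500.0          # assumed sampling rate (Hz)
beat_f = 2.0        # nominal beat frequency (Hz)
dur = 32.0
t = np.arange(int(fs * dur)) / fs

def envelope(tone_dur):
    """Rectangular-tone envelope for an isochronous 2 Hz sequence.

    Only tone duration varies; the onset pattern is identical.
    """
    env = np.zeros_like(t)
    for k in range(int(dur * beat_f)):
        onset = int(k / beat_f * fs)
        env[onset:onset + int(tone_dur * fs)] = 1.0
    return env

def amp_at(env, f):
    """Normalized spectral amplitude at the bin nearest frequency f."""
    spec = np.abs(np.fft.rfft(env)) / len(env)
    freqs = np.fft.rfftfreq(len(env), 1 / fs)
    return spec[np.argmin(np.abs(freqs - f))]

# Same onsets, different tone durations: the amplitude at the beat
# frequency in the envelope spectrum changes markedly.
print(amp_at(envelope(0.05), beat_f), amp_at(envelope(0.25), beat_f))
```

Since listeners would perceive the same beat for both sequences while the envelope spectra differ severalfold at the beat frequency, the comparison illustrates why the authors caution against reading beat perception directly off stimulus-brain spectral comparisons.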
Affiliation(s)
- Molly J. Henry
- Brain and Mind Institute, Department of Psychology, The University of Western Ontario, London, ON, Canada
- Björn Herrmann
- Brain and Mind Institute, Department of Psychology, The University of Western Ontario, London, ON, Canada
- Jessica A. Grahn
- Brain and Mind Institute, Department of Psychology, The University of Western Ontario, London, ON, Canada