1. A genomic basis of vocal rhythm in birds. Nat Commun 2024; 15:3095. [PMID: 38653976; DOI: 10.1038/s41467-024-47305-5]
Abstract
Vocal rhythm plays a fundamental role in sexual selection and species recognition in birds, but little is known about its genetic basis, owing to the confounding effect of vocal learning in model systems. Uncovering its genetic basis could facilitate identifying genes potentially important in speciation. Here we investigate the genomic underpinnings of rhythm in vocal non-learning Pogoniulus tinkerbirds using 135 individual whole genomes distributed across a southern African hybrid zone. We find rhythm speed is associated with two genes that are also known to affect human speech, Neurexin-1 and Coenzyme Q8A. Models leveraging ancestry reveal that these candidate loci also affect rhythmic stability, a trait linked with motor performance, itself an indicator of individual quality. Character displacement in rhythmic stability suggests possible reinforcement against hybridization, supported by evidence of asymmetric assortative mating in the species producing faster, more stable rhythms. Because rhythm is omnipresent in animal communication, candidate genes identified here may shape vocal rhythm across birds and other vertebrates.
2. Spontaneous tempo production in cockatiels (Nymphicus hollandicus) and jungle crows (Corvus macrorhynchos). Behav Processes 2024; 217:105007. [PMID: 38368968; DOI: 10.1016/j.beproc.2024.105007]
Abstract
Musical and rhythmical abilities are poorly documented in non-human animals. Most existing studies have focused on synchronisation to external rhythms. In humans, studies have demonstrated that rhythmical processing (e.g., rhythm discrimination or synchronisation to an external rhythm) depends on an individual measure: the individual tempo. It is assessed by asking participants to produce an endogenous isochronous rhythm (known as the spontaneous motor tempo) without any specific instructions or temporal cues. In the non-human animal literature, studies describing the spontaneous, endogenous production of a motor tempo without any temporal cue are rare. This exploratory study aims to describe and compare the spontaneous motor tempo of cockatiels and jungle crows. Data were collected on spontaneous beak-drumming behaviours of birds housed in the laboratory. Inter-beak-stroke intervals were calculated from the audio tracks of videos. The analyses revealed that inter-beak-stroke intervals are non-randomly distributed and isochronous. Recorded spontaneous motor tempos differed significantly among some cockatiels. Since we could only conduct statistical analysis with one corvid, we cannot draw conclusions about this species. Our results suggest that cockatiels and jungle crows have individual tempos, encouraging further investigation.
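The core measurement here - inter-onset intervals and their regularity - can be sketched in a few lines. This is a minimal illustration, not the authors' code; the onset times below are hypothetical beak-drumming events, and the coefficient of variation (CV) of the intervals is one common regularity index (lower CV = more isochronous):

```python
from statistics import mean, pstdev

def inter_onset_intervals(onsets):
    """Inter-beak-stroke intervals (IOIs) from sorted onset times, in seconds."""
    return [b - a for a, b in zip(onsets, onsets[1:])]

def isochrony_cv(onsets):
    """Coefficient of variation of the IOIs: lower values = more isochronous."""
    iois = inter_onset_intervals(onsets)
    return pstdev(iois) / mean(iois)

# Hypothetical beak-drumming onsets: a roughly regular ~250 ms tempo with jitter.
onsets = [0.00, 0.25, 0.51, 0.74, 1.00, 1.26]
print(round(isochrony_cv(onsets), 3))  # → 0.046, a near-isochronous sequence
```

An individual's spontaneous motor tempo would then simply be the mean IOI across its drumming bouts.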
3. Isochronous rhythms: Facilitating song coordination across taxa? Curr Biol 2024; 34:R201-R203. [PMID: 38471449; DOI: 10.1016/j.cub.2024.01.020]
Abstract
The biological expression of isochronous rhythms - rhythms like those produced by a metronome - was once thought to be unique to humans. A new study reports that faster and more isochronous rhythms lead to more successful duets in singing gibbons: isochronous rhythms might be an important component of song coordination across taxa.
4. Mobile version of the Battery for the Assessment of Auditory Sensorimotor and Timing Abilities (BAASTA): Implementation and adult norms. Behav Res Methods 2024. [PMID: 38459221; DOI: 10.3758/s13428-024-02363-x]
Abstract
Timing and rhythm abilities are complex and multidimensional skills that are highly widespread in the general population. This complexity can be partly captured by the Battery for the Assessment of Auditory Sensorimotor and Timing Abilities (BAASTA). The battery, consisting of four perceptual and five sensorimotor (finger-tapping) tests, has been used in healthy adults and in clinical populations (e.g., Parkinson's disease, ADHD, developmental dyslexia, stuttering), and shows sensitivity to individual differences and impairment. However, major limitations to the generalized use of this tool are the lack of reliable, standardized norms and of a version of the battery that can be used outside the lab. To address these limitations, we put forward a new version of BAASTA on a tablet device capable of ensuring lab-equivalent measurements of timing and rhythm abilities. We present normative data obtained with this version of BAASTA from over 100 healthy adults between the ages of 18 and 87 years in a test-retest protocol. Moreover, we propose a new composite score to summarize beat-based rhythm capacities, the Beat Tracking Index (BTI), with close-to-excellent test-retest reliability. The BTI derives from two BAASTA tests (beat alignment and paced tapping), and offers a swift and practical way of measuring rhythmic abilities when research imposes strong time constraints. This mobile BAASTA implementation is more inclusive and far-reaching, and opens new possibilities for reliable remote testing of rhythmic abilities by leveraging accessible and cost-efficient technologies.
5. A review of psychological and neuroscientific research on musical groove. Neurosci Biobehav Rev 2024; 158:105522. [PMID: 38141692; DOI: 10.1016/j.neubiorev.2023.105522]
Abstract
When listening to music, we naturally move our bodies rhythmically to the beat, which can be pleasurable and difficult to resist. This pleasurable sensation of wanting to move the body to music has been called "groove." Following pioneering humanities research, psychological and neuroscientific studies have provided insights on associated musical features, behavioral responses, phenomenological aspects, and brain structural and functional correlates of the groove experience. Groove research has advanced the field of music science and more generally informed our understanding of bidirectional links between perception and action, and the role of the motor system in prediction. Activity in motor and reward-related brain networks during music listening is associated with the groove experience, and this neural activity is linked to temporal prediction and learning. This article reviews research on groove as a psychological phenomenon with neurophysiological correlates that link musical rhythm perception, sensorimotor prediction, and reward processing. Promising future research directions range from elucidating specific neural mechanisms to exploring clinical applications and socio-cultural implications of groove.
6. Isochrony in barks of Cape fur seal (Arctocephalus pusillus pusillus) pups and adults. Ecol Evol 2024; 14:e11085. [PMID: 38463637; PMCID: PMC10920323; DOI: 10.1002/ece3.11085]
Abstract
Animal vocal communication often relies on call sequences. The temporal patterns of such sequences can be adjusted to other callers, follow complex rhythmic structures, or exhibit a metronome-like (i.e., isochronous) pattern. How regular are the temporal patterns in animal signals, and what influences their precision? If rhythms are present, are they already there early in ontogeny? Here, we describe an exploratory study of Cape fur seal (Arctocephalus pusillus pusillus) barks - a vocalisation type produced across many pinniped species in rhythmic, percussive bouts. This study is the first quantitative description of barking in Cape fur seal pups. We analysed the rhythmic structures of spontaneous barking bouts of pups and adult females from the breeding colony in Cape Cross, Namibia. Barks of adult females exhibited isochrony; that is, they were produced at fairly regular points in time. By contrast, intervals between pup barks were more variable, with pups occasionally skipping a bark in the isochronous series. In both age classes, beat precision - how well the barks followed a perfect isochronous template - was lower when barking at higher rates. These differences could be explained by physiological factors, such as respiration or arousal. Whether, and how, isochrony develops in this species remains an open question. This study provides evidence for rhythmic production of barks in Cape fur seal pups and lays the groundwork for future studies investigating the development of rhythm using multidimensional metrics.
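One way to operationalise "beat precision" in the sense used above - how well a bout of barks follows a perfect isochronous template - is to fit a template at the bout's implied period and score the deviations. A hypothetical sketch, not the study's actual pipeline; onset values are invented:

```python
from statistics import mean

def beat_precision(onsets):
    """Mean absolute deviation of onsets from a perfect isochronous template,
    normalised by the template interval. 0 = perfectly isochronous; larger
    values = less precise beat."""
    n = len(onsets)
    interval = (onsets[-1] - onsets[0]) / (n - 1)  # template period from the bout
    template = [onsets[0] + i * interval for i in range(n)]
    deviations = [abs(o - t) for o, t in zip(onsets, template)]
    return mean(deviations) / interval

# Hypothetical bark onsets (seconds): an adult-like regular bout vs. a
# pup-like variable bout at the same overall rate.
regular = [0.0, 0.5, 1.0, 1.5, 2.0]
variable = [0.0, 0.45, 1.1, 1.4, 2.0]
print(beat_precision(regular), beat_precision(variable))
```

Under this metric the regular bout scores 0 and the variable bout scores higher; a skipped bark could be handled by inserting a gap of roughly twice the template interval before scoring.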
7. Pace setting as an adaptive precursor of rhythmic musicality. Ann N Y Acad Sci 2024; 1533:5-15. [PMID: 38412090; DOI: 10.1111/nyas.15120]
Abstract
Human musicality (the capacity to make and appreciate music) is difficult to explain in evolutionary terms, though many theories attempt to do so. This paper focuses on musicality's potential adaptive precursors, particularly as related to rhythm. It suggests that pace setting for walking and running long distances over extended time periods (endurance locomotion, EL) is a good candidate for an adaptive building block of rhythmic musicality. The argument is as follows: (1) over time, our hominin lineage developed a host of adaptations for efficient EL; (2) the ability to set and maintain a regular pace was a crucial adaptation in the service of EL, providing proximate rewards for successful execution; (3) maintaining a pace in EL occasioned hearing, feeling, and attending to regular rhythmic patterns; (4) these rhythmic patterns, as well as the proximate rewards for maintaining them, became dissociated from locomotion and entrained in new proto-musical contexts. Support for the model and possibilities for generating predictions to test it are discussed.
8. Rhythmic properties of Sciaena umbra calls across space and time in the Mediterranean Sea. PLoS One 2024; 19:e0295589. [PMID: 38381755; PMCID: PMC10881014; DOI: 10.1371/journal.pone.0295589]
Abstract
In animals, the rhythmical properties of calls are known to be shaped by physical constraints and the necessity of conveying information. As a consequence, investigating rhythmical properties in relation to different environmental conditions can help to shed light on the relationship between environment and species behavior from an evolutionary perspective. Sciaena umbra (fam. Sciaenidae) male fish emit reproductive calls characterized by a simple isochronous, i.e., metronome-like rhythm (the so-called R-pattern). Here, S. umbra R-pattern rhythm properties were assessed and compared between four different sites located along the Mediterranean basin (Mallorca, Venice, Trieste, Crete); furthermore, for one location, two datasets collected 10 years apart were available. Recording sites differed in habitat types, vessel density and acoustic richness; despite this, S. umbra R-calls were isochronous across all locations. A degree of variability was found only when considering the beat frequency, which was temporally stable, but spatially variable, with the beat frequency being faster in one of the sites (Venice). Statistically, the beat frequency was found to be dependent on the season (i.e. month of recording) and potentially influenced by the presence of soniferous competitors and human-generated underwater noise. Overall, the general consistency in the measured rhythmical properties (isochrony and beat frequency) suggests their nature as a fitness-related trait in the context of the S. umbra reproductive behavior and calls for further evaluation as a communicative cue.
9. Inner sense of rhythm: percussionist brain activity during rhythmic encoding and synchronization. Front Neurosci 2024; 18:1342326. [PMID: 38419665; PMCID: PMC10899486; DOI: 10.3389/fnins.2024.1342326]
Abstract
Introduction The main objective of this research is to explore the core cognitive mechanisms utilized by exceptionally skilled percussionists as they navigate complex rhythms. Our specific focus is on understanding the dynamic interactions among brain regions related, respectively, to externally directed cognition (EDC), internally directed cognition (IDC), and rhythm processing, the latter defined as the neural correlates of rhythm processing (NCRP). Methods The research involved 26 participants each in the percussionist group (PG) and control group (CG), who underwent task-based functional magnetic resonance imaging (fMRI) sessions focusing on rhythm encoding and synchronization. Comparative analyses were performed between the two groups under each of these conditions. Results Rhythmic encoding showed decreased activity in EDC areas, specifically the right calcarine cortex, left middle occipital gyrus, right fusiform gyrus, and left inferior parietal lobule, along with reduced NCRP activity in the left dorsal premotor cortex, right sensorimotor cortex, and left superior parietal lobule. During rhythmic synchronization, there was increased activity in IDC areas, particularly the default mode network, and in NCRP areas including the left inferior frontal gyrus and bilateral putamen. Conversely, EDC areas such as the right dorsolateral prefrontal gyrus, right superior temporal gyrus, right middle occipital gyrus, and bilateral inferior parietal lobule showed decreased activity, as did NCRP areas including the bilateral dorsal premotor cortex, bilateral ventral insula, bilateral inferior frontal gyrus, and left superior parietal lobule. Discussion The PG's rhythm encoding is characterized by reduced cognitive effort compared to the CG, as evidenced by decreased activity in brain regions associated with EDC and the NCRP.
Rhythmic synchronization reveals up-regulated IDC, down-regulated EDC involvement, and dynamic interplay among regions with the NCRP, suggesting that PG engages in both automatic and spontaneous processing simultaneously. These findings provide valuable insights into expert performance and present opportunities for improving music education.
10. Linking vestibular, tactile, and somatosensory rhythm perception to language development in infancy. Cognition 2024; 243:105688. [PMID: 38101080; DOI: 10.1016/j.cognition.2023.105688]
Abstract
First experiences with rhythm occur in the womb, with different rhythmic sources being available to the human fetus. Among sensory modalities, vestibular, tactile, and somatosensory (VTS) perception plays a crucial role in early processing. However, only a limited number of studies so far have specifically focused on VTS rhythms in language development. The present work investigated VTS rhythmic abilities and their role in language acquisition through two experiments with 45 infants (21 females, sex assigned at birth; M age = 661.6 days, SD = 192.6) of middle/high socioeconomic status. Specifically, 37 infants from the original sample completed Experiment 1, assessing VTS rhythmic abilities through a vibrotactile tool for music perception. In Experiment 2, linguistic abilities were evaluated in 40 participants from the same cohort, specifically testing phonological and prosodic processing. Discrimination abilities for rhythmic and linguistic stimuli were inferred from changes in pupil diameter to contingent visual stimuli over time, measured with a Tobii X-60 eye-tracker. The predictive effect of VTS rhythmic abilities on linguistic processing, and the developmental changes occurring across ages, were explored in the 32 infants who completed both Experiments 1 and 2 by means of generalized additive and linear mixed-effects models. Results are discussed in terms of cross-sensory (i.e., haptic to hearing) and cross-domain (i.e., music to language) effects of rhythm on language acquisition, with implications for typical and atypical development.
11. Beat processing in newborn infants cannot be explained by statistical learning based on transition probabilities. Cognition 2024; 243:105670. [PMID: 38016227; DOI: 10.1016/j.cognition.2023.105670]
Abstract
Newborn infants have been shown to extract temporal regularities from sound sequences, both in the form of learning regular sequential properties, and extracting periodicity in the input, commonly referred to as a regular pulse or the 'beat'. However, these two types of regularities are often indistinguishable in isochronous sequences, as both statistical learning and beat perception can be elicited by the regular alternation of accented and unaccented sounds. Here, we manipulated the isochrony of sound sequences in order to disentangle statistical learning from beat perception in sleeping newborn infants in an EEG experiment, as previously done in adults and macaque monkeys. We used a binary accented sequence that induces a beat when presented with isochronous timing, but not when presented with randomly jittered timing. We compared mismatch responses to infrequent deviants falling on either accented or unaccented (i.e., odd and even) positions. Results showed a clear difference between metrical positions in the isochronous sequence, but not in the equivalent jittered sequence. This suggests that beat processing is present in newborns. Despite previous evidence for statistical learning in newborns, the effects of this ability were not detected in the jittered condition. These results show that statistical learning by itself does not fully explain beat processing in newborn infants.
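The stimulus logic - the same accented/unaccented alternation delivered with either isochronous or jittered timing - can be sketched as follows. This is a hypothetical illustration of the design, not the study's stimulus code; the base interval and jitter range are invented values:

```python
import random

def binary_accented_onsets(n_events, base_ioi=0.3, jitter=0.0, seed=0):
    """Onset times (seconds) for an alternating accented/unaccented sequence.
    jitter=0 gives isochronous timing, which can induce a beat on accented
    positions; jitter>0 randomises each interval, removing the periodic beat
    while preserving the accent alternation and transition probabilities."""
    rng = random.Random(seed)
    onsets, t = [], 0.0
    for i in range(n_events):
        accented = (i % 2 == 0)  # accents on the 1st, 3rd, 5th, ... events
        onsets.append((t, accented))
        t += base_ioi + rng.uniform(-jitter, jitter)
    return onsets

isochronous = binary_accented_onsets(8)           # regular timing: beat induction
jittered = binary_accented_onsets(8, jitter=0.1)  # irregular timing: no beat
print(isochronous[:3])
```

Deviants can then be placed on accented vs. unaccented positions in either sequence; only the isochronous version supports a metrical (odd/even) difference in mismatch responses.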
12. Short-Term Effect of Auditory Stimulation on Neural Activities: A Scoping Review of Longitudinal Electroencephalography and Magnetoencephalography Studies. Brain Sci 2024; 14:131. [PMID: 38391706; PMCID: PMC10887208; DOI: 10.3390/brainsci14020131]
Abstract
Explored through EEG/MEG, auditory stimuli serve as a suitable research probe to reveal various neural activities, including event-related potentials, brain oscillations, and functional connectivity. Accumulating evidence in this field stems from studies investigating neuroplasticity induced by long-term auditory training, specifically cross-sectional studies comparing musicians and non-musicians as well as longitudinal studies with musicians. In contrast, studies addressing the neural effects of short-term interventions, lasting from minutes to hours, are only beginning to be featured. Over the past decade, an increasing body of evidence has shown that short-term auditory interventions evoke rapid changes in neural activities, and oscillatory fluctuations can be observed even in the prestimulus period. In this scoping review, we divided the extracted neurophysiological studies into three groups to discuss neural activities under short-term auditory interventions: the prestimulus period, during stimulation, and comparisons of before and after stimulation. We show that oscillatory activities vary depending on the context of the stimuli and are greatly affected by the interplay of bottom-up and top-down modulatory mechanisms, including attention. We conclude that the observed rapid changes in neural activities in the auditory cortex and higher-order cognitive parts of the brain are causally attributable to short-term auditory interventions.
13. Unravelling individual rhythmic abilities using machine learning. Sci Rep 2024; 14:1135. [PMID: 38212632; PMCID: PMC10784578; DOI: 10.1038/s41598-024-51257-7]
Abstract
Humans can easily extract the rhythm of a complex sound, like music, and move to its regular beat, like in dance. These abilities are modulated by musical training and vary significantly in untrained individuals. The causes of this variability are multidimensional and typically hard to grasp in single tasks. To date we lack a comprehensive model capturing the rhythmic fingerprints of both musicians and non-musicians. Here we harnessed machine learning to extract a parsimonious model of rhythmic abilities, based on behavioral testing (with perceptual and motor tasks) of individuals with and without formal musical training (n = 79). We demonstrate that variability in rhythmic abilities and their link with formal and informal music experience can be successfully captured by profiles including a minimal set of behavioral measures. These findings highlight that machine learning techniques can be employed successfully to distill profiles of rhythmic abilities, and ultimately shed light on individual variability and its relationship with both formal musical training and informal musical experiences.
14. Why do dogs wag their tails? Biol Lett 2024; 20:20230407. [PMID: 38229554; PMCID: PMC10792393; DOI: 10.1098/rsbl.2023.0407]
Abstract
Tail wagging is a conspicuous behaviour in domestic dogs (Canis familiaris). Despite how much meaning humans attribute to this display, its quantitative description and evolutionary history are rarely studied. We summarize what is known about the mechanism, ontogeny, function and evolution of this behaviour. We suggest two hypotheses to explain its increased occurrence and frequency in dogs compared to other canids. During the domestication process, enhanced rhythmic tail wagging behaviour could have (i) arisen as a by-product of selection for other traits, such as docility and tameness, or (ii) been directly selected by humans, due to our proclivity for rhythmic stimuli. We invite testing of these hypotheses through neurobiological and ethological experiments, which will shed light on one of the most readily observed yet understudied animal behaviours. Targeted tail wagging research can be a window into both canine ethology and the evolutionary history of characteristic human traits, such as our ability to perceive and produce rhythmic behaviours.
15. The shared genetic architecture and evolution of human language and musical rhythm. bioRxiv [Preprint] 2023. [PMID: 37961248; PMCID: PMC10634981; DOI: 10.1101/2023.11.01.564908]
Abstract
Rhythm- and language-related traits are phenotypically correlated, but their genetic overlap is largely unknown. Here, we leveraged two large-scale genome-wide association studies, of rhythm (N=606,825) and dyslexia (N=1,138,870), to shed light on their shared genetics. Our results reveal an intricate shared genetic and neurobiological architecture and lay the groundwork for resolving longstanding debates about the potential co-evolution of human language and musical traits.
16. Individual differences in neural markers of beat processing relate to spoken grammar skills in six-year-old children. Brain Lang 2023; 246:105345. [PMID: 37994830; DOI: 10.1016/j.bandl.2023.105345]
Abstract
Based on the idea that neural entrainment establishes regular attentional fluctuations that facilitate hierarchical processing in both music and language, we hypothesized that individual differences in syntactic (grammatical) skills will be partly explained by patterns of neural responses to musical rhythm. To test this hypothesis, we recorded neural activity using electroencephalography (EEG) while children (N = 25) listened passively to rhythmic patterns that induced different beat percepts. Analysis of evoked beta and gamma activity revealed that individual differences in the magnitude of neural responses to rhythm explained variance in six-year-olds' expressive grammar abilities, beyond and complementarily to their performance in a behavioral rhythm perception task. These results reinforce the idea that mechanisms of neural beat entrainment may be a shared neural resource supporting hierarchical processing across music and language and suggest a relevant marker of the relationship between rhythm processing and grammar abilities in elementary-school-age children, previously observed only behaviorally.
17. Macaque monkeys and humans sample temporal regularities in the acoustic environment. Prog Neurobiol 2023; 229:102502. [PMID: 37442410; DOI: 10.1016/j.pneurobio.2023.102502]
Abstract
Many animal species show comparable abilities to detect basic rhythms and produce rhythmic behavior. Yet, the capacities to process complex rhythms and synchronize rhythmic behavior appear to be species-specific: vocal learning animals can, but some primates might not. This discrepancy is of high interest as there is a putative link between rhythm processing and the development of sophisticated sensorimotor behavior in humans. Do our close primate relatives show comparable endogenous dispositions to sample the acoustic environment in the absence of task instructions and training? We recorded EEG from macaque monkeys and humans while they passively listened to isochronous equitone sequences. Individual- and trial-level analyses showed that macaque monkeys' and humans' delta-band neural oscillations encoded and tracked the timing of auditory events. Further, mu- (8-15 Hz) and beta-band (12-20 Hz) oscillations revealed the superimposition of varied accentuation patterns on a subset of trials. These observations suggest convergence in the encoding and dynamic attending of temporal regularities in the acoustic environment, bridging a gap in the phylogenesis of rhythm cognition.
18. "What" and "when" predictions modulate auditory processing in a mutually congruent manner. Front Neurosci 2023; 17:1180066. [PMID: 37781257; PMCID: PMC10540699; DOI: 10.3389/fnins.2023.1180066]
Abstract
Introduction Extracting regularities from ongoing stimulus streams to form predictions is crucial for adaptive behavior. Such regularities exist in terms of the content of the stimuli and their timing, both of which are known to interactively modulate sensory processing. In real-world stimulus streams such as music, regularities can occur at multiple levels, both in terms of contents (e.g., predictions relating to individual notes vs. their more complex groups) and timing (e.g., pertaining to timing between intervals vs. the overall beat of a musical phrase). However, it is unknown whether the brain integrates predictions in a manner that is mutually congruent (e.g., if "beat" timing predictions selectively interact with "what" predictions falling on pulses which define the beat), and whether integrating predictions in different timing conditions relies on dissociable neural correlates. Methods To address these questions, our study manipulated "what" and "when" predictions at different levels - (local) interval-defining and (global) beat-defining - within the same stimulus stream, while neural activity was recorded using electroencephalogram (EEG) in participants (N = 20) performing a repetition detection task. Results Our results reveal that temporal predictions based on beat or interval timing modulated mismatch responses to violations of "what" predictions happening at the predicted time points, and that these modulations were shared between types of temporal predictions in terms of the spatiotemporal distribution of EEG signals. Effective connectivity analysis using dynamic causal modeling showed that the integration of "what" and "when" predictions selectively increased connectivity at relatively late cortical processing stages, between the superior temporal gyrus and the fronto-parietal network. 
Discussion Taken together, these results suggest that the brain integrates different predictions with a high degree of mutual congruence, but in a shared and distributed cortical network. This finding contrasts with recent studies indicating separable mechanisms for beat-based and memory-based predictive processing.
19. Learning to pause: Fidelity of and biases in the developmental acquisition of gaps in the communicative signals of a songbird. Dev Sci 2023; 26:e13382. [PMID: 36861437; DOI: 10.1111/desc.13382]
Abstract
The temporal organization of sounds used in social contexts can provide information about signal function and evoke varying responses in listeners (receivers). For example, music is a universal and learned human behavior that is characterized by different rhythms and tempos that can evoke disparate responses in listeners. Similarly, birdsong is a social behavior in songbirds that is learned during critical periods in development and used to evoke physiological and behavioral responses in receivers. Recent investigations have begun to reveal the breadth of universal patterns in birdsong and their similarities to common patterns in speech and music, but relatively little is known about the degree to which biological predispositions and developmental experiences interact to shape the temporal patterning of birdsong. Here, we investigated how biological predispositions modulate the acquisition and production of an important temporal feature of birdsong, namely the duration of silent pauses ("gaps") between vocal elements ("syllables"). Through analyses of semi-naturally raised and experimentally tutored zebra finches, we observed that juvenile zebra finches imitate the durations of the silent gaps in their tutor's song. Further, when juveniles were experimentally tutored with stimuli containing a wide range of gap durations, we observed biases in the prevalence and stereotypy of gap durations. Together, these studies demonstrate how biological predispositions and developmental experiences differently affect distinct temporal features of birdsong and highlight similarities in developmental plasticity across birdsong, speech, and music. RESEARCH HIGHLIGHTS: The temporal organization of learned acoustic patterns can be similar across human cultures and across species, suggesting biological predispositions in acquisition. 
We studied how biological predispositions and developmental experiences affect an important temporal feature of birdsong, namely the duration of silent intervals between vocal elements ("gaps"). Semi-naturally and experimentally tutored zebra finches imitated the durations of gaps in their tutor's song and displayed some biases in the learning and production of gap durations and in gap variability. These findings in the zebra finch provide parallels with the acquisition of temporal features of speech and music in humans.
Collapse
|
20
|
Improvised herding: Mapping biobehavioral mechanisms that underlie group efficacy during improvised social interaction. Psychophysiology 2023; 60:e14307. [PMID: 37073965 DOI: 10.1111/psyp.14307] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2022] [Revised: 09/14/2022] [Accepted: 11/08/2022] [Indexed: 04/20/2023]
Abstract
Improvisation is a naturally occurring phenomenon that is central to social interaction. Yet improvisation remains understudied in research on group processes and intergroup relations. Here we build on theory and research about human herding to study the contribution of improvisation to group efficacy and its biobehavioral underpinnings. We employed a novel multimodal, integrative method for observing face-to-face interactions: 51 triads (total N = 153) drummed together in spontaneous free improvisation as a group, while their electrodermal activity was monitored simultaneously with their second-by-second rhythmic coordination on a shared electronic drum machine. Our results show that three hypothesized factors of human herding (physiological synchrony, behavioral coordination, and emotional contagion) predict group members' sense of group efficacy. These findings are among the first to show herding at three levels (physiological, behavioral, and mental) in a single study and lay a basis for understanding the role of improvisation in social interaction.
Collapse
|
21
|
Abstract
Sociality and timing are tightly interrelated in human interaction, as seen in turn-taking or synchronised dance movements. Sociality and timing are also evident in the communicative acts of other species, where they may be pleasurable but also necessary for survival. Sociality and timing often co-occur, but their shared phylogenetic trajectory is unknown: how, when, and why did they become so tightly linked? Answering these questions is complicated by several constraints: the use of divergent operational definitions across fields and species, the focus on diverse mechanistic explanations (e.g., physiological, neural, or cognitive), and the frequent adoption of anthropocentric theories and methodologies in comparative research. These limitations hinder the development of an integrative framework on the evolutionary trajectory of social timing and make comparative studies less fruitful than they could be. Here, we outline a theoretical and empirical framework for testing contrasting hypotheses on the evolution of social timing with species-appropriate paradigms and consistent definitions. To facilitate future research, we introduce an initial set of representative species and empirical hypotheses. The proposed framework aims at building and contrasting evolutionary trees of social timing toward and beyond the crucial branch represented by our own lineage. Given its integration of cross-species and quantitative approaches, this research line could lead to an integrated empirical-theoretical paradigm and, as a long-term goal, explain why humans are such socially coordinated animals.
Collapse
|
22
|
Infants show enhanced neural responses to musical meter frequencies beyond low-level features. Dev Sci 2023; 26:e13353. [PMID: 36415027 DOI: 10.1111/desc.13353] [Citation(s) in RCA: 5] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/02/2022] [Revised: 10/20/2022] [Accepted: 11/16/2022] [Indexed: 11/24/2022]
Abstract
Music listening often entails spontaneous perception and body movement to a periodic pulse-like meter. There is increasing evidence that this cross-cultural ability relates to neural processes that selectively enhance metric periodicities, even when these periodicities are not prominent in the acoustic stimulus. However, whether these neural processes emerge early in development remains largely unknown. Here, we recorded the electroencephalogram (EEG) of 20 healthy 5- to 6-month-old infants, while they were exposed to two rhythms known to induce the perception of meter consistently across Western adults. One rhythm contained prominent acoustic periodicities corresponding to the meter, whereas the other rhythm did not. Infants showed significantly enhanced representations of meter periodicities in their EEG responses to both rhythms. This effect is unlikely to reflect the tracking of salient acoustic features in the stimulus, as it was observed irrespective of the prominence of meter periodicities in the audio signals. Moreover, as previously observed in adults, the neural enhancement of meter was greater when the rhythm was delivered by low-pitched sounds. Together, these findings indicate that the endogenous enhancement of metric periodicities beyond low-level acoustic features is a neural property that is already present soon after birth. These high-level neural processes could set the stage for internal representations of musical meter that are critical for human movement coordination during rhythmic musical behavior. RESEARCH HIGHLIGHTS: 5- to 6-month-old infants were presented with auditory rhythms that induce the perception of a periodic pulse-like meter in adults. Infants showed selective enhancement of EEG activity at meter-related frequencies irrespective of the prominence of these frequencies in the stimulus. Responses at meter-related frequencies were boosted when the rhythm was conveyed by bass sounds. 
High-level neural processes that transform rhythmic auditory stimuli into internal meter templates emerge early after birth.
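The frequency-tagging logic behind such EEG measurements (reading out spectral amplitude at meter-related frequencies) can be sketched with a toy example. The sampling rate, frequencies, and amplitudes below are illustrative assumptions, not the study's parameters:

```python
import numpy as np

fs = 500.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)     # 60 s of synthetic "EEG"

# Toy signal: a meter-related periodicity at 1.25 Hz buried in noise,
# alongside a stronger stimulus-driven component at 5 Hz.
rng = np.random.default_rng(0)
eeg = (0.5 * np.sin(2 * np.pi * 1.25 * t)
       + 1.0 * np.sin(2 * np.pi * 5.0 * t)
       + rng.normal(0.0, 1.0, t.size))

# Amplitude spectrum; long recordings give fine frequency resolution,
# so narrow-band periodic responses stand out as sharp peaks.
spectrum = 2 * np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def amp_at(f_target):
    """Amplitude at the spectral bin closest to f_target."""
    return spectrum[np.argmin(np.abs(freqs - f_target))]

print(amp_at(1.25))  # recovers roughly 0.5 despite the noise
print(amp_at(5.0))   # roughly 1.0
```

Because noise spreads across all bins while a periodic response concentrates in one, even a small meter-related component emerges cleanly, which is why this approach is viable with infant EEG.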
Collapse
|
23
|
Frequency-specific directed interactions between whole-brain regions during sentence processing using multimodal stimulus. Neurosci Lett 2023; 812:137409. [PMID: 37487970 DOI: 10.1016/j.neulet.2023.137409] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/17/2023] [Revised: 06/26/2023] [Accepted: 07/20/2023] [Indexed: 07/26/2023]
Abstract
Neural oscillations subserve a broad range of speech processing and language comprehension functions. Using an electroencephalogram (EEG), we investigated the frequency-specific directed interactions between whole-brain regions while participants processed Chinese sentences presented in different modalities (i.e., auditory, visual, and audio-visual). The results indicate that low-frequency responses reflect the aggregation of incoming information flow in the primary sensory cortices of the different modalities. Information flow dominated by high-frequency responses exhibited a bottom-up pattern, from left posterior temporal to left frontal regions. Top-down information flow out of the left frontal lobe was carried jointly by low- and high-frequency rhythms. Overall, our results suggest that the brain may be modality-independent when processing higher-order language information.
Collapse
|
24
|
The same phase creates a unique visual rhythm unifying moving elements in time. Psych J 2023; 12:500-506. [PMID: 36916772 DOI: 10.1002/pchj.636] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/31/2022] [Accepted: 12/02/2022] [Indexed: 03/16/2023]
Abstract
Attention can be selectively tuned to particular features at different spatial locations or objects. The deployment of attention can be guided by features such as color and orientation. What serves as a guiding feature for visual stimuli under dynamic, rhythmic conditions? Specifically, we asked which parameters attract attention when perceiving a visual rhythm. We used a visual search paradigm in which a dynamic search display consisted of vertically "bouncing balls" with regular rhythms. The search target was defined by a unique visual rhythm (i.e., with either a shorter or longer period) among rhythmic distractors sharing an identical period. We systematically modulated the amplitudes and phases of the distractor balls. The results showed that phase, not amplitude, was the crucial factor: when the target's phase deviated from the distractors', it suddenly "popped out" as an oddball, yielding an efficient parallel search. The findings indicate the essential role of phase, in conjunction with amplitude and period, for visual rhythm perception. They also reveal a higher saliency of moving objects with higher-frequency components.
Collapse
|
25
|
Central pattern generators evolved for real-time adaptation to rhythmic stimuli. BIOINSPIRATION & BIOMIMETICS 2023; 18:046020. [PMID: 37339660 DOI: 10.1088/1748-3190/ace017] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/05/2023] [Accepted: 06/20/2023] [Indexed: 06/22/2023]
Abstract
For a robot to be both autonomous and collaborative requires the ability to adapt its movement to a variety of external stimuli, whether these come from humans or other robots. Typically, legged robots have oscillation periods explicitly defined as a control parameter, limiting the adaptability of walking gaits. Here we demonstrate a virtual quadruped robot employing a bio-inspired central pattern generator (CPG) that can spontaneously synchronize its movement to a range of rhythmic stimuli. Multi-objective evolutionary algorithms were used to optimize the variation of movement speed and direction as a function of the brain stem drive and the centre of mass control respectively. This was followed by optimization of an additional layer of neurons that filters fluctuating inputs. As a result, a range of CPGs were able to adjust their gait pattern and/or frequency to match the input period. We show how this can be used to facilitate coordinated movement despite differences in morphology, as well as to learn new movement patterns.
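The synchronization behavior described above can be illustrated with a textbook phase-oscillator sketch: a single Kuramoto-style oscillator, not the paper's evolved multi-layer CPG, and with illustrative parameter values:

```python
import math

def simulate(f0, f_stim, coupling=1.5, dt=0.001, t_end=30.0):
    """Single phase oscillator with intrinsic frequency f0 (Hz) coupled
    to a periodic stimulus at f_stim (Hz). Returns observed cycle periods."""
    phase, t, last_cycle = 0.0, 0.0, 0.0
    periods = []
    while t < t_end:
        stim_phase = 2 * math.pi * f_stim * t
        # Kuramoto-style coupling nudges the oscillator toward the stimulus;
        # locking occurs when the coupling exceeds the frequency mismatch.
        dphase = 2 * math.pi * f0 + coupling * math.sin(stim_phase - phase)
        new_phase = phase + dphase * dt
        if new_phase // (2 * math.pi) > phase // (2 * math.pi):
            periods.append(t - last_cycle)   # one full cycle completed
            last_cycle = t
        phase = new_phase
        t += dt
    return periods

# Intrinsic rate 1.0 Hz, stimulus at 1.2 Hz: after a transient, the
# oscillator's cycle period converges to the stimulus period (1/1.2 s).
periods = simulate(f0=1.0, f_stim=1.2)
print(periods[-1])
```

The gait-adapting CPGs in the paper achieve something similar through evolved neural dynamics rather than an explicit sine-coupling term, but the entrainment principle is the same.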
Collapse
|
26
|
Sensory and motor representations of internalized rhythms in the cerebellum and basal ganglia. Proc Natl Acad Sci U S A 2023; 120:e2221641120. [PMID: 37276394 PMCID: PMC10268275 DOI: 10.1073/pnas.2221641120] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2022] [Accepted: 05/04/2023] [Indexed: 06/07/2023] Open
Abstract
Both the cerebellum and basal ganglia are involved in rhythm processing, but their specific roles remain unclear. During rhythm perception, these areas may be processing purely sensory information, or they may be involved in motor preparation, as periodic stimuli often induce synchronized movements. Previous studies have shown that neurons in the cerebellar dentate nucleus and the caudate nucleus exhibit periodic activity when the animals prepare to respond to the random omission of regularly repeated visual stimuli. To detect stimulus omission, the animals need to learn the stimulus tempo and predict the timing of the next stimulus. The present study demonstrates that neuronal activity in the cerebellum is modulated by the location of the repeated stimulus and that in the striatum (STR) by the direction of planned movement. However, in both brain regions, neuronal activity during movement and the effect of electrical stimulation immediately before stimulus omission were largely dependent on the direction of movement. These results suggest that, during rhythm processing, the cerebellum is involved in multiple stages from sensory prediction to motor control, while the STR consistently plays a role in motor preparation. Thus, internalized rhythms without movement are maintained as periodic neuronal activity, with the cerebellum and STR preferring sensory and motor representations, respectively.
Collapse
|
27
|
No evidence for tactile entrainment of attention. Front Psychol 2023; 14:1168428. [PMID: 37303888 PMCID: PMC10250593 DOI: 10.3389/fpsyg.2023.1168428] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/17/2023] [Accepted: 05/11/2023] [Indexed: 06/13/2023] Open
Abstract
Temporal patterns in our environment provide a rich source of information, to which endogenous neural processes linked to perception and attention can synchronize. This phenomenon, known as entrainment, has so far been studied predominately in the visual and auditory domains. It is currently unknown whether sensory phase-entrainment generalizes to the tactile modality, e.g., for the perception of surface patterns or when reading braille. Here, we address this open question via a behavioral experiment with preregistered experimental and analysis protocols. Twenty healthy participants were presented, on each trial, with 2 s of either rhythmic or arrhythmic 10 Hz tactile stimuli. Their task was to detect a subsequent tactile target either in-phase or out-of-phase with the rhythmic entrainment. Contrary to our hypothesis, we observed no evidence for sensory entrainment in response times, sensitivity or response bias. In line with several other recently reported null findings, our data suggest that behaviorally relevant sensory phase-entrainment might require very specific stimulus parameters, and may not generalize to the tactile domain.
Collapse
|
28
|
Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song. Front Psychol 2023; 14:1167003. [PMID: 37303916 PMCID: PMC10250601 DOI: 10.3389/fpsyg.2023.1167003] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/15/2023] [Accepted: 05/09/2023] [Indexed: 06/13/2023] Open
Abstract
Rhythm is a key feature of music and language, but the way rhythm unfolds within each domain differs. Music induces perception of a beat, a regular repeating pulse spaced by roughly equal durations, whereas speech does not have the same isochronous framework. Although rhythmic regularity is a defining feature of music and language, it is difficult to derive acoustic indices of the differences in rhythmic regularity between domains. The current study examined whether participants could provide subjective ratings of rhythmic regularity for acoustically matched (syllable-, tempo-, and contour-matched) and acoustically unmatched (varying in tempo, syllable number, semantics, and contour) exemplars of speech and song. We used subjective ratings to index the presence or absence of an underlying beat and correlated ratings with stimulus features to identify acoustic metrics of regularity. Experiment 1 highlighted that ratings based on the term "rhythmic regularity" did not result in consistent definitions of regularity across participants, with opposite ratings for participants who adopted a beat-based definition (song greater than speech), a normal-prosody definition (speech greater than song), or an unclear definition (no difference). Experiment 2 defined rhythmic regularity as how easy it would be to tap or clap to the utterances. Participants rated song as easier to clap or tap to than speech for both acoustically matched and unmatched datasets. Subjective regularity ratings from Experiment 2 illustrated that stimuli with longer syllable durations and with less spectral flux were rated as more rhythmically regular across domains. Our findings demonstrate that rhythmic regularity distinguishes speech from song and several key acoustic features can be used to predict listeners' perception of rhythmic regularity within and across domains as well.
Collapse
|
29
|
Task-irrelevant auditory metre shapes visuomotor sequential learning. PSYCHOLOGICAL RESEARCH 2023; 87:872-893. [PMID: 35690927 PMCID: PMC10017598 DOI: 10.1007/s00426-022-01690-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/16/2021] [Accepted: 05/17/2022] [Indexed: 11/24/2022]
Abstract
The ability to learn and reproduce sequences is fundamental to everyday life, and deficits in sequential learning are associated with developmental disorders such as specific language impairment. Individual differences in sequential learning are usually investigated using the serial reaction time task (SRTT), wherein a participant responds to a series of regularly timed, seemingly random visual cues that in fact follow a repeating deterministic structure. Although manipulating inter-cue interval timing has been shown to adversely affect sequential learning, the role of metre (the patterning of salience across time) remains unexplored within the regularly timed, visual SRTT. The current experiment consists of an SRTT adapted to include task-irrelevant auditory rhythms conferring a sense of metre. We predicted that (1) participants' (n = 41) reaction times would reflect the auditory metric structure; (2) that disrupting the correspondence between the learned visual sequence and auditory metre would impede performance; and (3) that individual differences in sensitivity to rhythm would predict the magnitude of these effects. Altering the relationship via a phase shift between the trained visual sequence and auditory metre slowed reaction times. Sensitivity to rhythm was predictive of reaction times overall. In an exploratory analysis, we moreover found that approximately half of participants made systematically different responses to visual cues on the basis of the cues' position within the auditory metre. We demonstrate the influence of auditory temporal structures on visuomotor sequential learning in a widely used task where metre and timing are rarely considered. The current results indicate sensitivity to metre as a possible latent factor underpinning individual differences in SRTT performance.
Collapse
|
30
|
Cortical encoding of rhythmic kinematic structures in biological motion. Neuroimage 2023; 268:119893. [PMID: 36693597 DOI: 10.1016/j.neuroimage.2023.119893] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/28/2022] [Revised: 01/04/2023] [Accepted: 01/20/2023] [Indexed: 01/22/2023] Open
Abstract
Biological motion (BM) perception is of great survival value to human beings. The critical characteristics of BM information lie in kinematic cues containing rhythmic structures. However, how rhythmic kinematic structures of BM are dynamically represented in the brain and contribute to visual BM processing remains largely unknown. Here, we probed this issue in three experiments using electroencephalogram (EEG). We found that neural oscillations of observers entrained to the hierarchical kinematic structures of the BM sequences (i.e., step-cycle and gait-cycle for point-light walkers). Notably, only the cortical tracking of the higher-level rhythmic structure (i.e., gait-cycle) exhibited a BM processing specificity, manifested by enhanced neural responses to upright over inverted BM stimuli. This effect could be extended to different motion types and tasks, with its strength positively correlated with the perceptual sensitivity to BM stimuli at the right temporal brain region dedicated to visual BM processing. Modeling results further suggest that the neural encoding of spatiotemporally integrative kinematic cues, in particular the opponent motions of bilateral limbs, drives the selective cortical tracking of BM information. These findings underscore the existence of a cortical mechanism that encodes periodic kinematic features of body movements, which underlies the dynamic construction of visual BM perception.
Collapse
|
31
|
Musical tempo affects EEG spectral dynamics during subsequent time estimation. Biol Psychol 2023; 178:108517. [PMID: 36801434 DOI: 10.1016/j.biopsycho.2023.108517] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2022] [Revised: 01/24/2023] [Accepted: 02/12/2023] [Indexed: 02/19/2023]
Abstract
The perception of time depends on the rhythmicity of internal and external synchronizers. One external synchronizer that affects time estimation is music. This study aimed to analyze the effects of musical tempi on EEG spectral dynamics during subsequent time estimation. Participants performed a time production task after (i) silence and (ii) listening to music at different tempi (90, 120, and 150 bpm) while EEG activity was recorded. While listening, there was an increase in alpha power at all tempi compared to the resting state and an increase in beta at the fastest tempo. The beta increase persisted during the subsequent time estimations, with higher beta power during the task after listening to music at the fastest tempo than during task performance without music. Spectral dynamics in frontal regions showed lower alpha activity in the final stages of time estimation after listening to music at 90 and 120 bpm than in the silence condition, and higher beta in the early stages at 150 bpm. Behaviorally, the 120 bpm tempo produced slight improvements. Listening to music modified tonic EEG activity, which subsequently affected EEG dynamics during time production. Music at a more optimal rate could have benefited temporal expectation and anticipation, whereas the fastest musical tempo may have generated an over-activated state that affected subsequent time estimations. These results emphasize the importance of music as an external stimulus that can affect brain functional organization during time perception, even after listening.
Collapse
|
32
|
What a difference a syllable makes-Rhythmic reading of poetry. Front Psychol 2023; 14:1043651. [PMID: 36865353 PMCID: PMC9973453 DOI: 10.3389/fpsyg.2023.1043651] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/13/2022] [Accepted: 01/06/2023] [Indexed: 02/15/2023] Open
Abstract
In reading conventional poems aloud, the rhythmic experience is coupled with the projection of meter, enabling the prediction of subsequent input. However, it is unclear how top-down and bottom-up processes interact. If the rhythmicity of reading aloud is governed by the top-down prediction of metric patterns of weak and strong stress, these patterns should be projected even onto a randomly included, lexically meaningless syllable. If bottom-up information, such as the phonetic quality of consecutive syllables, plays a functional role in establishing a structured rhythm, the occurrence of the lexically meaningless syllable should affect reading, and the number of these syllables in a metrical line should modulate this effect. To investigate this, we manipulated poems by replacing regular syllables at random positions with the syllable "tack". Participants were instructed to read the poems aloud and their voice was recorded during the reading. At the syllable level, we calculated the syllable onset interval (SOI) as a measure of articulation duration, as well as the mean syllable intensity. Both measures were intended to operationalize how strongly a syllable was stressed. Results show that the average articulation duration of metrically strong regular syllables was longer than that of weak syllables. This effect disappeared for "tacks". Syllable intensities, on the other hand, captured the metrical stress of "tacks" as well, but only for musically active participants. Additionally, we calculated the normalized pairwise variability index (nPVI) for each line as an indicator of rhythmic contrast, i.e., the alternation between long and short, as well as louder and quieter, syllables, to estimate the influence of "tacks" on reading rhythm. For SOI, the nPVI revealed a clear negative effect: when "tacks" occurred, lines were read with less alternation, and this effect was proportional to the number of "tacks" per line. For intensity, however, the nPVI did not capture significant effects.
Results suggest that top-down prediction does not always suffice to maintain a rhythmic gestalt across a series of syllables that carry little bottom-up prosodic information. Instead, the constant integration of sufficiently varying bottom-up information appears necessary to maintain a stable metrical pattern prediction.
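The nPVI used above is a standard contrast measure over successive durations; a minimal sketch of the computation (the duration values below are made up for illustration):

```python
def npvi(durations):
    """Normalized pairwise variability index:
    nPVI = (100 / (m - 1)) * sum over successive pairs of
           |d_k - d_{k+1}| / ((d_k + d_{k+1}) / 2).
    Higher values mean stronger long/short contrast between neighbors."""
    if len(durations) < 2:
        raise ValueError("need at least two durations")
    terms = [abs(a - b) / ((a + b) / 2)
             for a, b in zip(durations, durations[1:])]
    return 100 * sum(terms) / len(terms)

# Isochronous syllable durations: no contrast at all
print(npvi([0.2, 0.2, 0.2, 0.2]))   # 0.0

# Strictly alternating long/short syllables: strong regular contrast
print(npvi([0.3, 0.1, 0.3, 0.1]))   # close to 100.0
```

Applied per line to SOIs or intensities, a drop in nPVI indicates exactly the flattening of long/short alternation that the "tack" syllables produced.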
Collapse
|
33
|
Exploring individual differences in musical rhythm and grammar skills in school-aged children with typically developing language. Sci Rep 2023; 13:2201. [PMID: 36750727 PMCID: PMC9905575 DOI: 10.1038/s41598-022-21902-0] [Citation(s) in RCA: 5] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/03/2022] [Accepted: 10/05/2022] [Indexed: 02/09/2023] Open
Abstract
A growing number of studies have shown a connection between rhythmic processing and language skill. It has been proposed that domain-general rhythm abilities might help children to tap into the rhythm of speech (prosody), cueing them to prosodic markers of grammatical (syntactic) information during language acquisition, thus underlying the observed correlations between rhythm and language. Working memory processes common to task demands for musical rhythm discrimination and spoken language paradigms are another possible source of individual variance observed in musical rhythm and language abilities. To investigate the nature of the relationship between musical rhythm and expressive grammar skills, we adopted an individual differences approach in N = 132 elementary school-aged children ages 5-7, with typical language development, and investigated prosodic perception and working memory skills as possible mediators. Aligning with the literature, musical rhythm was correlated with expressive grammar performance (r = 0.41, p < 0.001). Moreover, musical rhythm predicted mastery of complex syntax items (r = 0.26, p = 0.003), suggesting a privileged role of hierarchical processing shared between musical rhythm processing and children's acquisition of complex syntactic structures. These relationships between rhythm and grammatical skills were not mediated by prosodic perception, working memory, or non-verbal IQ; instead, we uncovered a robust direct effect of musical rhythm perception on grammatical task performance. Future work should focus on possible biological endophenotypes and genetic influences underlying this relationship.
Collapse
|
34
|
The Tapping-PROMS: A test for the assessment of sensorimotor rhythmic abilities. Front Psychol 2023; 13:862468. [PMID: 36726505 PMCID: PMC9886312 DOI: 10.3389/fpsyg.2022.862468] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/25/2022] [Accepted: 12/23/2022] [Indexed: 01/17/2023] Open
Abstract
Sensorimotor synchronization is a longstanding paradigm in the analysis of isochronous beat tapping. Assessing the finger tapping of complex rhythmic patterns is far less explored and considerably more complex to analyze. Hence, whereas several instruments to assess tempo or beat tapping ability exist, there is at present a shortage of paradigms and tools for the assessment of the ability to tap to complex rhythmic patterns. To redress this limitation, we developed a standardized rhythm tapping test comprising test items of different complexity. The items were taken from the rhythm and tempo subtests of the Profile of Music Perception Skills (PROMS), and administered as tapping items to 40 participants (20 women). Overall, results showed satisfactory psychometric properties for internal consistency and test-retest reliability. Convergent, discriminant, and criterion validity correlations fell in line with expectations. Specifically, performance in rhythm tapping was correlated more strongly with performance in rhythm perception than in tempo perception, whereas performance in tempo tapping was more strongly correlated with performance in tempo than rhythm perception. Both tapping tasks were only marginally correlated with non-temporal perception tasks. In combination, the tapping tasks explained variance in external indicators of musical proficiency above and beyond the perceptual PROMS tasks. This tool allows for the assessment of complex rhythmic tapping skills in about 15 min, thus providing a useful addition to existing music aptitude batteries.
Collapse
|
35
|
Auditory rhythm discrimination in adults who stutter: An fMRI study. BRAIN AND LANGUAGE 2023; 236:105219. [PMID: 36577315 DOI: 10.1016/j.bandl.2022.105219] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/30/2022] [Revised: 11/09/2022] [Accepted: 12/16/2022] [Indexed: 06/17/2023]
Abstract
Rhythm perception deficits have been linked to neurodevelopmental disorders affecting speech and language. Children who stutter have shown poorer rhythm discrimination and attenuated functional connectivity in rhythm-related brain areas, which may negatively impact timing control required for speech. It is unclear whether adults who stutter (AWS), who are likely to have acquired compensatory adaptations in response to rhythm processing/timing deficits, are similarly affected. We compared rhythm discrimination in AWS and controls (total n = 36) during fMRI in two matched conditions: simple rhythms that consistently reinforced a periodic beat, and complex rhythms that did not (requiring greater reliance on internal timing). Consistent with an internal beat deficit hypothesis, behavioral results showed poorer complex rhythm discrimination for AWS than controls. In AWS, greater stuttering severity was associated with poorer rhythm discrimination. AWS showed increased activity within beat-based timing regions and increased functional connectivity between putamen and cerebellum (supporting interval-based timing) for simple rhythms.
Collapse
|
36
|
Finding Hierarchical Structure in Binary Sequences: Evidence from Lindenmayer Grammar Learning. Cogn Sci 2023; 47:e13242. [PMID: 36655988 PMCID: PMC10078511 DOI: 10.1111/cogs.13242] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/22/2022] [Revised: 12/15/2022] [Accepted: 01/04/2023] [Indexed: 01/20/2023]
Abstract
In this article, we explore the extraction of recursive nested structure in the processing of binary sequences. Our aim was to determine whether humans learn the higher-order regularities of a highly simplified input where only sequential-order information marks the hierarchical structure. To this end, we implemented a sequence generated by the Fibonacci grammar in a serial reaction time task. This deterministic grammar generates aperiodic but self-similar sequences. The combination of these two properties allowed us to evaluate hierarchical learning while controlling for the use of low-level strategies like detecting recurring patterns. The deterministic aspect of the grammar allowed us to predict precisely which points in the sequence should be subject to anticipation. Results showed that participants' pattern of anticipation could not be accounted for by "flat" statistical learning processes and was consistent with them anticipating upcoming points based on hierarchical assumptions. We also found that participants were sensitive to the structure constituency, suggesting that they organized the signal into embedded constituents. We hypothesized that the participants built this structure by merging recursively deterministic transitions.
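The Fibonacci grammar is a Lindenmayer system; in its common formulation the rewriting rules are 1 → 10 and 0 → 1 (the exact symbols used in the study are an assumption here). A few generations show the aperiodic but self-similar binary sequence the abstract describes — each generation's string length is a Fibonacci number, and each generation is a prefix of the next:

```python
def fibonacci_word(generations: int) -> str:
    """Expand the Fibonacci grammar (1 -> 10, 0 -> 1) for n generations."""
    s = "1"
    for _ in range(generations):
        s = "".join("10" if c == "1" else "1" for c in s)
    return s

for g in range(1, 6):
    w = fibonacci_word(g)
    print(g, len(w), w)   # lengths 2, 3, 5, 8, 13 - Fibonacci numbers
```

The self-similarity (prefix property) is what lets such a design separate hierarchical anticipation from simple detection of recurring surface patterns: the sequence never repeats periodically, yet its structure is fully deterministic.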
Collapse
|
37
|
Compromised word-level neural tracking in the high-gamma band for children with attention deficit hyperactivity disorder. Front Hum Neurosci 2023; 17:1174720. [PMID: 37213926 PMCID: PMC10196181 DOI: 10.3389/fnhum.2023.1174720] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/27/2023] [Accepted: 04/18/2023] [Indexed: 05/23/2023] Open
Abstract
Children with attention deficit hyperactivity disorder (ADHD) exhibit pervasive difficulties in speech perception. Given that speech processing involves both acoustic and linguistic stages, it remains unclear which stage of speech processing is impaired in children with ADHD. To investigate this issue, we measured neural tracking of speech at syllable and word levels using electroencephalography (EEG), and evaluated the relationship between neural responses and ADHD symptoms in 6- to 8-year-old children. Twenty-three children participated in the current study, and their ADHD symptoms were assessed with SNAP-IV questionnaires. In the experiment, the children listened to hierarchical speech sequences in which syllables and words were, respectively, repeated at 2.5 and 1.25 Hz. Using frequency domain analyses, reliable neural tracking of syllables and words was observed in both the low-frequency band (<4 Hz) and the high-gamma band (70-160 Hz). However, the neural tracking of words in the high-gamma band showed an anti-correlation with the ADHD symptom scores of the children. These results indicate that ADHD prominently impairs cortical encoding of linguistic information (e.g., words) in speech perception.
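The frequency-tagging logic of such designs can be sketched with synthetic data: a signal containing word-rate (1.25 Hz) and syllable-rate (2.5 Hz) components should show spectral peaks at exactly those frequencies. The sampling rate, duration, and amplitudes below are invented for illustration, not taken from the study:

```python
import numpy as np

fs = 250                       # sampling rate in Hz (hypothetical)
t = np.arange(0, 40, 1 / fs)   # 40 s of signal -> 0.025 Hz frequency resolution
rng = np.random.default_rng(1)

# Synthetic "neural tracking": word-rate (1.25 Hz) and syllable-rate
# (2.5 Hz) components buried in broadband noise.
eeg = (0.8 * np.sin(2 * np.pi * 1.25 * t)
       + 0.5 * np.sin(2 * np.pi * 2.5 * t)
       + rng.normal(scale=1.0, size=t.size))

spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for f_tag in (1.25, 2.5):
    i = int(np.argmin(np.abs(freqs - f_tag)))
    neighbours = np.r_[spectrum[i - 5:i - 1], spectrum[i + 2:i + 6]]
    print(f"{f_tag} Hz peak vs neighbours: {spectrum[i] / neighbours.mean():.1f}x")
```

Because 1.25 and 2.5 Hz are integer multiples of the 0.025 Hz resolution, each tagged response falls in a single FFT bin, which is what makes comparing a tagged bin against its neighbours a sensible test for tracking.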
Collapse
|
38
|
The Musical Abilities, Pleiotropy, Language, and Environment (MAPLE) Framework for Understanding Musicality-Language Links Across the Lifespan. NEUROBIOLOGY OF LANGUAGE (CAMBRIDGE, MASS.) 2022; 3:615-664. [PMID: 36742012 PMCID: PMC9893227 DOI: 10.1162/nol_a_00079] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/29/2021] [Accepted: 08/08/2022] [Indexed: 04/18/2023]
Abstract
Using individual differences approaches, a growing body of literature finds positive associations between musicality and language-related abilities, complementing prior findings of links between musical training and language skills. Despite these associations, musicality has been often overlooked in mainstream models of individual differences in language acquisition and development. To better understand the biological basis of these individual differences, we propose the Musical Abilities, Pleiotropy, Language, and Environment (MAPLE) framework. This novel integrative framework posits that musical and language-related abilities likely share some common genetic architecture (i.e., genetic pleiotropy) in addition to some degree of overlapping neural endophenotypes, and genetic influences on musically and linguistically enriched environments. Drawing upon recent advances in genomic methodologies for unraveling pleiotropy, we outline testable predictions for future research on language development and how its underlying neurobiological substrates may be supported by genetic pleiotropy with musicality. In support of the MAPLE framework, we review and discuss findings from over seventy behavioral and neural studies, highlighting that musicality is robustly associated with individual differences in a range of speech-language skills required for communication and development. These include speech perception-in-noise, prosodic perception, morphosyntactic skills, phonological skills, reading skills, and aspects of second/foreign language learning. Overall, the current work provides a clear agenda and framework for studying musicality-language links using individual differences approaches, with an emphasis on leveraging advances in the genomics of complex musicality and language traits.
Collapse
|
39
|
Inter-individual coordination in walking chimpanzees. Curr Biol 2022; 32:5138-5143.e3. [PMID: 36270278 DOI: 10.1016/j.cub.2022.09.059] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2022] [Revised: 08/11/2022] [Accepted: 09/28/2022] [Indexed: 12/12/2022]
Abstract
Humans, like many other animals, live in groups and coordinate actions with others in social settings.1 Such interpersonal coordination may emerge unconsciously and when the goal is not the coordination of movements, as when falling into the same rhythm when walking together.2 Although one of our closest living relatives, the chimpanzee (Pan troglodytes), shows the ability to succeed in complex joint action tasks where coordination is the goal,3 little is known about simpler forms of joint action. Here, we examine whether chimpanzees spontaneously synchronize their actions with conspecifics while walking together. We collected data on individual walking behavior of two groups of chimpanzees under semi-natural conditions. In addition, we assessed social relationships to investigate potential effects on the strength of coordination. When walking with a conspecific, individuals walked faster than when alone. The relative phase was symmetrically distributed around 0° with the highest frequencies around 0, indicating a tendency to coordinate actions. Further, coordination was stronger when walking with a partner compared with two individuals walking independently. Although the inter-limb entrainment was more pronounced between individuals of similar age as a proxy for height, it was not affected by the kinship or bonding status of the walkers or the behaviors they engaged in immediately after the walk. We conclude that chimpanzees adapt their individual behavior to temporally coordinate actions with others, which might provide a basis for engaging in other more complex forms of joint action. This spontaneous form of inter-individual coordination, often called entrainment, is thus shared with humans.
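The relative-phase measure used in such gait studies can be sketched as follows: each of one walker's step onsets is expressed as a phase angle within the other walker's concurrent step cycle, and the mean resultant vector length quantifies how tightly those phases cluster around a common value. The step-onset times below are simulated, not the chimpanzee data:

```python
import numpy as np

def relative_phase(steps_a, steps_b):
    """Phase of each of B's steps within A's concurrent step cycle, in degrees."""
    phases = []
    for tb in steps_b:
        k = np.searchsorted(steps_a, tb) - 1      # A's step cycle containing tb
        if 0 <= k < len(steps_a) - 1:
            cycle = steps_a[k + 1] - steps_a[k]
            phi = 360.0 * (tb - steps_a[k]) / cycle
            phases.append((phi + 180.0) % 360.0 - 180.0)  # wrap to [-180, 180)
    return np.array(phases)

def resultant_length(phases_deg):
    """Mean resultant vector length: 1 = perfect synchrony, 0 = none."""
    rad = np.deg2rad(phases_deg)
    return float(np.abs(np.mean(np.exp(1j * rad))))

# Hypothetical step-onset times (s): walker B nearly in phase with walker A.
a = np.arange(0, 30, 0.6)                        # regular 0.6 s stride cycle
rng = np.random.default_rng(2)
b = a + rng.normal(scale=0.03, size=a.size)      # small timing jitter
print(round(resultant_length(relative_phase(a, b)), 2))
```

A relative-phase distribution peaked at 0° with a high resultant length corresponds to the in-phase entrainment the study reports; two independent walkers would instead yield a flat distribution and a resultant length near 0.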
Collapse
|
40
|
Cross-modal attentional effects of rhythmic sensory stimulation. Atten Percept Psychophys 2022; 85:863-878. [PMID: 36385670 PMCID: PMC10066103 DOI: 10.3758/s13414-022-02611-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 10/31/2022] [Indexed: 11/18/2022]
Abstract
Temporal regularities are ubiquitous in our environment. The theory of entrainment posits that the brain can utilize these regularities by synchronizing neural activity with external events, thereby, aligning moments of high neural excitability with expected upcoming stimuli and facilitating perception. Despite numerous accounts reporting entrainment of behavioural and electrophysiological measures, evidence regarding this phenomenon remains mixed, with several recent studies having failed to provide confirmatory evidence. Notably, it is currently unclear whether and for how long the effects of entrainment can persist beyond their initiating stimulus, and whether they remain restricted to the stimulated sensory modality or can cross over to other modalities. Here, we set out to answer these questions by presenting participants with either visual or auditory rhythmic sensory stimulation, followed by a visual or auditory target at six possible time points, either in-phase or out-of-phase relative to the initial stimulus train. Unexpectedly, but in line with several recent studies, we observed no evidence for cyclic fluctuations in performance, despite our design being highly similar to those used in previous demonstrations of sensory entrainment. However, our data revealed a temporally less specific attentional effect, via cross-modally facilitated performance following auditory compared with visual rhythmic stimulation. In addition to a potentially higher salience of auditory rhythms, this could indicate an effect on oscillatory 3-Hz amplitude, resulting in facilitated cognitive control and attention. In summary, our study further challenges the generality of periodic behavioural modulation associated with sensory entrainment, while demonstrating a modality-independent attention effect following auditory rhythmic stimulation.
Collapse
|
41
|
Systematic errors in the perception of rhythm. Front Hum Neurosci 2022; 16:1009219. [DOI: 10.3389/fnhum.2022.1009219] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2022] [Accepted: 10/14/2022] [Indexed: 11/11/2022] Open
Abstract
One hypothesis for why humans enjoy musical rhythms relates to their prediction of when each beat should occur. The ability to predict the timing of an event is important from an evolutionary perspective. Therefore, our brains have evolved internal mechanisms for processing the progression of time. However, due to inherent noise in neural signals, this prediction is not always accurate. Theoretical considerations of optimal estimates suggest the occurrence of certain systematic errors made by the brain when estimating the timing of beats in rhythms. Here, we tested psychophysically whether these systematic errors exist and if so, how they depend on stimulus parameters. Our experimental data revealed two main types of systematic errors. First, observers perceived the time of the last beat of a rhythmic pattern as occurring earlier than it actually did when the inter-beat interval was short. Second, the perceived time of the last beat was later than the actual time when the inter-beat interval was long. The magnitude of these systematic errors fell as the number of beats increased. However, with many beats, the errors due to long inter-beat intervals became more apparent. We propose a Bayesian model for these systematic errors. The model fits these data well, allowing us to offer possible explanations for how these errors occurred. For instance, neural processes possibly contributing to the errors include noisy and temporally asymmetric impulse responses, priors preferring certain time intervals, and better-early-than-late loss functions. We finish this article with brief discussions of both the implications of systematic errors for the appreciation of rhythm and the possible compensation by the brain’s motor system during a musical performance.
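One building block of such Bayesian accounts is the conjugate-Gaussian estimate: the posterior mean is a precision-weighted average of the noisy measurement and a prior over inter-beat intervals, and the pull toward the prior produces systematic estimation errors. The sketch below shows only this generic ingredient, with invented numbers; the article's full model additionally involves asymmetric impulse responses and a better-early-than-late loss function, not reproduced here:

```python
# Generic Gaussian prior x likelihood estimate of an inter-beat interval.
# The posterior mean is pulled toward the prior mean, biasing estimates
# of intervals that differ from it. All parameter values are hypothetical.
def posterior_mean(measured, sigma_m, prior_mu, sigma_p):
    w = sigma_p**2 / (sigma_p**2 + sigma_m**2)   # weight given to the measurement
    return w * measured + (1 - w) * prior_mu

prior_mu, sigma_p, sigma_m = 0.6, 0.1, 0.1       # seconds
for interval in (0.3, 0.6, 1.2):
    est = posterior_mean(interval, sigma_m, prior_mu, sigma_p)
    print(f"true {interval:.1f} s -> estimate {est:.2f} s (bias {est - interval:+.2f} s)")
# true 0.3 s -> estimate 0.45 s (bias +0.15 s)
# true 0.6 s -> estimate 0.60 s (bias +0.00 s)
# true 1.2 s -> estimate 0.90 s (bias -0.30 s)
```

Intervals at the prior mean are unbiased while short and long intervals are pulled toward it; reproducing the specific early/late pattern the study observed requires the additional model components mentioned above.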
Collapse
|
42
|
Superior visual rhythm discrimination in expert musicians is most likely not related to cross-modal recruitment of the auditory cortex. Front Psychol 2022; 13:1036669. [PMID: 36337485 PMCID: PMC9632485 DOI: 10.3389/fpsyg.2022.1036669] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2022] [Accepted: 10/06/2022] [Indexed: 11/25/2022] Open
Abstract
Training can influence behavioral performance and lead to brain reorganization. In particular, training in one modality, for example, auditory, can improve performance in another modality, for example, visual. Previous research suggests that one of the mechanisms behind this phenomenon could be the cross-modal recruitment of the sensory areas, for example, the auditory cortex. Studying expert musicians offers a chance to explore this process. Rhythm is an aspect of music that can be presented in various modalities. We designed an fMRI experiment in which professional pianists and non-musicians discriminated between two sequences of rhythms presented auditorily (series of sounds) or visually (series of flashes). Behavioral results showed that musicians performed better than non-musicians in both the visual and the auditory rhythmic task. We found no significant between-group differences in fMRI activations within the auditory cortex. However, we observed that musicians had increased activation in the right inferior parietal lobe when compared to non-musicians. We conclude that the musicians’ superior visual rhythm discrimination is not related to cross-modal recruitment of the auditory cortex; instead, it could be related to activation in higher-level, multimodal areas in the cortex.
Collapse
|
43
|
Spontaneous rhythm discrimination in a mammalian vocal learner. Biol Lett 2022; 18:20220316. [PMID: 36285461 PMCID: PMC9597408 DOI: 10.1098/rsbl.2022.0316] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
Rhythm and vocal production learning are building blocks of human music and speech. Vocal learning has been hypothesized as a prerequisite for rhythmic capacities. Yet, no mammalian vocal learner but humans have shown the capacity to flexibly and spontaneously discriminate rhythmic patterns. Here we tested untrained rhythm discrimination in a mammalian vocal learning species, the harbour seal (Phoca vitulina). Twenty wild-born seals were exposed to music-like playbacks of conspecific call sequences varying in basic rhythmic properties. These properties were call length, sequence regularity, and overall tempo. All three features significantly influenced seals' reaction (number of looks and their duration), demonstrating spontaneous rhythm discrimination in a vocal learning mammal. This finding supports the rhythm–vocal learning hypothesis and showcases pinnipeds as promising models for comparative research on rhythmic phylogenies.
Collapse
|
44
|
Forebrain nuclei linked to woodpecker territorial drum displays mirror those that enable vocal learning in songbirds. PLoS Biol 2022; 20:e3001751. [PMID: 36125990 PMCID: PMC9488818 DOI: 10.1371/journal.pbio.3001751] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/05/2022] [Accepted: 07/11/2022] [Indexed: 11/19/2022] Open
Abstract
Vocal learning is thought to have evolved in 3 orders of birds (songbirds, parrots, and hummingbirds), with each showing similar brain regions that have comparable gene expression specializations relative to the surrounding forebrain motor circuitry. Here, we searched for signatures of these same gene expression specializations in previously uncharacterized brains of 7 assumed vocal non-learning bird lineages across the early branches of the avian family tree. Our findings using a conserved marker for the song system found little evidence of specializations in these taxa, except for woodpeckers. Instead, woodpeckers possessed forebrain regions that were anatomically similar to the pallial song nuclei of vocal learning birds. Field studies of free-living downy woodpeckers revealed that these brain nuclei showed increased expression of immediate early genes (IEGs) when males produce their iconic drum displays, the elaborate bill-hammering behavior that individuals use to compete for territories, much like birdsong. However, these specialized areas did not show increased IEG expression with vocalization or flight. We further confirmed that other woodpecker species contain these brain nuclei, suggesting that these brain regions are a common feature of the woodpecker brain. We therefore hypothesize that ancient forebrain nuclei for refined motor control may have given rise to not only the song control systems of vocal learning birds, but also the drumming system of woodpeckers.
Collapse
|
45
|
The Biological Roots of Music and Dance : Extending the Credible Signaling Hypothesis to Predator Deterrence. HUMAN NATURE (HAWTHORNE, N.Y.) 2022; 33:261-279. [PMID: 35986877 DOI: 10.1007/s12110-022-09429-9] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Accepted: 07/28/2022] [Indexed: 12/14/2022]
Abstract
After they diverged from panins, hominins evolved an increasingly committed terrestrial lifestyle in open habitats that exposed them to increased predation pressure from Africa's formidable predator guild. In the Pleistocene, Homo transitioned to a more carnivorous lifestyle that would have further increased predation pressure. An effective defense against predators would have required a high degree of cooperation by the smaller and slower hominins. It is in the interest of predator and potential prey to avoid encounters that will be costly for both. A wide variety of species, including carnivores and apes and other primates, have therefore evolved visual and auditory signals that deter predators by credibly signaling detection and/or the ability to effectively defend themselves. In some cooperative species, these predator deterrent signals involve highly synchronized visual and auditory displays among group members. Hagen and Bryant (Human Nature, 14(1), 21-51, 2003) proposed that synchronized visual and auditory displays credibly signal coalition quality. Here, this hypothesis is extended to include credible signals to predators that they have been detected and would be met with a highly coordinated defensive response, thereby deterring an attack. Within-group signaling functions are also proposed. The evolved cognitive abilities underlying these behaviors were foundations for the evolution of fully human music and dance.
Collapse
|
46
|
Abstract
Through long-term training, music experts acquire complex and specialized sensorimotor skills, which are paralleled by continuous neuro-anatomical and -functional adaptations. The underlying neuroplasticity mechanisms have been extensively explored in decades of research in music, cognitive, and translational neuroscience. However, the absence of a comprehensive review and quantitative meta-analysis has prevented the plethora of variegated findings from ultimately converging into a unified picture of the neuroanatomy of musical expertise. Here, we performed a comprehensive neuroimaging meta-analysis of publications investigating neuro-anatomical and -functional differences between musicians (M) and non-musicians (NM). Eighty-four studies were included in the qualitative synthesis. From these, 58 publications were included in coordinate-based meta-analyses using the anatomic/activation likelihood estimation (ALE) method. This comprehensive approach delivers a coherent cortico-subcortical network encompassing sensorimotor and limbic regions bilaterally. Particularly, M exhibited higher volume/activity in auditory, sensorimotor, interoceptive, and limbic brain areas and lower volume/activity in parietal areas as opposed to NM. Notably, we reveal topographical (dis-)similarities between the identified functional and anatomical networks and characterize their link to various cognitive functions by means of meta-analytic connectivity modelling. Overall, we effectively synthesized decades of research in the field and provide a consistent and controversy-free picture of the neuroanatomy of musical expertise.
Collapse
|
47
|
Music rhythm tree based partitioning approach to decision tree classifier. JOURNAL OF KING SAUD UNIVERSITY - COMPUTER AND INFORMATION SCIENCES 2022. [DOI: 10.1016/j.jksuci.2020.03.015] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
|
48
|
Rhythm May Be Key to Linking Language and Cognition in Young Infants: Evidence From Machine Learning. Front Psychol 2022; 13:894405. [PMID: 35693512 PMCID: PMC9178268 DOI: 10.3389/fpsyg.2022.894405] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2022] [Accepted: 05/03/2022] [Indexed: 11/30/2022] Open
Abstract
Rhythm is key to language acquisition. Across languages, rhythmic features highlight fundamental linguistic elements of the sound stream and structural relations among them. A sensitivity to rhythmic features, which begins in utero, is evident at birth. What is less clear is whether rhythm supports infants' earliest links between language and cognition. Prior evidence has documented that for infants as young as 3 and 4 months, listening to their native language (English) supports the core cognitive capacity of object categorization. This precocious link is initially part of a broader template: listening to a non-native language from the same rhythmic class (e.g., German, but not Cantonese) and to vocalizations of non-human primates (e.g., lemur, Eulemur macaco flavifrons, but not birds e.g., zebra-finches, Taeniopygia guttata) provide English-acquiring infants the same cognitive advantage as does listening to their native language. Here, we implement a machine-learning (ML) approach to ask whether there are acoustic properties, available on the surface of these vocalizations, that permit infants to identify which vocalizations are candidate links to cognition. We provided the model with a robust sample of vocalizations that, from the vantage point of English-acquiring 4-month-olds, either support object categorization (English, German, lemur vocalizations) or fail to do so (Cantonese, zebra-finch vocalizations). We assess (a) whether supervised ML classification models can distinguish those vocalizations that support cognition from those that do not, and (b) which class(es) of acoustic features (including rhythmic, spectral envelope, and pitch features) best support that classification. Our analysis reveals that principal components derived from rhythm-relevant acoustic features were among the most robust in supporting the classification. Classifications performed using temporal envelope components were also robust. These new findings provide in-principle evidence that infants' earliest links between vocalizations and cognition may be subserved by their perceptual sensitivity to rhythmic and spectral elements available on the surface of these vocalizations, and that these may guide infants' identification of candidate links to cognition.
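The supervised-classification step can be illustrated with a deliberately simple stand-in: a nearest-centroid classifier evaluated by leave-one-out cross-validation on invented feature vectors. The feature values, class separation, and classifier below are hypothetical and do not reproduce the study's models or acoustic features:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical feature vectors for vocalization clips:
# class 1 = vocalizations that supported categorization in infants,
# class 0 = those that did not. Values are invented for illustration.
n, d = 40, 5
X = np.vstack([rng.normal(loc=0.8, size=(n, d)),
               rng.normal(loc=-0.8, size=(n, d))])
y = np.array([1] * n + [0] * n)

def nearest_centroid_loo(X, y):
    """Leave-one-out accuracy of a nearest-centroid classifier."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i            # hold out sample i
        c1 = X[mask & (y == 1)].mean(axis=0)
        c0 = X[mask & (y == 0)].mean(axis=0)
        pred = 1 if np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0) else 0
        correct += pred == y[i]
    return correct / len(y)

print(f"leave-one-out accuracy: {nearest_centroid_loo(X, y):.2f}")
```

Accuracy well above the 0.5 chance level indicates that the feature set carries class information, which is the logic behind asking which acoustic feature classes best support the classification.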
Collapse
|
49
|
Where words are powerless to express: Use of music in paediatric neurology. J Pediatr Rehabil Med 2022; 16:179-194. [PMID: 35599509 DOI: 10.3233/prm-200802] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/15/2022] Open
Abstract
Music is an art form that strongly affects people and can elicit many different emotions at the same time, including happiness, anxiety, sadness, and even ecstasy. What is it about music that causes such a strong reaction from each of us? Music engages many senses, which in turn can produce a multiplicity of responses and help create more extensive neuronal connections, as well as influence behaviour through structural and functional changes in the brain. Music-based interventions as a therapeutic tool in rehabilitation are becoming more common. The impact of music on the human body is generally said to be positive. However, what impact does music have on the young nervous system, especially the affected one? This review presents the advantages and disadvantages of the use of music in paediatric neurology to treat dyslexia, cerebral palsy, and stroke, among other conditions. Potential negative impacts such as musicogenic epilepsy and hallucinations will be discussed.
Collapse
|
50
|
Slow phase-locked modulations support selective attention to sound. Neuroimage 2022; 252:119024. [PMID: 35231629 PMCID: PMC9133470 DOI: 10.1016/j.neuroimage.2022.119024] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/12/2022] [Revised: 02/16/2022] [Accepted: 02/19/2022] [Indexed: 11/16/2022] Open
Abstract
To make sense of complex soundscapes, listeners must select and attend to task-relevant streams while ignoring uninformative sounds. One possible neural mechanism underlying this process is alignment of endogenous oscillations with the temporal structure of the target sound stream. Such a mechanism has been suggested to mediate attentional modulation of neural phase-locking to the rhythms of attended sounds. However, such modulations are compatible with an alternate framework, where attention acts as a filter that enhances exogenously-driven neural auditory responses. Here we attempted to test several predictions arising from the oscillatory account by playing two tone streams varying across conditions in tone duration and presentation rate; participants attended to one stream or listened passively. Attentional modulation of the evoked waveform was roughly sinusoidal and scaled with rate, while the passive response did not. However, there was only limited evidence for continuation of modulations through the silence between sequences. These results suggest that attentionally-driven changes in phase alignment reflect synchronization of slow endogenous activity with the temporal structure of attended stimuli.
Collapse
|