1
Murray CA, Shams L. Crossmodal interactions in human learning and memory. Front Hum Neurosci 2023; 17:1181760. PMID: 37266327; PMCID: PMC10229776; DOI: 10.3389/fnhum.2023.1181760.
Abstract
Most studies of memory and perceptual learning in humans have employed unisensory settings to simplify the study paradigm. However, in daily life we are often surrounded by complex and cluttered scenes made up of many objects and sources of sensory stimulation. Our experiences are, therefore, highly multisensory both when passively observing the world and when acting and navigating. We argue that human learning and memory systems have evolved to operate under these multisensory and dynamic conditions. The nervous system exploits the rich array of sensory inputs in this process: it is sensitive to the relationships between the sensory inputs, continuously updates sensory representations, and encodes memory traces based on the relationship between the senses. We review recent findings that demonstrate a range of human learning and memory phenomena in which interactions between the visual and auditory modalities play an important role, and suggest possible neural mechanisms that may underlie some of these surprising findings. We outline open questions as well as directions for future research to unravel human perceptual learning and memory.
Affiliation(s)
- Carolyn A. Murray
- Department of Psychology, University of California, Los Angeles, Los Angeles, CA, United States
- Ladan Shams
- Department of Psychology, University of California, Los Angeles, Los Angeles, CA, United States
- Department of Bioengineering, Neuroscience Interdepartmental Program, University of California, Los Angeles, Los Angeles, CA, United States
2
Chiou SC, Schack T. Working memory for movement rhythms given spatial relevance: Effects of sequence length and maintenance delay. Vis Cogn 2023. DOI: 10.1080/13506285.2022.2162173.
Affiliation(s)
- Shiau-Chuen Chiou
- Neurocognition and Action Research Group, Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany
- Thomas Schack
- Neurocognition and Action Research Group, Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Faculty of Psychology and Sports Science, Bielefeld University, Bielefeld, Germany
3
Tierney A, Gomez JC, Fedele O, Kirkham NZ. Reading ability in children relates to rhythm perception across modalities. J Exp Child Psychol 2021; 210:105196. PMID: 34090237; DOI: 10.1016/j.jecp.2021.105196.
Abstract
The onset of reading ability is rife with individual differences, with some children termed "early readers" and some falling behind from the very beginning. Reading skill in children has been linked to an ability to remember nonverbal rhythms, specifically in the auditory modality. It has been hypothesized that the link between rhythm skills and reading reflects a shared reliance on the ability to extract temporal structure from sound. Here we tested this hypothesis by investigating whether the link between rhythm memory and reading depends on the modality in which rhythms are presented. We tested 75 primary school children aged 7-11 years on a within-participants battery of reading and rhythm tasks. Participants received a reading efficiency task followed by three rhythm tasks (auditory, visual, and audiovisual). Results showed that children who performed poorly on the reading task also performed poorly on the tasks that required them to remember and repeat back nonverbal rhythms. In addition, these children showed a rhythmic deficit not just in the auditory domain but also in the visual domain. However, auditory rhythm memory explained additional variance in reading ability even after controlling for visual memory. These results suggest that reading ability and rhythm memory rely both on shared modality-general cognitive processes and on the ability to perceive the temporal structure of sound.
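The key inference in this abstract is incremental variance: auditory rhythm memory still predicts reading after visual rhythm memory is controlled for. That hierarchical-regression logic can be sketched on synthetic data; all scores and coefficients below are invented for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 75                                         # sample size matching the study
visual = rng.normal(size=n)                    # hypothetical visual rhythm memory score
auditory = 0.6 * visual + rng.normal(size=n)   # auditory score, correlated with visual
reading = 0.5 * visual + 0.5 * auditory + rng.normal(size=n)

def r_squared(predictors, outcome):
    """Proportion of variance explained by an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(outcome))] + predictors)
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    residuals = outcome - X @ beta
    return 1 - residuals.var() / outcome.var()

r2_step1 = r_squared([visual], reading)            # step 1: visual memory only
r2_step2 = r_squared([visual, auditory], reading)  # step 2: add auditory memory
# auditory memory explains additional variance beyond visual memory
assert r2_step2 > r2_step1
```

The interesting quantity is the gap `r2_step2 - r2_step1`: if it were near zero, a single modality-general factor would suffice to explain the reading link.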
Affiliation(s)
- Adam Tierney
- Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
- Jessica Cardona Gomez
- Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
- Oliver Fedele
- Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
- Natasha Z Kirkham
- Department of Psychological Sciences, Birkbeck, University of London, London WC1E 7HX, UK
4
Planton S, van Kerkoerle T, Abbih L, Maheu M, Meyniel F, Sigman M, Wang L, Figueira S, Romano S, Dehaene S. A theory of memory for binary sequences: Evidence for a mental compression algorithm in humans. PLoS Comput Biol 2021; 17:e1008598. PMID: 33465081; PMCID: PMC7845997; DOI: 10.1371/journal.pcbi.1008598.
Abstract
Working memory capacity can be improved by recoding the memorized information in a condensed form. Here, we tested the theory that human adults encode binary sequences of stimuli in memory using an abstract internal language and a recursive compression algorithm. The theory predicts that the psychological complexity of a given sequence should be proportional to the length of its shortest description in the proposed language, which can capture any nested pattern of repetitions and alternations using a limited number of instructions. Five experiments examined the capacity of the theory to predict human adults' memory for a variety of auditory and visual sequences. We probed memory using a sequence violation paradigm in which participants attempted to detect occasional violations in an otherwise fixed sequence. Both subjective complexity ratings and objective violation detection performance were well predicted by our theoretical measure of complexity, which simply reflects a weighted sum of the number of elementary instructions and digits in the shortest formula that captures the sequence in our language. A simpler transition-probability model, tested as a single predictor in the statistical analyses, accounted for significant variance in the data; however, goodness-of-fit improved significantly when the language-based complexity measure was added to the statistical model, and the variance explained by the transition-probability model largely decreased. Model comparison also showed that shortest description length in a recursive language provides a better fit than six alternative previously proposed models of sequence encoding. The data support the hypothesis that, beyond the extraction of statistical knowledge, human sequence coding relies on an internal compression using language-like nested structures.
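The paper's actual language of nested repetitions and alternations is far richer than anything shown here, but the core idea, that psychological complexity tracks the length of a sequence's shortest description, can be illustrated with a toy run-length encoder. The function and the example sequences below are invented for illustration, not the authors' model:

```python
def toy_complexity(seq: str) -> int:
    """Toy description length for a binary string: run-length encode it
    (e.g. 'AAAABBBB' -> [('A', 4), ('B', 4)]) and count the symbols needed
    to write the encoding down (one symbol plus one count per run).
    Structured sequences compress to shorter descriptions."""
    runs = []
    for s in seq:
        if runs and runs[-1][0] == s:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([s, 1])       # start a new run
    return 2 * len(runs)              # symbol + repeat count per run

regular = "AAAABBBB"    # two runs  -> short description, low complexity
irregular = "AABABBBA"  # five runs -> longer description, high complexity
assert toy_complexity(regular) < toy_complexity(irregular)
```

On the theory's account, the second sequence should be harder to remember and its violations harder to detect, because its shortest description is longer, which is the pattern the five experiments report.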
Affiliation(s)
- Samuel Planton
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin center, Gif-sur-Yvette, France
- Timo van Kerkoerle
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin center, Gif-sur-Yvette, France
- Leïla Abbih
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin center, Gif-sur-Yvette, France
- Maxime Maheu
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin center, Gif-sur-Yvette, France
- Université de Paris, Paris, France
- Florent Meyniel
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin center, Gif-sur-Yvette, France
- Mariano Sigman
- Laboratorio de Neurociencia, Universidad Torcuato Di Tella, Buenos Aires, Argentina
- CONICET (Consejo Nacional de Investigaciones Científicas y Técnicas), Buenos Aires, Argentina
- Facultad de Lenguas y Educación, Universidad Nebrija, Madrid, Spain
- Liping Wang
- Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
- Santiago Figueira
- CONICET (Consejo Nacional de Investigaciones Científicas y Técnicas), Buenos Aires, Argentina
- Universidad de Buenos Aires, Facultad de Ciencias Exactas y Naturales, Departamento de Computación, Buenos Aires, Argentina
- Sergio Romano
- CONICET (Consejo Nacional de Investigaciones Científicas y Técnicas), Buenos Aires, Argentina
- Universidad de Buenos Aires, Facultad de Ciencias Exactas y Naturales, Departamento de Computación, Buenos Aires, Argentina
- Stanislas Dehaene
- Cognitive Neuroimaging Unit, CEA, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin center, Gif-sur-Yvette, France
- Collège de France, Paris, France
5
Temporal sequence discrimination within and across senses: do we really hear what we see? Exp Brain Res 2019; 237:3089-3098. PMID: 31541284; DOI: 10.1007/s00221-019-05654-4.
Abstract
Previous evidence suggests that people "hear" visual stimuli when encoding temporal information. This suggestion is based on the observation that auditory distractor information can strongly affect discrimination performance for visual temporal sequences. The present study aimed to replicate and extend this finding by investigating sequence discrimination within and across the two modalities. In two experimental series, participants judged whether two subsequently presented temporal sequences, a standard sequence followed by a comparison sequence, were identical or not. In Experimental Series A, irrelevant distractor information was presented simultaneously with the standard sequence. In Series B, the distraction appeared in the retention interval between the standard sequence and the comparison sequence. The results showed that auditory distraction impaired performance irrespective of whether the target sequences were auditory or visual, whereas visual distraction only impaired the discrimination of visual target sequences. Furthermore, auditory distraction was always at least as effective as visual distraction, irrespective of standard modality. Generally, discrimination performance was much better for auditory than for visual sequences. Overall, the present results are consistent with the idea that people code visual temporal information in the auditory modality. Moreover, the present study also suggests that such cross-modal interference effects should be interpreted cautiously with respect to their underlying timing mechanism because of the basic differences in temporal sensitivity between the two modalities.
6
Fassnidge C, Ball D, Kazaz Z, Knudsen S, Spicer A, Tipple A, Freeman E. Hearing through Your Eyes: Neural Basis of Audiovisual Cross-activation, Revealed by Transcranial Alternating Current Stimulation. J Cogn Neurosci 2019; 31:922-935. PMID: 30883286; DOI: 10.1162/jocn_a_01395.
Abstract
Some people experience auditory sensations when seeing visual flashes or movements. This prevalent synaesthesia-like visually evoked auditory response (vEAR) could result either from overexuberant cross-activation between brain areas and/or reduced inhibition of normally occurring cross-activation. We have used transcranial alternating current stimulation (tACS) to test these theories. We applied tACS at 10 Hz (alpha band frequency) or 40 Hz (gamma band), bilaterally either to temporal or occipital sites, while measuring same/different discrimination of paired auditory (A) versus visual (V) Morse code sequences. At debriefing, participants were classified as vEAR or non-vEAR, depending on whether they reported "hearing" the silent flashes. In non-vEAR participants, temporal 10-Hz tACS caused impairment of A performance, which correlated with improved V; conversely under occipital tACS, poorer V performance correlated with improved A. This reciprocal pattern suggests that sensory cortices are normally mutually inhibitory and that alpha-frequency tACS may bias the balance of competition between them. vEAR participants showed no tACS effects, consistent with reduced inhibition, or enhanced cooperation between modalities. In addition, temporal 40-Hz tACS impaired V performance, specifically in individuals who showed a performance advantage for V (relative to A). Gamma-frequency tACS may therefore modulate the ability of these individuals to benefit from recoding flashes into the auditory modality, possibly by disrupting cross-activation of auditory areas by visual stimulation. Our results support both theories, suggesting that vEAR may depend on disinhibition of normally occurring sensory cross-activation, which may be expressed more strongly in some individuals. Furthermore, endogenous alpha- and gamma-frequency oscillations may function respectively to inhibit or promote this cross-activation.
Affiliation(s)
- Danny Ball
- City, University of London; University College London
7
Boyle SC, Kayser SJ, Kayser C. Neural correlates of multisensory reliability and perceptual weights emerge at early latencies during audio-visual integration. Eur J Neurosci 2017; 46:2565-2577. PMID: 28940728; PMCID: PMC5725738; DOI: 10.1111/ejn.13724.
Abstract
To make accurate perceptual estimates, observers must take the reliability of sensory information into account. Despite many behavioural studies showing that subjects weight individual sensory cues in proportion to their reliabilities, it is still unclear when during a trial neuronal responses are modulated by the reliability of sensory information or when they reflect the perceptual weights attributed to each sensory input. We investigated these questions using a combination of psychophysics, EEG‐based neuroimaging and single‐trial decoding. Our results show that the weighted integration of sensory information in the brain is a dynamic process; effects of sensory reliability on task‐relevant EEG components were evident 84 ms after stimulus onset, while neural correlates of perceptual weights emerged 120 ms after stimulus onset. These neural processes had different underlying sources, arising from sensory and parietal regions, respectively. Together these results reveal the temporal dynamics of perceptual and neural audio‐visual integration and support the notion of temporally early and functionally specific multisensory processes in the brain.
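The behavioural finding this study builds on, that observers weight each cue in proportion to its reliability, is the standard maximum-likelihood cue-combination rule. A minimal sketch of that rule follows; the numeric values are arbitrary examples, not data from the study:

```python
import numpy as np

def fuse(mu_a, sigma_a, mu_v, sigma_v):
    """Maximum-likelihood fusion of two Gaussian cue estimates:
    each cue is weighted by its reliability (inverse variance)."""
    r_a, r_v = 1 / sigma_a**2, 1 / sigma_v**2
    w_a = r_a / (r_a + r_v)              # perceptual weight of the auditory cue
    mu = w_a * mu_a + (1 - w_a) * mu_v   # fused estimate
    sigma = np.sqrt(1 / (r_a + r_v))     # fused estimate beats either cue alone
    return mu, sigma

# a noisy auditory cue (sigma = 2) and a reliable visual cue (sigma = 1):
mu, sigma = fuse(mu_a=10.0, sigma_a=2.0, mu_v=14.0, sigma_v=1.0)
# the fused estimate is pulled toward the more reliable visual cue
assert abs(mu - 13.2) < 1e-9 and sigma < 1.0
```

The EEG question in the abstract is *when* the neural signal starts reflecting the reliabilities (`r_a`, `r_v`) versus the weights (`w_a`) in this computation.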
Affiliation(s)
- Stephanie C Boyle
- Institute of Neuroscience and Psychology, University of Glasgow, Hillhead Street 58, Glasgow, G12 8QB, UK
- Stephanie J Kayser
- Institute of Neuroscience and Psychology, University of Glasgow, Hillhead Street 58, Glasgow, G12 8QB, UK
- Christoph Kayser
- Institute of Neuroscience and Psychology, University of Glasgow, Hillhead Street 58, Glasgow, G12 8QB, UK
8
Murgia M, Prpic V, O J, McCullagh P, Santoro I, Galmonte A, Agostini T. Modality and Perceptual-Motor Experience Influence the Detection of Temporal Deviations in Tap Dance Sequences. Front Psychol 2017; 8:1340. PMID: 28824516; PMCID: PMC5539223; DOI: 10.3389/fpsyg.2017.01340.
Abstract
Accurate temporal information processing is critically important in many motor activities within disciplines such as dance, music, and sport. However, it is still unclear how temporal information related to biological motion is processed by expert and non-expert performers. It is well-known that the auditory modality dominates the visual modality in processing temporal information of simple stimuli, and that experts outperform non-experts in biological motion perception. In the present study, we combined these two areas of research; we investigated how experts and non-experts detected temporal deviations in tap dance sequences, in the auditory modality compared to the visual modality. We found that temporal deviations were better detected in the auditory modality compared to the visual modality, and by experts compared to non-experts. However, post hoc analyses indicated that these effects were mainly due to performances obtained by experts in the auditory modality. The results suggest that the experience advantage is not equally distributed across the modalities, and that tap dance experience enhances the effectiveness of the auditory modality but not the visual modality when processing temporal information. The present results and their potential implications are discussed in both temporal information processing and biological motion perception frameworks.
Affiliation(s)
- Mauro Murgia
- Department of Life Sciences, University of Trieste, Trieste, Italy
- Valter Prpic
- Division of Psychology, De Montfort University, Leicester, United Kingdom
- Jenny O
- Department of Kinesiology, California State University, East Bay, Hayward, CA, United States
- Penny McCullagh
- Department of Kinesiology, California State University, East Bay, Hayward, CA, United States
- Ilaria Santoro
- Department of Life Sciences, University of Trieste, Trieste, Italy
- Alessandra Galmonte
- Department of Medical, Surgical and Health Sciences, University of Trieste, Trieste, Italy
- Tiziano Agostini
- Department of Life Sciences, University of Trieste, Trieste, Italy
9
Fassnidge C, Cecconi Marcotti C, Freeman E. A deafening flash! Visual interference of auditory signal detection. Conscious Cogn 2017; 49:15-24. PMID: 28092861; DOI: 10.1016/j.concog.2016.12.009.
Abstract
In some people, visual stimulation evokes auditory sensations. How prevalent and how perceptually real is this? 22% of our neurotypical adult participants responded 'Yes' when asked whether they heard faint sounds accompanying flash stimuli, and showed significantly better ability to discriminate visual 'Morse-code' sequences. This benefit might arise from an ability to recode visual signals as sounds, thus taking advantage of superior temporal acuity of audition. In support of this, those who showed better visual relative to auditory sequence discrimination also had poorer auditory detection in the presence of uninformative visual flashes, though this was independent of awareness of visually-evoked sounds. Thus a visually-evoked auditory representation may occur subliminally and disrupt detection of real auditory signals. The frequent natural correlation between visual and auditory stimuli might explain the surprising prevalence of this phenomenon. Overall, our results suggest that learned correspondences between strongly correlated modalities may provide a precursor for some synaesthetic abilities.
Affiliation(s)
- Christopher Fassnidge
- Cognitive Neuroscience Research Unit, Department of Psychology, City, University of London, London, UK
- Elliot Freeman
- Cognitive Neuroscience Research Unit, Department of Psychology, City, University of London, London, UK
10
Abstract
The principles that guide large-scale cortical reorganization remain unclear. In the blind, several visual regions preserve their task specificity; ventral visual areas, for example, become engaged in auditory and tactile object-recognition tasks. It remains open whether task-specific reorganization is unique to the visual cortex or, alternatively, whether this kind of plasticity is a general principle applying to other cortical areas. Auditory areas can become recruited for visual and tactile input in the deaf. Although nonhuman data suggest that this reorganization might be task specific, human evidence has been lacking. Here we enrolled 15 deaf and 15 hearing adults into a functional MRI experiment during which they discriminated between temporally complex sequences of stimuli (rhythms). Both deaf and hearing subjects performed the task visually, in the central visual field. In addition, hearing subjects performed the same task in the auditory modality. We found that the visual task robustly activated the auditory cortex in deaf subjects, peaking in the posterior-lateral part of high-level auditory areas. This activation pattern was strikingly similar to the pattern found in hearing subjects performing the auditory version of the task. Although performing the visual task in deaf subjects induced an increase in functional connectivity between the auditory cortex and the dorsal visual cortex, no such effect was found in hearing subjects. We conclude that in deaf humans the high-level auditory cortex switches its input modality from sound to vision but preserves its task-specific activation pattern independent of input modality. Task-specific reorganization thus might be a general principle that guides cortical plasticity in the brain.
11
Michalka SW, Kong L, Rosen ML, Shinn-Cunningham BG, Somers DC. Short-Term Memory for Space and Time Flexibly Recruit Complementary Sensory-Biased Frontal Lobe Attention Networks. Neuron 2015; 87:882-92. PMID: 26291168; DOI: 10.1016/j.neuron.2015.07.028.
Abstract
The frontal lobes control wide-ranging cognitive functions; however, functional subdivisions of human frontal cortex are only coarsely mapped. Here, functional magnetic resonance imaging reveals two distinct visual-biased attention regions in lateral frontal cortex, superior precentral sulcus (sPCS) and inferior precentral sulcus (iPCS), anatomically interdigitated with two auditory-biased attention regions, transverse gyrus intersecting precentral sulcus (tgPCS) and caudal inferior frontal sulcus (cIFS). Intrinsic functional connectivity analysis demonstrates that sPCS and iPCS fall within a broad visual-attention network, while tgPCS and cIFS fall within a broad auditory-attention network. Interestingly, we observe that spatial and temporal short-term memory (STM), respectively, recruit visual and auditory attention networks in the frontal lobe, independent of sensory modality. These findings not only demonstrate that both sensory modality and information domain influence frontal lobe functional organization, they also demonstrate that spatial processing co-localizes with visual processing and that temporal processing co-localizes with auditory processing in lateral frontal cortex.
Affiliation(s)
- Samantha W Michalka
- Center for Computational Neuroscience and Neural Technology, Boston University, Boston, MA 02215, USA; Graduate Program for Neuroscience, Boston University, Boston, MA 02215, USA
- Lingqiang Kong
- Center for Computational Neuroscience and Neural Technology, Boston University, Boston, MA 02215, USA; Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA
- Maya L Rosen
- Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA
- Barbara G Shinn-Cunningham
- Center for Computational Neuroscience and Neural Technology, Boston University, Boston, MA 02215, USA; Biomedical Engineering, Boston University, Boston, MA 02215, USA
- David C Somers
- Center for Computational Neuroscience and Neural Technology, Boston University, Boston, MA 02215, USA; Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA; Graduate Program for Neuroscience, Boston University, Boston, MA 02215, USA
12
Barakat B, Seitz AR, Shams L. Visual rhythm perception improves through auditory but not visual training. Curr Biol 2015; 25:R60-R61. PMID: 25602302; DOI: 10.1016/j.cub.2014.12.011.
Affiliation(s)
- Brandon Barakat
- Department of Psychology, University of California, Los Angeles, Los Angeles, CA 90095, USA
- Aaron R Seitz
- Department of Psychology, University of California, Riverside, Riverside, CA 92521, USA
- Ladan Shams
- Department of Psychology, University of California, Los Angeles, Los Angeles, CA 90095, USA
13
Auditory feedback in error-based learning of motor regularity. Brain Res 2015; 1606:54-67. DOI: 10.1016/j.brainres.2015.02.026.
14
Nozaradan S. Exploring how musical rhythm entrains brain activity with electroencephalogram frequency-tagging. Philos Trans R Soc Lond B Biol Sci 2014; 369:20130393. PMID: 25385771; PMCID: PMC4240960; DOI: 10.1098/rstb.2013.0393.
Abstract
The ability to perceive a regular beat in music and synchronize to this beat is a widespread human skill. Fundamental to musical behaviour, beat and meter refer to the perception of periodicities while listening to musical rhythms and often involve spontaneous entrainment to move on these periodicities. Here, we present a novel experimental approach inspired by the frequency-tagging approach to understand the perception and production of rhythmic inputs. This approach is illustrated here by recording the human electroencephalogram responses at beat and meter frequencies elicited in various contexts: mental imagery of meter, spontaneous induction of a beat from rhythmic patterns, multisensory integration and sensorimotor synchronization. Collectively, our observations support the view that entrainment and resonance phenomena subtend the processing of musical rhythms in the human brain. More generally, they highlight the potential of this approach to help us understand the link between the phenomenology of musical beat and meter and the bias towards periodicities arising under certain circumstances in the nervous system. Entrainment to music provides a highly valuable framework to explore general entrainment mechanisms as embodied in the human brain.
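In practice, frequency-tagging amounts to reading the amplitude spectrum of the EEG at the beat- and meter-related frequencies and comparing it with the surrounding noise floor. The simulation below illustrates only the analysis principle; the sampling rate, beat frequency, and signal amplitudes are invented, and a real analysis would use recorded EEG and the stimulus's actual beat frequency:

```python
import numpy as np

fs, dur = 250.0, 40.0                   # sampling rate (Hz) and duration (s)
t = np.arange(0, dur, 1 / fs)
beat_f = 2.4                            # hypothetical beat frequency (Hz)

# synthetic "EEG": an oscillation entrained at the beat frequency plus noise
rng = np.random.default_rng(0)
eeg = 2.0 * np.sin(2 * np.pi * beat_f * t) + rng.normal(0.0, 1.0, t.size)

# single-sided amplitude spectrum
spectrum = np.abs(np.fft.rfft(eeg)) / t.size * 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# amplitude in the bin closest to the tagged frequency vs. the noise floor
amp_at_beat = spectrum[np.argmin(np.abs(freqs - beat_f))]
noise_floor = np.median(spectrum)
assert amp_at_beat > 10 * noise_floor   # a clear peak at the tagged frequency
```

The long duration matters: 40 s at 250 Hz gives a frequency resolution of 0.025 Hz, so the beat frequency falls exactly on a spectral bin and the entrained response concentrates in that single bin.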
Affiliation(s)
- Sylvie Nozaradan
- Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), 53 Avenue Mounier, Bruxelles 1200, Belgium
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada H3C 3J7
15
Fast transfer of crossmodal time interval training. Exp Brain Res 2014; 232:1855-64. PMID: 24570386; DOI: 10.1007/s00221-014-3877-1.
Abstract
Sub-second time perception is essential for many important sensory and perceptual tasks including speech perception, motion perception, motor coordination, and crossmodal interaction. This study investigates to what extent the ability to discriminate sub-second time intervals acquired in one sensory modality can be transferred to another modality. To this end, we used perceptual classification of visual Ternus display (Ternus in Psychol Forsch 7:81-136, 1926) to implicitly measure participants' interval perception in pre- and posttests and implemented an intra- or crossmodal sub-second interval discrimination training protocol in between the tests. The Ternus display elicited either an "element motion" or a "group motion" percept, depending on the inter-stimulus interval between the two visual frames. The training protocol required participants to explicitly compare the interval length between a pair of visual, auditory, or tactile stimuli with a standard interval or to implicitly perceive the length of visual, auditory, or tactile intervals by completing a non-temporal task (discrimination of auditory pitch or tactile intensity). Results showed that after fast explicit training of interval discrimination (about 15 min), participants improved their ability to categorize the visual apparent motion in Ternus displays, although the training benefits were mild for visual timing training. However, the benefits were absent for implicit interval training protocols. This finding suggests that the timing ability in one modality can be rapidly acquired and used to improve timing-related performance in another modality and that there may exist a central clock for sub-second temporal processing, although modality-specific perceptual properties may constrain the functioning of this clock.
17
Zhang H, Chen L, Zhou X. Adaptation to visual or auditory time intervals modulates the perception of visual apparent motion. Front Integr Neurosci 2012; 6:100. PMID: 23133408; PMCID: PMC3488759; DOI: 10.3389/fnint.2012.00100.
Abstract
It is debated whether sub-second timing is subserved by a centralized mechanism or by the intrinsic properties of task-related neural activity in specific modalities (Ivry and Schlerf, 2008). By using a temporal adaptation task, we investigated whether adapting to different time intervals conveyed through stimuli in different modalities (i.e., frames of a visual Ternus display, visual blinking discs, or auditory beeps) would affect the subsequent implicit perception of visual timing, i.e., inter-stimulus interval (ISI) between two frames in a Ternus display. The Ternus display can induce two percepts of apparent motion (AM), depending on the ISI between the two frames: "element motion" for short ISIs, in which the endmost disc is seen as moving back and forth while the middle disc at the overlapping or central position remains stationary; "group motion" for longer ISIs, in which both discs appear to move in a manner of lateral displacement as a whole. In Experiment 1, participants adapted to either the typical "element motion" (ISI = 50 ms) or the typical "group motion" (ISI = 200 ms). In Experiments 2 and 3, participants adapted to a time interval of 50 or 200 ms through observing a series of two paired blinking discs at the center of the screen (Experiment 2) or hearing a sequence of two paired beeps (with pitch 1000 Hz). In Experiment 4, participants adapted to sequences of paired beeps with either low pitches (500 Hz) or high pitches (5000 Hz). After adaptation in each trial, participants were presented with a Ternus probe in which the ISI between the two frames was equal to the transitional threshold of the two types of motions, as determined by a pretest. Results showed that adapting to the short time interval in all the situations led to more reports of "group motion" in the subsequent Ternus probes; adapting to the long time interval, however, caused no aftereffect for visual adaptation but significantly more reports of group motion for auditory adaptation. 
These findings, which suggest an amodal representation of sub-second timing across modalities, are interpreted within the framework of the temporal pacemaker model.
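The pacemaker-accumulator framework invoked here can be illustrated with a minimal simulation. This is an illustrative sketch only, not the authors' model; the function name, the pacemaker rate, and the noise model are all invented assumptions:

```python
import random

def perceived_duration_ms(true_ms, pacemaker_hz=100.0, rng=random):
    """Pacemaker-accumulator sketch: pulses accumulate during the interval;
    the count, scaled by the assumed pacemaker rate, yields perceived time."""
    expected_pulses = pacemaker_hz * true_ms / 1000.0
    # Poisson-like counting noise (normal approximation), floored at zero
    pulses = max(0.0, rng.gauss(expected_pulses, expected_pulses ** 0.5))
    return 1000.0 * pulses / pacemaker_hz

# Adaptation could be modeled as a change in the effective pacemaker rate:
# a faster rate after short-interval adaptation inflates the perceived ISI,
# which would push a threshold-level Ternus probe toward "group motion".
```

Averaged over many trials the estimate recovers the true duration, while single-trial variability grows with the interval, in line with the scalar property such pacemaker accounts predict.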
Affiliation(s)
- Huihui Zhang
- Department of Psychology, Center for Brain and Cognitive Sciences, Peking University, Beijing, China
- Lihan Chen
- Department of Psychology, Center for Brain and Cognitive Sciences, Peking University, Beijing, China
- Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, China
- Xiaolin Zhou
- Department of Psychology, Center for Brain and Cognitive Sciences, Peking University, Beijing, China
- Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, China
18
See what I hear? Beat perception in auditory and visual rhythms. Exp Brain Res 2012; 220:51-61. [PMID: 22623092 DOI: 10.1007/s00221-012-3114-8] [Citation(s) in RCA: 74] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/09/2011] [Accepted: 04/27/2012] [Indexed: 10/28/2022]
Abstract
Our perception of time is affected by the modality in which it is conveyed. Moreover, certain temporal phenomena appear to exist in only one modality. The perception of temporal regularity or structure (e.g., the 'beat') in rhythmic patterns is one such phenomenon: visual beat perception is rare. The modality-specificity for beat perception is puzzling, as the durations that comprise rhythmic patterns are much longer than the limits of visual temporal resolution. Moreover, the optimization that beat perception provides for memory of auditory sequences should be equally relevant to visual sequences. Why does beat perception appear to be modality specific? One possibility is that the nature of the visual stimulus plays a role. Previous studies have usually used brief stimuli (e.g., light flashes) to present visual rhythms. In the current study, a rotating line that appeared sequentially in different spatial orientations was used to present a visual rhythm. Discrimination accuracy for visual rhythms and auditory rhythms was compared for different types of rhythms. The rhythms either had a regular temporal structure that previously has been shown to induce beat perception in the auditory modality, or they had an irregular temporal structure without beat-inducing qualities. Overall, the visual rhythms were discriminated more poorly than the auditory rhythms. The beat-based structure, however, increased accuracy for visual as well as auditory rhythms. These results indicate that beat perception can occur in the visual modality and improve performance on a temporal discrimination task, when certain types of stimuli are used.
19
Marchant JL, Driver J. Visual and audiovisual effects of isochronous timing on visual perception and brain activity. Cereb Cortex 2012; 23:1290-8. [PMID: 22508766 PMCID: PMC3643713 DOI: 10.1093/cercor/bhs095] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022] Open
Abstract
Understanding how the brain extracts and combines temporal structure (rhythm) information from events presented to different senses remains unresolved. Many neuroimaging beat perception studies have focused on the auditory domain and show that the presence of a highly regular beat (isochrony) in “auditory” stimulus streams enhances neural responses in a distributed brain network and affects perceptual performance. Here, we acquired functional magnetic resonance imaging (fMRI) measurements of brain activity while healthy human participants performed a visual task on isochronous versus randomly timed “visual” streams, with or without concurrent task-irrelevant sounds. We found that visual detection of higher intensity oddball targets was better for isochronous than for randomly timed streams, extending previous auditory findings to vision. The impact of isochrony on visual target sensitivity correlated positively with fMRI signal changes not only in visual cortex but also in auditory sensory cortex during audiovisual presentations. Visual isochrony activated a timing-related brain network similar to that previously found primarily in auditory beat perception work. Finally, activity in the multisensory left posterior superior temporal sulcus increased specifically during concurrent isochronous audiovisual presentations. These results indicate that regular isochronous timing can modulate visual processing and that this can also involve multisensory audiovisual brain mechanisms.
Affiliation(s)
- Jennifer L Marchant
- Wellcome Trust Centre for Neuroimaging, UCL Institute of Neurology, London WC1N 3BG, UK.
20
Murgia M, Hohmann T, Galmonte A, Raab M, Agostini T. Recognising One's Own Motor Actions through Sound: The Role of Temporal Factors. Perception 2012; 41:976-87. [DOI: 10.1068/p7227] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/27/2022]
Abstract
It has been shown that humans are able to recognise their own movement. While visual cues have been amply studied, the contribution of auditory cues is not clear. Our aim was to investigate the role of temporal auditory cues in the identification of one's own or others' performance of a complex movement, the golf swing. We investigated whether golfers can discriminate the sounds of their own swings from those of other golfers' swings on the basis of the relative timing and the overall duration of the movement. The sounds produced by the participants performing 65 m shots were recorded and used to create the stimuli. The experimental conditions paired participants' own swing sounds with the sounds of other golfers that matched in both relative timing and overall duration, matched in relative timing but differed in overall duration, differed in relative timing but matched in overall duration, or differed in both relative timing and overall duration. The task of the participants was to say whether each sound corresponded to their own swing. Results show that golfers are able to recognise their own movements, but they also accept as their own the sounds produced by other athletes whose swings match theirs in both relative timing and overall duration.
Affiliation(s)
- Mauro Murgia
- Department of Life Sciences, University of Trieste, via Sant' Anastasio 12, I 34134 Trieste, Italy
- Tanja Hohmann
- Institute of Psychology, German Sport University Cologne, Cologne, Germany
- Alessandra Galmonte
- Department of Neuropsychological, Morphological and Movement Sciences, University of Verona, Verona, Italy
- Markus Raab
- Institute of Psychology, German Sport University Cologne, Cologne, Germany
- Tiziano Agostini
- Department of Life Sciences, University of Trieste, via Sant' Anastasio 12, I 34134 Trieste, Italy
21
Nozaradan S, Peretz I, Mouraux A. Steady-state evoked potentials as an index of multisensory temporal binding. Neuroimage 2011; 60:21-8. [PMID: 22155324 DOI: 10.1016/j.neuroimage.2011.11.065] [Citation(s) in RCA: 57] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/06/2011] [Revised: 11/10/2011] [Accepted: 11/22/2011] [Indexed: 11/28/2022] Open
Abstract
Temporal congruency promotes perceptual binding of multisensory inputs. Here, we used EEG frequency-tagging to track cortical activities elicited by auditory and visual inputs separately, in the form of steady-state evoked potentials (SS-EPs). We tested whether SS-EPs could reveal a dynamic coupling of cortical activities related to the binding of auditory and visual inputs conveying synchronous vs. non-synchronous temporal periodicities, or beats. The temporally congruent audiovisual condition elicited markedly enhanced auditory and visual SS-EPs, as compared to the incongruent condition. Furthermore, an increased inter-trial phase coherence of both SS-EPs was observed in that condition. Taken together, these observations indicate that temporal congruency enhances the processing of multisensory inputs at sensory-specific stages of cortical processing, possibly through a dynamic binding by synchrony of the elicited activities and/or improved dynamic attending. Moreover, we show that EEG frequency-tagging with SS-EPs constitutes an effective tool to explore the neural dynamics of multisensory integration in the human brain.
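The EEG frequency-tagging method used here rests on a simple readout: each sensory stream is modulated at a fixed rate, and the cortical response it drives is measured as the spectral amplitude at exactly that rate. A minimal sketch of that readout follows; the single-coefficient approach is standard discrete Fourier analysis, but the function name and the sampling and tag frequencies in the usage are illustrative assumptions, not values from the study:

```python
import cmath
import math

def ssep_amplitude(signal, fs_hz, tag_hz):
    """Amplitude of the steady-state response at the tagged frequency,
    computed as a single discrete Fourier coefficient of the recording."""
    n = len(signal)
    coef = sum(x * cmath.exp(-2j * math.pi * tag_hz * k / fs_hz)
               for k, x in enumerate(signal))
    return 2.0 * abs(coef) / n  # peak amplitude of a sinusoid at tag_hz
```

In practice the epoch length is chosen to contain an integer number of cycles of each tag frequency, so that the tagged responses fall exactly on frequency bins and do not leak into one another.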
Affiliation(s)
- Sylvie Nozaradan
- Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Belgium
22
Abstract
Many studies have demonstrated that infants exhibit robust auditory rhythm discrimination, but research on infants' perception of visual rhythm is limited. In particular, the role of motion in infants' perception of visual rhythm remains unknown, despite the prevalence of motion cues in naturally occurring visual rhythms. In the present study, we examined the role of motion in 7-month-old infants' discrimination of visual rhythms by comparing experimental conditions with apparent motion in the stimuli versus stationary rhythmic stimuli. Infants succeeded at discriminating visual rhythms only when the visual rhythm occurred with an apparent motion component. These results support the view that motion plays a role in infants' perception of visual temporal information, consistent with the manner in which natural rhythms appear in the visual world.
23
Grahn JA, Henry MJ, McAuley JD. FMRI investigation of cross-modal interactions in beat perception: audition primes vision, but not vice versa. Neuroimage 2011; 54:1231-43. [PMID: 20858544 PMCID: PMC3002396 DOI: 10.1016/j.neuroimage.2010.09.033] [Citation(s) in RCA: 100] [Impact Index Per Article: 7.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2010] [Revised: 09/13/2010] [Accepted: 09/14/2010] [Indexed: 11/24/2022] Open
Abstract
How we measure time and integrate temporal cues from different sensory modalities are fundamental questions in neuroscience. Sensitivity to a "beat" (such as that routinely perceived in music) differs substantially between auditory and visual modalities. Here we examined beat sensitivity in each modality, and examined cross-modal influences, using functional magnetic resonance imaging (fMRI) to characterize brain activity during perception of auditory and visual rhythms. In separate fMRI sessions, participants listened to auditory sequences or watched visual sequences. The order of auditory and visual sequence presentation was counterbalanced so that cross-modal order effects could be investigated. Participants judged whether sequences were speeding up or slowing down, and the pattern of tempo judgments was used to derive a measure of sensitivity to an implied beat. As expected, participants were less sensitive to an implied beat in visual sequences than in auditory sequences. However, visual sequences produced a stronger sense of beat when preceded by auditory sequences with identical temporal structure. Moreover, increases in brain activity were observed in the bilateral putamen for visual sequences preceded by auditory sequences when compared to visual sequences without prior auditory exposure. No such order-dependent differences (behavioral or neural) were found for the auditory sequences. The results provide further evidence for the role of the basal ganglia in internal generation of the beat and suggest that an internal auditory rhythm representation may be activated during visual rhythm perception.
24
Abstract
How long did it take you to read this sentence? Chances are your response is a ballpark estimate whose value depends on how fast you scanned the text, how prepared you were for this question, and perhaps your mood or how much attention you paid to these words. Time perception is here addressed in three sections. The first section summarizes theoretical difficulties in time perception research, specifically those pertaining to the representation of time and temporal processing. The second section provides a non-exhaustive review of temporal effects in multisensory perception. Sensory modalities interact in temporal judgement tasks, suggesting that (i) at some level of sensory analysis, the temporal properties across senses can be integrated in building a time percept and (ii) the representational format across senses is compatible with establishing such a percept. In the last section, a two-step analysis of temporal properties is sketched out. In the first step, it is proposed that temporal properties are automatically encoded at early stages of sensory analysis, thus providing the raw material for the building of a time percept; in the second step, time representations become available to perception through attentional gating of the raw temporal representations and via re-encoding into abstract representations.
Affiliation(s)
- Virginie van Wassenhove
- Cognitive Neuroimaging Unit, Commissariat à l'Energie Atomique, NeuroSpin Center, Bât 145, Point Courier 156, Gif-sur-Yvette, France.
25
Noulhiane M, Pouthas V, Samson S. Is time reproduction sensitive to sensory modalities? ACTA ACUST UNITED AC 2009. [DOI: 10.1080/09541440701825981] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
26
Han DW, Shea CH. Auditory model: effects on learning under blocked and random practice schedules. RESEARCH QUARTERLY FOR EXERCISE AND SPORT 2008; 79:476-486. [PMID: 19177949 DOI: 10.1080/02701367.2008.10599514] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
An experiment was conducted to determine the impact of an auditory model on blocked, random, and mixed practice schedules of three five-segment timing sequences with constant relative timing. We were interested in whether the auditory model differentially affected the learning of relative and absolute timing under blocked and random practice. Participants (N = 80) were randomly assigned to one of eight practice conditions, which differed in practice schedule (blocked-blocked, blocked-random, random-blocked, random-random) and auditory model (no model, model). The results indicated that the auditory model enhanced relative timing performance on the delayed retention test regardless of the practice schedule, but it did not influence the learning of absolute timing. Blocked-blocked and blocked-random practice conditions resulted in better relative timing retention than random-blocked and random-random practice schedules. Random-random and blocked-random practice schedules resulted in better absolute timing than blocked-blocked or random-blocked practice, regardless of the presence or absence of an auditory model during acquisition. Thus, considering both relative and absolute timing, the blocked-random practice condition resulted in overall learning superior to the other practice schedules. The results also suggest that an auditory model produces an added benefit for learning relative timing regardless of the practice schedule, but it does not influence the learning of absolute timing.
Affiliation(s)
- Dong-Wook Han
- Department of Physical Education, Seoul National University
27
28
29
30
Nimmo LM, Lewandowsky S. Distinctiveness revisited: unpredictable temporal isolation does not benefit short-term serial recall of heard or seen events. Mem Cognit 2007; 34:1368-75. [PMID: 17225515 DOI: 10.3758/bf03193278] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
The notion of a link between time and memory is intuitively appealing and forms the core assumption of temporal distinctiveness models. Distinctiveness models predict that items that are temporally isolated from their neighbors at presentation should be recalled better than items that are temporally crowded. By contrast, event-based theories consider time to be incidental to the processes that govern memory, and such theories would not imply a temporal isolation advantage unless participants engaged in a consolidation process (e.g., rehearsal or selective encoding) that exploited the temporal structure of the list. In this report, we examine two studies that assessed the effect of temporal distinctiveness on memory, using auditory (Experiment 1) and auditory and visual (Experiment 2) presentation with unpredictably varying interitem intervals. The results show that with unpredictable intervals temporal isolation does not benefit memory, regardless of presentation modality.
Affiliation(s)
- Lisa M Nimmo
- School of Psychology, University of Western Australia, Crawley.
31
Abstract
The suffix effect is the selective impairment in recall of the final items of a spoken list when the list is followed by a nominally irrelevant speech item, or suffix. It is widely assumed to comprise a bottom-up, or structural, effect restricted to the terminal item and a top-down, or conceptually sensitive, effect confined to the preterminal items. Reported here are eight experiments that challenge this view by demonstrating that the terminal suffix effect, as well as the preterminal suffix effect, is susceptible to conceptual influence. The entire suffix effect may be better conceived of as a phenomenon arising from perceptual grouping.
Affiliation(s)
- Lance C Bloom
- Department of Psychology, Rice University, Houston, TX 77251-1892, USA.
32
Boltz MG. Duration judgments of naturalistic events in the auditory and visual modalities. ACTA ACUST UNITED AC 2005; 67:1362-75. [PMID: 16555588 DOI: 10.3758/bf03193641] [Citation(s) in RCA: 30] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Two experiments were performed to examine whether the same underlying mechanisms apply to the duration estimates of both auditory and visual events. In Experiment 1, it was found that the durations of visual scenes are reproduced with the same level of accuracy in prospective and retrospective situations when these display a predictable array of information, a result consistent with past research on auditory durations. Experiment 2 further revealed that when participants are asked to prospectively or retrospectively judge the durations of various naturalistic events in their auditory, visual, or audiovisual modality, no differences in either accuracy or bias are observed. These findings diverge from previous research and are argued to stem from different processing mechanisms that arise from naturalistic events.
Affiliation(s)
- Marilyn G Boltz
- Department of Psychology, Haverford College, Haverford, PA 19041, USA.
33
Abstract
When the senses deliver conflicting information, vision dominates spatial processing, and audition dominates temporal processing. We asked whether this sensory specialization results in cross-modal encoding of unisensory input into the task-appropriate modality. Specifically, we investigated whether visually portrayed temporal structure receives automatic, obligatory encoding in the auditory domain. In three experiments, observers judged whether the changes in two successive visual sequences followed the same or different rhythms. We assessed temporal representations by measuring the extent to which both task-irrelevant auditory information and task-irrelevant visual information interfered with rhythm discrimination. Incongruent auditory information significantly disrupted task performance, particularly when presented during encoding; by contrast, varying the nature of the rhythm-depicting visual changes had minimal impact on performance. Evidently, the perceptual system automatically and obligatorily abstracts temporal structure from its visual form and represents this structure using an auditory code, resulting in the experience of "hearing visual rhythms."
Affiliation(s)
- Sharon E Guttman
- Department of Psychology, Vanderbilt University, 111 21st Avenue South, 301 Wilson Hall, Nashville, TN 37203, USA.
34
Lai Q, Shea CH, Bruechert L, Little M. Auditory Model Enhances Relative-Timing Learning. J Mot Behav 2002; 34:299-307. [PMID: 19260180 DOI: 10.1080/00222890209601948] [Citation(s) in RCA: 24] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
35
Klapproth F. The effect of study-test modalities on the remembrance of subjective duration from long-term memory. Behav Processes 2002; 59:37-46. [PMID: 12090944 DOI: 10.1016/s0376-6357(02)00061-x] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022]
Abstract
We examined whether stimulus modality (auditory vs. visual) affects the retrieval of subjective duration from memory. In two experiments the temporal generalization paradigm was used: participants had to decide whether a previously learned standard duration (400 ms) recurred among a set of comparison stimuli. Two major results were found. (1) Discrimination was more accurate when the training and testing stimuli were of the same modality than when they were of opposite modalities. (2) When the modality of learning and the modality of testing differed, subjects systematically underestimated the test durations, i.e., the temporal generalization gradients (the proportion of identifications of a stimulus as the standard, plotted against stimulus duration) shifted to the right. The observed shift is interpreted as the result of a delayed timing process.
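The temporal generalization gradient described here, and its rightward shift under a modality switch, can be sketched in a few lines. The Gaussian shape and the width parameter below are illustrative assumptions, not fitted values from the study:

```python
import math

def p_called_standard(test_ms, remembered_standard_ms=400.0, sigma_ms=60.0):
    """Gaussian generalization gradient: the probability of identifying a
    test duration as the learned standard peaks at the remembered standard."""
    z = (test_ms - remembered_standard_ms) / sigma_ms
    return math.exp(-0.5 * z * z)

# A delayed timing process at test can be modeled as an upward shift of the
# remembered standard, moving the whole gradient to the right: test durations
# are then systematically underestimated relative to the learned 400 ms.
```

With the remembered standard shifted from 400 ms to, say, 440 ms, the gradient no longer peaks at the true standard, reproducing the rightward shift the abstract reports.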
Affiliation(s)
- Florian Klapproth
- University of Hildesheim, Marienburger Platz 22, D-31141, Hildesheim, Germany
36
Abstract
The purpose of the current study was to examine the relationship between short-term memory for rhythm and the phonological loop in working memory. Results showed that digit span scores significantly correlated with the scores on the rhythmic memory task, and that the correlation between the two scores remained significant even after the common variance with reading speed was partialled out. Partial correlation and regression analyses indicated that the relation between memory for rhythm and digit span scores is mediated by the third component in the phonological loop, the component that is responsible for regulation of timing mechanisms in immediate memory tasks.
37
Abstract
Prior research has established that performance in short-term memory tasks using auditory rhythmic stimuli is frequently superior to that in tasks using visual stimuli. In five experiments, the reasons for this were explored further. In a same-different task, pairs of brief rhythms were presented in which each rhythm was visual or auditory, resulting in two same-modality conditions and two cross-modality conditions. Three different rates of presentation were used. The results supported the temporal advantage of the auditory modality in short-term memory, which was quite robust at the quickest presentation rates. This advantage tended to decay as the presentation rate was slowed down, consistent with the view that, with time, the temporal patterns were being recoded into a more generic form.
Affiliation(s)
- G L Collier
- Department of Psychology, South Carolina State University, Orangeburg 29117, USA.
38
Kumai M, Sugai K. Relation between synchronized and self-paced response in preschoolers' rhythmic movement. Percept Mot Skills 1997; 85:1327-37. [PMID: 9450288 DOI: 10.2466/pms.1997.85.3f.1327] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
Abstract
Regulation of the rhythmic movement of 29 preschoolers ages 3 to 6 years was studied in connection with self-paced response. An Auditory Pulse condition presented the pulse audibly, a Visual Pulse condition presented the pulse visibly, and a Moving Visual Target condition presented the repetitive movement of a visual target. We used a Quick Tempo condition, in which the interstimulus interval was slightly different from each subject's comfortable average self-paced tapping interval, and a Slow Tempo condition, in which it was considerably different. The error in the interresponse interval of tapping, i.e., the time gap between the mean interresponse and interstimulus intervals, was calculated as an indicator of regulation. This error decreased across age groups only in the Slow Tempo condition. In the Slow-Tempo Visual-Pulse condition, in which the error in the interresponse interval was particularly large, the younger subjects tended to respond at a rate near their self-paced response. In both tempos, the error in the interresponse interval in the Moving Visual Target condition was much the same as in the Auditory Pulse condition and was statistically smaller than in the Visual Pulse condition. These results suggest that one important factor in the development of preschoolers' synchronization with physical rhythm is the ability to modify or restrain the self-paced response, and that additional information from the movement of a visual target can externally assist them in regulating movement.
Affiliation(s)
- M Kumai
- Japan Society for the Promotion of Science, Faculty of Education, Tohoku University, Sendai, Japan
39
Abstract
This chapter focuses on recent research concerning verbal learning and memory. A prominent guiding framework for research on this topic over the past three decades has been the modal model of memory, which postulates distinct sensory, primary, and secondary memory stores. Although this model continues to be popular, it has fostered much debate concerning its validity and specifically the need for its three separate memory stores. The chapter reviews research supporting and research contradicting the modal model, as well as alternative modern frameworks. Extensions of the modal model are discussed, including the search of associative memory model, the perturbation model, precategorical acoustic store, and permastore. Alternative approaches are discussed including working memory, conceptual short-term memory, long-term working memory, short-term activation and attention, processing streams, the feature model, distinctiveness, and procedural reinstatement.
Affiliation(s)
- A F Healy
- Department of Psychology, Muenzinger Building, University of Colorado, Campus Box 345, Boulder, CO 80309-0345, USA
40
Boltz MG. Effects of event structure on retrospective duration judgments. PERCEPTION & PSYCHOPHYSICS 1995; 57:1080-96. [PMID: 8532497 DOI: 10.3758/bf03205466] [Citation(s) in RCA: 56] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/31/2023]
Abstract
Two experiments examined whether varying degrees of event coherence influence the remembering of an event's actual duration. Relying on musical compositions (Experiment 1) or filmed narratives (Experiment 2) as experimental stimuli, the underlying hierarchy of information within events (i.e., melodic intervals or story elements) was either attentionally highlighted or obscured by placing a varying number of accents (i.e., prolonged notes or commercial breaks) at locations that either coincided or conflicted with grammatical phrase boundaries. When subjects were unexpectedly asked to judge the actual duration of events, through a reproduction (Experiment 1) or verbal estimation (Experiment 2) task, duration estimates became more accurate and less variable when the pattern of accentuation increasingly outlined the events' nested relationships. Conversely, when the events' organization was increasingly obscured through accentuation, time judgments not only became less accurate and more variable, but were consistently overestimated. These findings support a theoretical framework emphasizing the effects of event structure on attending and remembering activities.
Affiliation(s)
- M G Boltz
- Department of Psychology, Haverford College, Haverford, PA 19041, USA
41
Foertsch J. The impact of electronic networks on scholarly communication: Avenues for research. DISCOURSE PROCESSES 1995. [DOI: 10.1080/01638539509544919] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
42
Chebat JC, Gelinas-Chebat C, Filiatrault P. Interactive effects of musical and visual cues on time perception: an application to waiting lines in banks. Percept Mot Skills 1993; 77:995-1020. [PMID: 8284188 DOI: 10.2466/pms.1993.77.3.995] [Citation(s) in RCA: 87] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/29/2023]
Abstract
This study explores the interactive effects of musical and visual cues on time perception in a specific situation, that of waiting in a bank. Videotapes are employed to simulate the situation; a 2 x 3 factorial design (N = 427) is used: 2 (high vs low) amounts of visual information and 2 (fast vs slow) levels of musical tempo in addition to a no-music condition. Two mediating variables, mood and attention, are tested in the relation between the independent variables (musical and visual) and the dependent variable (perceived waiting time). Results of multivariate analysis of variance and a system of simultaneous equations show that musical and visual cues do not have symmetrical effects: the musical tempo has a global (moderating) effect on the whole structure of the relations between dependent, independent, and mediating variables but no direct influence on time perception, whereas the visual cues affect time perception, with a significance that depends on musical tempo. Also, the "Resource Allocation Model of Time Estimation" predicts the attention-time relation better than Ornstein's "storage-size theory." Mood state serves as a substitute for time information with slow music, but its effects are cancelled with fast music.
Affiliation(s)
- J C Chebat
- University of Quebec at Montreal, Canada
43
Cacace AT, McFarland DJ. Acoustic Pattern Recognition and Short-Term Memory in Normal Adults and Young Children: Reconnaissance de patterns acoustiques et mémoire à court terme chez les adultes normaux et les jeunes enfants. Int J Audiol 1992. [DOI: 10.3109/00206099209072921] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
44
McFarland DJ, Cacace AT. Aspects of short-term acoustic recognition memory: modality and serial position effects. AUDIOLOGY : OFFICIAL ORGAN OF THE INTERNATIONAL SOCIETY OF AUDIOLOGY 1992; 31:342-52. [PMID: 1492818 DOI: 10.3109/00206099209072922] [Citation(s) in RCA: 28] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/27/2022]
Abstract
Two experiments were performed to study short-term acoustic recognition memory using synthesized binary tone patterns within a three-interval, three-alternative forced-choice psychophysical procedure. In Experiment 1, subjects showed a significant performance advantage in processing binary frequency patterns over intensity and duration patterns. In Experiment 2, we found that elements at the beginnings and ends of frequency-pattern sequences of various lengths were recognized better than those in the middle of the sequence (primacy and recency effects). Furthermore, we showed that performance on a serial-position task may be a useful tool for demonstrating the limited capacity of information storage in acoustic short-term memory. Sensory memory has typically been examined using verbal stimuli and immediate ordered recall. These results demonstrate the utility of studying sensory memory using complex nonverbal stimuli within a forced-choice recognition paradigm.
Affiliation(s)
- D J McFarland
- Wadsworth Center for Laboratories and Research, New York State Department of Health, Albany 12201-0509
|
45
|
Abstract
Temporal coding has been studied by examining the perception and reproduction of rhythms and by examining memory for the order of events in a list. We attempt to link these research programs both empirically and theoretically. Glenberg and Swanson (1986) proposed that the superior recall of auditory material, compared with visual material, reflects more accurate temporal coding for the auditory material. In this paper, we demonstrate that a similar modality effect can be produced in a rhythm task. Auditory rhythms composed of stimuli of two durations are reproduced more accurately than are visual rhythms. Furthermore, it appears that the auditory superiority reflects enhanced chunking of the auditory material rather than better identification of durations.
Affiliation(s)
- A M Glenberg
- Department of Psychology, University of Wisconsin, Madison 53706
|
46
|
Cowan N, Saults JS, Winterowd C, Sherk M. Enhancement of 4-year-old children's memory span for phonologically similar and dissimilar word lists. J Exp Child Psychol 1991; 51:30-52. [PMID: 2010726] [DOI: 10.1016/0022-0965(91)90076-5]
Abstract
Previous research suggests that preschool children are deficient in rehearsal and that stimulus list repetitions can improve their recall, presumably by substituting for the products of rehearsal. However, the previous research included interitem or postlist retention intervals of several seconds or more. We examined the utility of list repetitions with reference to an ordinary span task in which spoken words were presented 1 s apart for immediate recall. Lists with phonologically similar versus dissimilar items were included, to determine if the overall pattern of recall could be made more similar to what is ordinarily obtained in older children. Cumulative repetition was found to cause a moderate increase in both memory span and the phonological similarity effect. Other types of list repetition provided more insight into types of stimulus redundancy that were helpful (e.g., repeated serial order information) or not helpful (e.g., forced articulatory coding) to children attempting to recall spoken lists. The underlying mnemonic processes are discussed.
Affiliation(s)
- N Cowan
- Department of Psychology, University of Missouri, Columbia 65211
|
47
|
Glenberg AM. Common processes underlie enhanced recency effects for auditory and changing-state stimuli. Mem Cognit 1990; 18:638-50. [PMID: 2266865] [DOI: 10.3758/bf03197106]
Abstract
For some stimuli, dynamic changes are crucial for identifying just what the stimuli are. For example, spoken words (or any auditory stimuli) require change over time to be recognized. Kallman and Cameron (1989) proposed that this sort of dynamic change underlies the enhanced recency effect found for auditory stimuli relative to visual stimuli. The results of three experiments replicate and extend Kallman and Cameron's finding that dynamic visual stimuli (that is, visual stimuli in which movement is necessary for identification), relative to static visual stimuli, engender enhanced recency effects. In addition, an analysis based on individual differences is used to demonstrate that the processes underlying enhanced recency effects for auditory and dynamic visual stimuli are substantially similar. These results are discussed in the context of perceptual grouping processes.
Affiliation(s)
- A M Glenberg
- Department of Psychology, University of Wisconsin, Madison 53706
|
48
|
Wetzel MC. Learning and rhythmic human EMG in ecological perspective. Physiol Behav 1990; 48:113-20. [PMID: 2236257] [DOI: 10.1016/0031-9384(90)90271-5]
Abstract
Previous evidence of strong interactions between learning and human treadmill locomotion led to a simplified system for studying learned rhythms within a framework of behavioral ecology. Motor control was combined with instrumental conditioning in a rhythmic hand task consisting of repeating trials, blocks, and complete regimens. Regimen contexts differed with respect to the pattern of stimulation before and after an electromyographic (EMG) response. Both an antecedent stimulus (a light flash) and a consequent stimulus (a tone indicating success or failure) were necessary for conditioning. Arguments were given for defining reinforcement as a composite of interdependent and size-scaled processes, some including knowledge of results, rather than as a single event after a response.
Affiliation(s)
- M C Wetzel
- Department of Psychology, University of Arizona, Tucson 85721
|
49
|
McFarland DJ, Cacace AT. Comparisons of memory for nonverbal auditory and visual sequential stimuli. Psychol Res 1995; 57:80-7. [PMID: 7708900] [DOI: 10.1007/bf00447078]
Abstract
Properties of auditory and visual sensory memory were compared by examining subjects' recognition performance for randomly generated binary auditory sequential frequency patterns and binary visual sequential color patterns within a forced-choice paradigm. Experiment 1 demonstrated serial-position effects, consisting of both primacy and recency effects, in the auditory and visual modalities. Experiment 2 found that retention of auditory and visual information was remarkably similar when assessed across a 10-s interval. Experiments 3 and 4, taken together, showed that the recency effect in sensory memory is affected more by the type of response required (recognition vs. reproduction) than by the sensory modality employed. These studies suggest that auditory and visual sensory memory stores for nonverbal stimuli share similar properties with respect to serial-position effects and persistence over time.
Affiliation(s)
- D J McFarland
- Wadsworth Center for Laboratories and Research, New York State Department of Health, Albany 12201-0509
|