1. Kimura A. Cross-modal sensitivities to auditory and visual stimulations in the first-order somatosensory thalamic nucleus. Eur J Neurosci 2024; 60:5621-5657. [PMID: 39192569] [DOI: 10.1111/ejn.16510]
Abstract
The ventral posterolateral nucleus (VPL), categorized as a first-order thalamic nucleus, is considered to be dedicated to uni-modal somatosensory processing. Cross-modal sensory interactions on thalamic reticular nucleus cells projecting to the VPL, on the other hand, suggest that VPL cells are subject to cross-modal sensory influences. To test this possibility, the effects of auditory or visual stimulation on VPL cell activities were examined in anaesthetized rats, using juxta-cellular recording and labelling techniques. Recordings were obtained from 70 VPL cells, including 65 cells responsive to cutaneous electrical stimulation of the hindpaw. Auditory or visual stimulation alone did not elicit cell activity, except in three bi-modal cells and one auditory cell. Cross-modal alterations of the somatosensory response by auditory and/or visual stimulation were recognized in 61 cells with regard to response magnitude, latency (time and jitter) and/or burst spiking properties. Both early (onset) and late responses were either suppressed or facilitated, and de novo cell activity was also induced. Cross-modal alterations depended on the temporal interval between the preceding auditory or visual stimulation and the somatosensory stimulation, and on the intensity and frequency of the sound. Alterations were observed mostly at short intervals (< 200 ms) but occurred at intervals of up to 800 ms. Sounds of higher intensities and lower frequencies were more effective for modulation. Susceptibility to cross-modal influences was related to cell location and/or morphology. These findings, together with previously reported similar findings in the auditory and visual thalamic nuclei, suggest that cross-modal sensory interactions take place pervasively in the first-order sensory thalamic nuclei.
Affiliation(s)
- Akihisa Kimura
- Department of Physiology, Wakayama Medical University, Wakayama, Japan
2. Alwashmi K, Meyer G, Rowe F, Ward R. Enhancing learning outcomes through multisensory integration: A fMRI study of audio-visual training in virtual reality. Neuroimage 2024; 285:120483. [PMID: 38048921] [DOI: 10.1016/j.neuroimage.2023.120483]
Abstract
The integration of information from different sensory modalities is a fundamental process that enhances perception and performance in both real and virtual reality (VR) environments. Understanding these mechanisms, especially during learning tasks that exploit novel multisensory cue combinations, provides opportunities for the development of new rehabilitative interventions. This study aimed to investigate how functional brain changes support behavioural performance improvements during an audio-visual (AV) learning task. Twenty healthy participants underwent 30 min of daily VR training for four weeks. The task was an AV adaptation of a 'scanning training' paradigm that is commonly used in hemianopia rehabilitation. Functional magnetic resonance imaging (fMRI) and performance data were collected at baseline, after two and four weeks of training, and four weeks post-training. We show that behavioural performance, operationalised as mean reaction time (RT) reduction in VR, significantly improves. In separate tests in a controlled laboratory environment, we showed that the behavioural performance gains in the VR training environment transferred to a significant mean RT reduction for the trained AV voluntary task on a computer screen. Enhancements were observed in both the visual-only and AV conditions, with the latter demonstrating a faster response time supported by the presence of audio cues. The behavioural learning effect also transferred to two additional tasks that were tested: a visual search task and an involuntary visual task. Our fMRI results reveal an increase in functional activation (BOLD signal) in multisensory brain regions involved in early-stage AV processing: the thalamus, the caudal inferior parietal lobe and the cerebellum. These functional changes were observed only for the trained, multisensory task and not for unimodal visual stimulation. Functional activation changes in the thalamus were significantly correlated with behavioural performance improvements.
This study demonstrates that incorporating spatial auditory cues to voluntary visual training in VR leads to augmented brain activation changes in multisensory integration, resulting in measurable performance gains across tasks. The findings highlight the potential of VR-based multisensory training as an effective method for enhancing cognitive function and as a potentially valuable tool in rehabilitative programmes.
Affiliation(s)
- Kholoud Alwashmi
- Faculty of Health and Life Sciences, University of Liverpool, United Kingdom; Department of Radiology, Princess Nourah bint Abdulrahman University, Saudi Arabia
- Georg Meyer
- Digital Innovation Facility, University of Liverpool, United Kingdom
- Fiona Rowe
- Institute of Population Health, University of Liverpool, United Kingdom
- Ryan Ward
- Digital Innovation Facility, University of Liverpool, United Kingdom; School of Computer Science and Mathematics, Liverpool John Moores University, United Kingdom
3. Csonka M, Mardmomen N, Webster PJ, Brefczynski-Lewis JA, Frum C, Lewis JW. Meta-Analyses Support a Taxonomic Model for Representations of Different Categories of Audio-Visual Interaction Events in the Human Brain. Cereb Cortex Commun 2021; 2:tgab002. [PMID: 33718874] [PMCID: PMC7941256] [DOI: 10.1093/texcom/tgab002]
Abstract
Our ability to perceive meaningful action events involving objects, people, and other animate agents is characterized in part by an interplay of visual and auditory sensory processing and their cross-modal interactions. However, this multisensory ability can be altered or dysfunctional in some hearing and sighted individuals, and in some clinical populations. The present meta-analysis sought to test current hypotheses regarding neurobiological architectures that may mediate audio-visual multisensory processing. Reported coordinates from 82 neuroimaging studies (137 experiments) that revealed some form of audio-visual interaction in discrete brain regions were compiled, converted to a common coordinate space, and then organized along specific categorical dimensions to generate activation likelihood estimate (ALE) brain maps and various contrasts of those derived maps. The results revealed brain regions (cortical "hubs") preferentially involved in multisensory processing along different stimulus category dimensions, including 1) living versus nonliving audio-visual events, 2) audio-visual events involving vocalizations versus actions by living sources, 3) emotionally valent events, and 4) dynamic-visual versus static-visual audio-visual stimuli. These meta-analysis results are discussed in the context of neurocomputational theories of semantic knowledge representations and perception, and the brain volumes of interest are available for download to facilitate data interpretation for future neuroimaging studies.
Affiliation(s)
- Matt Csonka, Nadia Mardmomen, Paula J Webster, Julie A Brefczynski-Lewis, Chris Frum, James W Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
4. Röder B, Kekunnaya R, Guerreiro MJS. Neural mechanisms of visual sensitive periods in humans. Neurosci Biobehav Rev 2020; 120:86-99. [PMID: 33242562] [DOI: 10.1016/j.neubiorev.2020.10.030]
Abstract
Sensitive periods in brain development are phases of enhanced susceptibility to experience. Here we discuss research from human and non-human neuroscience studies that has demonstrated a) differences in the way infants vs. adults learn; b) how the brain adapts to atypical conditions, in particular congenital vs. late-onset blindness (sensitive periods for atypical brain development); and c) the extent to which neural systems are capable of acquiring a typical brain organization after sight restoration following a congenital vs. a late phase of pattern vision deprivation (sensitive periods for typical brain development). By integrating these three lines of research, we propose neural mechanisms characteristic of sensitive periods vs. adult neuroplasticity and learning.
Affiliation(s)
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
- Ramesh Kekunnaya
- Jasti V Ramanamma Children's Eye Care Center, LV Prasad Eye Institute, Hyderabad, India
5. Kimura A. Cross-modal modulation of cell activity by sound in first-order visual thalamic nucleus. J Comp Neurol 2020; 528:1917-1941. [PMID: 31983057] [DOI: 10.1002/cne.24865]
Abstract
Cross-modal auditory influence on cell activity in the primary visual cortex emerging at short latencies raises the possibility that the first-order visual thalamic nucleus, which is considered dedicated to unimodal visual processing, could contribute to cross-modal sensory processing, as has been indicated in the auditory and somatosensory systems. To test this hypothesis, the effects of sound stimulation on visual cell activity in the dorsal lateral geniculate nucleus were examined in anesthetized rats, using juxta-cellular recording and labeling techniques. Visual responses evoked by light (white LED) were modulated by sound (noise burst) given simultaneously or 50-400 ms after the light, even though sound stimuli alone did not evoke cell activity. Alterations of visual response were observed in 71% of cells (57/80) with regard to response magnitude, latency, and/or burst spiking. Suppression predominated in response magnitude modulation, but de novo responses were also induced by combined stimulation. Sound affected not only onset responses but also late responses. Late responses were modulated by sound given before or after onset responses. Further, visual responses evoked by the second light stimulation of a double flash with a 150-700 ms interval were also modulated by sound given together with the first light stimulation. In morphological analysis of labeled cells, projection cells comparable to X-, Y-, and W-like cells, as well as interneurons, were all susceptible to auditory influence. These findings suggest that the first-order visual thalamic nucleus incorporates auditory influence into parallel and complex thalamic visual processing for cross-modal modulation of visual attention and perception.
Affiliation(s)
- Akihisa Kimura
- Department of Physiology, Wakayama Medical University, Wakayama, Japan
6. Li Z, Huang J, Xu T, Wang Y, Li K, Zeng YW, Lui SSY, Cheung EFC, Jin Z, Dazzan P, Glahn DC, Chan RCK. Neural mechanism and heritability of complex motor sequence and audiovisual integration: A healthy twin study. Hum Brain Mapp 2017; 39:1438-1448. [PMID: 29266498] [DOI: 10.1002/hbm.23935]
Abstract
Complex motor sequencing and sensory integration are two key items in scales assessing neurological soft signs. However, the underlying neural mechanisms and heritability of these two functions are not known. Using a healthy twin design, we adopted two functional brain imaging tasks focusing on fist-edge-palm (FEP) complex motor sequence and audiovisual integration (AVI). Fifty-six monozygotic twins and 56 dizygotic twins were recruited in this study. The pre- and postcentral, temporal and parietal gyri, the supplementary motor area, and the cerebellum were activated during the FEP motor sequence, whereas the precentral, temporal, and fusiform gyri, the thalamus, and the caudate were activated during AVI. Activation in the supplementary motor area during the FEP motor sequence and activation in the precentral gyrus and the thalamic nuclei during AVI exhibited significant heritability estimates, ranging from 0.5 to 0.62. These results suggest that activation in cortical motor areas, the thalamus and the cerebellum associated with complex motor sequencing and audiovisual integration function may be heritable.
Affiliation(s)
- Zhi Li
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, The University of Chinese Academy of Sciences, Beijing, China
- Jia Huang
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, The University of Chinese Academy of Sciences, Beijing, China
- Ting Xu
- CAS Key Laboratory of Behavioral Sciences, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Center for the Developing Brain, Child Mind Institute, New York, New York
- Ya Wang
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, The University of Chinese Academy of Sciences, Beijing, China
- Ke Li
- MRI Center, Hospital 306, Beijing, China
- Simon S Y Lui
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Castle Peak Hospital, Hong Kong Special Administrative Region, Hong Kong, China
- Eric F C Cheung
- Castle Peak Hospital, Hong Kong Special Administrative Region, Hong Kong, China
- Zhen Jin
- MRI Center, Hospital 306, Beijing, China
- Paola Dazzan
- Department of Psychosis Studies, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, United Kingdom; National Institute for Health Research (NIHR) Biomedical Research Centre at South London and Maudsley NHS Foundation Trust and King's College London, London, United Kingdom
- David C Glahn
- Department of Psychiatry, Yale University & Olin Neuropsychiatric Research Center, Institute of Living, United States of America
- Raymond C K Chan
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, The University of Chinese Academy of Sciences, Beijing, China
7. Murray MM, Thelen A, Thut G, Romei V, Martuzzi R, Matusz PJ. The multisensory function of the human primary visual cortex. Neuropsychologia 2015; 83:161-169. [PMID: 26275965] [DOI: 10.1016/j.neuropsychologia.2015.08.011]
Abstract
It has been nearly 10 years since Ghazanfar and Schroeder (2006) proposed that the neocortex is essentially multisensory in nature. However, it is only recently that sufficient hard evidence supporting this proposal has accrued. We review evidence that activity within the human primary visual cortex plays an active role in multisensory processes and directly impacts behavioural outcome. This evidence emerges from a full palette of human brain imaging and brain mapping methods with which multisensory processes are quantitatively assessed by taking advantage of particular strengths of each technique as well as advances in signal analyses. Several general conclusions about multisensory processes in the primary visual cortex of humans are supported relatively solidly. First, haemodynamic methods (fMRI/PET) show that there is both convergence and integration occurring within primary visual cortex. Second, primary visual cortex is involved in multisensory processes during early post-stimulus stages (as revealed by EEG/ERPs/ERFs as well as TMS). Third, multisensory effects in primary visual cortex directly impact behaviour and perception, as revealed by correlational (EEG/ERPs/ERFs) as well as more causal measures (TMS/tACS). While the provocative claim of Ghazanfar and Schroeder (2006) that the whole of neocortex is multisensory in function has yet to be demonstrated, this can now be considered established in the case of the human primary visual cortex.
Affiliation(s)
- Micah M Murray
- The Laboratory for Investigative Neurophysiology (The LINE), Neuropsychology and Neurorehabilitation Service and Department of Radiology, University Hospital Center and University of Lausanne, Lausanne, Switzerland; EEG Brain Mapping Core, Center for Biomedical Imaging (CIBM) of Lausanne and Geneva, Lausanne, Switzerland; Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA
- Antonia Thelen
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, TN, USA
- Gregor Thut
- Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow G12 8QB, United Kingdom
- Vincenzo Romei
- Centre for Brain Science, Department of Psychology, University of Essex, Colchester, United Kingdom
- Roberto Martuzzi
- Laboratory of Cognitive Neuroscience, Brain-Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Switzerland
- Pawel J Matusz
- The Laboratory for Investigative Neurophysiology (The LINE), Neuropsychology and Neurorehabilitation Service and Department of Radiology, University Hospital Center and University of Lausanne, Lausanne, Switzerland; Attention, Brain, and Cognitive Development Group, Department of Experimental Psychology, University of Oxford, United Kingdom
8. Henschke JU, Noesselt T, Scheich H, Budinger E. Possible anatomical pathways for short-latency multisensory integration processes in primary sensory cortices. Brain Struct Funct 2014; 220:955-977. [DOI: 10.1007/s00429-013-0694-4]
9. Jaschke AC. Music intervention as system: Reversing hyper systemising in autism spectrum disorders to the comprehension of music as intervention. Med Hypotheses 2014; 82:40-48. [PMID: 24280561] [DOI: 10.1016/j.mehy.2013.11.001]
10. Mégevand P, Molholm S, Nayak A, Foxe JJ. Recalibration of the multisensory temporal window of integration results from changing task demands. PLoS One 2013; 8:e71608. [PMID: 23951203] [PMCID: PMC3738519] [DOI: 10.1371/journal.pone.0071608]
Abstract
The notion of the temporal window of integration, when applied in a multisensory context, refers to the breadth of the interval across which the brain perceives two stimuli from different sensory modalities as synchronous. It maintains a unitary perception of multisensory events despite physical and biophysical timing differences between the senses. The boundaries of the window can be influenced by attention and past sensory experience. Here we examined whether task demands could also influence the multisensory temporal window of integration. We varied the stimulus onset asynchrony between simple, short-lasting auditory and visual stimuli while participants performed two tasks in separate blocks: a temporal order judgment task that required the discrimination of subtle auditory-visual asynchronies, and a reaction time task to the first incoming stimulus irrespective of its sensory modality. We defined the temporal window of integration as the range of stimulus onset asynchronies where performance was below 75% in the temporal order judgment task, as well as the range of stimulus onset asynchronies where responses showed multisensory facilitation (race model violation) in the reaction time task. In 5 of 11 participants, we observed audio-visual stimulus onset asynchronies where reaction time was significantly accelerated (indicating successful integration in this task) while performance was accurate in the temporal order judgment task (indicating successful segregation in that task). This dissociation suggests that in some participants, the boundaries of the temporal window of integration can adaptively recalibrate in order to optimize performance according to specific task demands.
Affiliation(s)
- Pierre Mégevand, Sophie Molholm, Ashabari Nayak, John J. Foxe
- The Sheryl and Daniel R. Tishman Cognitive Neurophysiology Laboratory, Children’s Evaluation and Rehabilitation Center, Departments of Pediatrics and Neuroscience, Albert Einstein College of Medicine, Bronx, New York, United States of America
11. Hoefer M, Tyll S, Kanowski M, Brosch M, Schoenfeld MA, Heinze HJ, Noesselt T. Tactile stimulation and hemispheric asymmetries modulate auditory perception and neural responses in primary auditory cortex. Neuroimage 2013; 79:371-382. [PMID: 23664954] [DOI: 10.1016/j.neuroimage.2013.04.119]
Abstract
Although multisensory integration has been an important area of recent research, most studies have focused on audiovisual integration. Importantly, however, the combination of audition and touch can guide our behavior just as effectively, which we studied here using psychophysics and functional magnetic resonance imaging (fMRI). We tested whether task-irrelevant tactile stimuli would enhance auditory detection, and whether hemispheric asymmetries would modulate these audiotactile benefits using lateralized sounds. Spatially aligned task-irrelevant tactile stimuli could occur either synchronously or asynchronously with the sounds. Auditory detection was enhanced by non-informative synchronous and asynchronous tactile stimuli, if presented on the left side. Elevated fMRI signals to left-sided synchronous bimodal stimulation were found in primary auditory cortex (A1). Adjacent regions (planum temporale, PT) expressed enhanced BOLD responses for synchronous and asynchronous left-sided bimodal conditions. Additional connectivity analyses seeded in right-hemispheric A1 and PT for both bimodal conditions showed enhanced connectivity with right-hemispheric thalamic, somatosensory and multisensory areas that scaled with subjects' performance. Our results indicate that functional asymmetries interact with audiotactile interplay, which can be observed for left-lateralized stimulation in the right hemisphere. There, audiotactile interplay recruits a functional network of unisensory cortices, and the strength of these functional network connections is directly related to subjects' perceptual sensitivity.
Affiliation(s)
- M Hoefer
- Department of Biological Psychology, Otto-von-Guericke-University Magdeburg, Postfach 4120, 39106 Magdeburg, Germany