1
Remache-Vinueza B, Guijarro-Molina I, Trujillo-León A. Influence of duration and visual feedback on the perception of tactile illusions of motion. Sci Rep 2025; 15:10965. [PMID: 40164678] [PMCID: PMC11958647] [DOI: 10.1038/s41598-025-95527-4]
Abstract
Tactile illusions arise when the perceived sensation does not match the actual tactile stimulation. Phantom motion and the cutaneous rabbit are two illusions that convey motion and direction information using only a pair of actuators, which reduces cost, weight, and energy consumption. This study presents two experiments involving these illusions. In the first, participants grasped a two-actuator haptic interface while we examined how the duration of the illusions influences the distance traveled by the illusory points. For phantom motion, duration directly affects the perceived end-point location: as duration increases, the end-point starts being sensed outside the hand, either on the interface or even in the air. The cutaneous rabbit illusion, by contrast, is not influenced by duration. The second experiment investigated the influence of visual feedback on the perception of both tactile illusions. Visual stimuli were presented paired with their haptic counterparts. Results show a clear dominance of vision over the haptic mode in both phantom motion and cutaneous rabbit illusions. Characteristics such as the motion location, direction, distance traveled, or number of jumps in the cutaneous rabbit were fixed by the visual stimulus, regardless of the content of the haptic cue. This finding opens a world of possibilities for integrating tactile illusions into the visuo-haptic experiences typical of Virtual Reality environments: simple setups with two actuators are enough to elicit clear and varied haptic perceptions when presented together with the appropriate visual stimulus.
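To make the two-actuator principle concrete, here is a minimal sketch of the drive signals involved, assuming simple linear amplitude cross-fading for phantom motion and a fixed burst train for the cutaneous rabbit; all function names and parameter values are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def phantom_motion(duration_s, fs=2000, carrier_hz=200):
    """Two-actuator drive signals for a phantom-motion sweep.

    Cross-fading the vibration amplitude between actuators A and B makes a
    single 'funneled' vibration appear to travel between them; the sweep
    duration is the variable manipulated in the first experiment.
    """
    t = np.arange(int(duration_s * fs)) / fs
    p = t / duration_s                        # illusory position, 0 (A) -> 1 (B)
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    return (1 - p) * carrier, p * carrier     # drive A fades out, drive B fades in

def cutaneous_rabbit(taps_per_site=3, tap_s=0.02, isi_s=0.05, fs=2000, carrier_hz=200):
    """Burst trains for the cutaneous rabbit: brief taps at A, then at B.

    With short inter-stimulus intervals, the later taps at A are mislocalized,
    as if hopping across the skin toward B.
    """
    tap = np.sin(2 * np.pi * carrier_hz * np.arange(int(tap_s * fs)) / fs)
    gap = np.zeros(int(isi_s * fs))
    train = np.concatenate([np.concatenate([tap, gap])] * taps_per_site)
    silence = np.zeros_like(train)
    return np.concatenate([train, silence]), np.concatenate([silence, train])
```

In this toy model, increasing the phantom-motion duration simply stretches the cross-fade, whereas the rabbit's burst timing is fixed, mirroring the paper's finding that only the former illusion is duration-sensitive.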
Affiliation(s)
- Byron Remache-Vinueza
- Universidad de Málaga, Av. Cervantes, 29016, Málaga, Spain
- SISAu Research Group, Facultad de Ingenierías, Ingeniería Industrial, Machala St., Quito, 170103, Ecuador
- Andrés Trujillo-León
- Universidad de Málaga, Av. Cervantes, 29016, Málaga, Spain
- Instituto Universitario de Investigación en Ingeniería Mecatrónica y Sistemas Ciberfísicos (IMECH.UMA), Parque Tecnológico de Andalucía, 29590, Campanillas, Spain
2
Ruttorf M, Tal Z, Amaral L, Fang F, Bi Y, Almeida J. Neuroplastic changes in functional wiring in sensory cortices of the congenitally deaf: A network analysis. Hum Brain Mapp 2023; 44:6523-6536. [PMID: 37956260] [PMCID: PMC10681644] [DOI: 10.1002/hbm.26530]
Abstract
Congenital sensory deprivation induces significant changes in the structural and functional organisation of the brain. These are well characterised by cross-modal plasticity, in which deprived cortical areas are recruited to process information from non-affected sensory modalities, as well as by other neuroplastic alterations within regions dedicated to the remaining senses. Here, we analysed visual and auditory networks of congenitally deaf and hearing individuals during different visual tasks to assess changes in network community structure and connectivity patterns due to congenital deafness. In the hearing group, the nodes are clearly divided into three communities (visual, auditory and subcortical), whereas in the deaf group a fourth community, consisting mainly of bilateral superior temporal sulcus and temporo-insular regions, is present. Perhaps more importantly, the right lateral geniculate body, as well as the bilateral thalamus and pulvinar, joined the auditory community of the deaf. Moreover, there is stronger connectivity between bilateral thalamic and pulvinar regions and auditory areas in the deaf group than in the hearing group. No differences were found in the number of connections of these nodes to visual areas. Our findings reveal substantial neuroplastic changes occurring within the auditory and visual networks caused by deafness, emphasising the dynamic nature of the sensory systems in response to congenital deafness. Specifically, these results indicate that in the deaf but not the hearing group, subcortical thalamic nuclei are highly connected to auditory areas during the processing of visual information, suggesting that these relay areas may be responsible for rerouting visual information to the auditory cortex under congenital deafness.
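The community analysis described here can be sketched with a toy connectivity matrix and greedy modularity maximization; the networkx pipeline below is an illustrative assumption, not the study's actual fMRI workflow:

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# toy functional-connectivity matrix: correlations between 20 ROI time series
rng = np.random.default_rng(0)
ts = rng.standard_normal((20, 200))          # 20 ROIs x 200 time points
corr = np.corrcoef(ts)
np.fill_diagonal(corr, 0)
corr[corr < 0.1] = 0                         # keep only positive, stronger edges

# build a weighted graph and extract its community structure
g = nx.from_numpy_array(corr)
communities = greedy_modularity_communities(g, weight="weight")
for i, members in enumerate(communities):
    print(f"community {i}: ROIs {sorted(members)}")
```

Comparing the community partitions of two groups (hearing vs. deaf), as the study does, then amounts to running the same decomposition on each group's network and contrasting the memberships.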
Affiliation(s)
- Michaela Ruttorf
- Computer Assisted Clinical Medicine, Heidelberg University, Mannheim, Germany
- Mannheim Institute for Intelligent Systems in Medicine, Heidelberg University, Mannheim, Germany
- Zohar Tal
- Proaction Laboratory, University of Coimbra, Portugal
- Faculty of Psychology and Educational Sciences, University of Coimbra, Portugal
- Lénia Amaral
- Department of Neuroscience, Georgetown University Medical Center, Washington, District of Columbia, USA
- Fang Fang
- School of Psychological and Cognitive Sciences and Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
- IDG/McGovern Institute for Brain Research, Peking University, Beijing, China
- Peking-Tsinghua Center for Life Sciences, Peking University, Beijing, China
- Yanchao Bi
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Chinese Institute for Brain Research, Beijing, China
- Jorge Almeida
- Proaction Laboratory, University of Coimbra, Portugal
- Faculty of Psychology and Educational Sciences, University of Coimbra, Portugal
3
Scheliga S, Kellermann T, Lampert A, Rolke R, Spehr M, Habel U. Neural correlates of multisensory integration in the human brain: an ALE meta-analysis. Rev Neurosci 2023; 34:223-245. [PMID: 36084305] [DOI: 10.1515/revneuro-2022-0065]
Abstract
Previous fMRI research identified the superior temporal sulcus as a central integration area for audiovisual stimuli. However, less is known about a general multisensory integration network across the senses. We therefore conducted an activation likelihood estimation (ALE) meta-analysis across multiple sensory modalities to identify a common brain network. We included 49 studies covering all Aristotelian senses, i.e., auditory, visual, tactile, gustatory, and olfactory stimuli. The analysis revealed significant activation in bilateral superior temporal gyrus, middle temporal gyrus, thalamus, right insula, and left inferior frontal gyrus. We assume these regions to be part of a general multisensory integration network comprising different functional roles: the thalamus operates as a first subcortical relay projecting sensory information to higher cortical integration centers in the superior temporal gyrus/sulcus, while conflict-processing brain regions such as the insula and inferior frontal gyrus facilitate the integration of incongruent information. We additionally performed meta-analytic connectivity modelling and found that each brain region showed co-activations within the identified multisensory integration network. Therefore, by including multiple sensory modalities in our meta-analysis, the results may provide evidence for a common brain network that supports different functional roles for multisensory integration.
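The core of an ALE analysis can be sketched in a few lines: each reported activation focus is blurred with a 3D Gaussian, and probabilities are combined with a union rule. The toy grid and unnormalized kernels below are simplifying assumptions; real ALE uses sample-size-dependent kernel widths and permutation-based thresholding:

```python
import numpy as np

def ale_map(foci_per_experiment, shape=(20, 20, 20), fwhm_vox=3.0):
    """Toy activation likelihood estimation over a small voxel grid."""
    sigma = fwhm_vox / 2.355                       # FWHM -> Gaussian sigma
    grid = np.indices(shape)                       # (3, x, y, z) coordinates

    def gaussian(focus):
        d2 = sum((grid[i] - focus[i]) ** 2 for i in range(3))
        return np.exp(-d2 / (2 * sigma ** 2))      # peak 1 at the focus

    ale = np.zeros(shape)
    for foci in foci_per_experiment:
        ma = np.zeros(shape)                       # per-experiment modeled map
        for focus in foci:
            ma = 1 - (1 - ma) * (1 - gaussian(focus))   # union of foci
        ale = 1 - (1 - ale) * (1 - ma)             # union across experiments
    return ale

# two toy experiments reporting nearby foci reinforce each other
print(ale_map([[(10, 10, 10)], [(11, 10, 10), (5, 5, 5)]]).max())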
Affiliation(s)
- Sebastian Scheliga
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- Thilo Kellermann
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- JARA-Institute Brain Structure Function Relationship, Pauwelsstraße 30, 52074 Aachen, Germany
- Angelika Lampert
- Institute of Physiology, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- Roman Rolke
- Department of Palliative Medicine, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- Marc Spehr
- Department of Chemosensation, RWTH Aachen University, Institute for Biology, Worringerweg 3, 52074 Aachen, Germany
- Ute Habel
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty RWTH Aachen University, Pauwelsstraße 30, 52074 Aachen, Germany
- JARA-Institute Brain Structure Function Relationship, Pauwelsstraße 30, 52074 Aachen, Germany
4
Direct Structural Connections between Auditory and Visual Motion-Selective Regions in Humans. J Neurosci 2021; 41:2393-2405. [PMID: 33514674] [DOI: 10.1523/jneurosci.1552-20.2021]
Abstract
In humans, the occipital middle-temporal region (hMT+/V5) specializes in the processing of visual motion, while the planum temporale (hPT) specializes in auditory motion processing. It has been hypothesized that these regions might communicate directly to achieve fast and optimal exchange of multisensory motion information. Here we investigated, for the first time in humans (male and female), the presence of direct white matter connections between visual and auditory motion-selective regions using a combined fMRI and diffusion MRI approach. We found evidence supporting the potential existence of direct white matter connections between individually and functionally defined hMT+/V5 and hPT. We show that projections between hMT+/V5 and hPT do not overlap with large white matter bundles such as the inferior longitudinal fasciculus and the inferior fronto-occipital fasciculus. Moreover, we did not find evidence suggesting the presence of projections between the fusiform face area and hPT, supporting the functional specificity of hMT+/V5-hPT connections. Finally, the potential presence of hMT+/V5-hPT connections was corroborated in a large sample of participants (n = 114) from the Human Connectome Project. Together, this study provides a first indication of potential direct occipitotemporal projections between hMT+/V5 and hPT, which may support the exchange of motion information between functionally specialized auditory and visual regions.

SIGNIFICANCE STATEMENT: Perceiving and integrating moving signals across the senses is arguably one of the most important perceptual skills for the survival of living organisms. To create a unified representation of movement, the brain must integrate motion information from separate senses. Our study provides support for the potential existence of direct connections between motion-selective regions in the occipital/visual (hMT+/V5) and temporal/auditory (hPT) cortices in humans. This connection could represent the structural scaffolding for the rapid and optimal exchange and integration of multisensory motion information. These findings suggest the existence of computationally specific pathways that allow information flow between areas that share a similar computational goal.
5
Csonka M, Mardmomen N, Webster PJ, Brefczynski-Lewis JA, Frum C, Lewis JW. Meta-Analyses Support a Taxonomic Model for Representations of Different Categories of Audio-Visual Interaction Events in the Human Brain. Cereb Cortex Commun 2021; 2:tgab002. [PMID: 33718874] [PMCID: PMC7941256] [DOI: 10.1093/texcom/tgab002]
Abstract
Our ability to perceive meaningful action events involving objects, people, and other animate agents is characterized in part by an interplay of visual and auditory sensory processing and their cross-modal interactions. However, this multisensory ability can be altered or dysfunctional in some hearing and sighted individuals, and in some clinical populations. The present meta-analysis sought to test current hypotheses regarding neurobiological architectures that may mediate audio-visual multisensory processing. Reported coordinates from 82 neuroimaging studies (137 experiments) that revealed some form of audio-visual interaction in discrete brain regions were compiled, converted to a common coordinate space, and then organized along specific categorical dimensions to generate activation likelihood estimate (ALE) brain maps and various contrasts of those derived maps. The results revealed brain regions (cortical "hubs") preferentially involved in multisensory processing along different stimulus category dimensions, including 1) living versus nonliving audio-visual events, 2) audio-visual events involving vocalizations versus actions by living sources, 3) emotionally valent events, and 4) dynamic-visual versus static-visual audio-visual stimuli. These meta-analysis results are discussed in the context of neurocomputational theories of semantic knowledge representations and perception, and the brain volumes of interest are available for download to facilitate data interpretation for future neuroimaging studies.
Affiliation(s)
- Matt Csonka
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Nadia Mardmomen
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Paula J Webster
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Julie A Brefczynski-Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Chris Frum
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- James W Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
6
Shared Representation of Visual and Auditory Motion Directions in the Human Middle-Temporal Cortex. Curr Biol 2020; 30:2289-2299.e8. [DOI: 10.1016/j.cub.2020.04.039]
7
Abstract
Reaching movements are usually initiated by visual events and controlled visually and kinesthetically. Recent studies have focused on the possible benefit of auditory information for localization tasks and for movement control. This exploratory study investigated whether reaching space can be coded purely by auditory information. To this end, we analyzed the precision of reaching movements to target positions coded only acoustically. We studied the efficacy of acoustically effect-based and of additional acoustically performance-based instruction and feedback, and the role of visual movement control. Twenty-four participants executed reaching movements to acoustically presented, invisible target positions in three mutually perpendicular planes in front of them. Effector-endpoint trajectories were tracked using inertial sensors. Kinematic data regarding the three spatial dimensions and the movement velocity were sonified, so that acoustic instruction and real-time feedback of the movement trajectories and the target position of the hand were provided. The participants were able to align their reaching movements to the merely acoustically instructed targets. Reaching space can thus be coded merely acoustically, and additional visual movement control does not enhance reaching performance. On the basis of these results, a remarkable benefit of kinematic movement acoustics for the neuromotor rehabilitation of everyday motor skills can be assumed.
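A minimal sketch of the kind of kinematic sonification described, mapping one sample of hand position and velocity to pitch, loudness, and panning; all mappings, ranges, and names are illustrative assumptions rather than the study's actual parameters:

```python
import numpy as np

def sonify_sample(pos, vel, target, f_range=(220.0, 880.0), reach=1.0):
    """Map one kinematic sample to simple sound parameters.

    Distance to the target controls pitch (closer -> higher), hand speed
    controls loudness, and left/right position controls stereo panning.
    """
    pos, vel, target = map(np.asarray, (pos, vel, target))
    dist = np.linalg.norm(target - pos)
    closeness = np.clip(1 - dist / reach, 0, 1)       # 0 far, 1 at target
    f_lo, f_hi = f_range
    freq_hz = f_lo * (f_hi / f_lo) ** closeness       # log-spaced pitch
    gain = np.clip(np.linalg.norm(vel) / 2.0, 0, 1)   # speed -> loudness
    pan = np.clip(0.5 + pos[0] / (2 * reach), 0, 1)   # x -> left/right
    return freq_hz, gain, pan

print(sonify_sample(pos=(0.2, 0.1, 0.3), vel=(0.4, 0, 0), target=(0.5, 0.2, 0.4)))
```

Streaming such parameters to a synthesizer at the sensor rate yields the continuous acoustic instruction and feedback the study describes.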
8
Park M, Blake R, Kim Y, Kim CY. Congruent audio-visual stimulation during adaptation modulates the subsequently experienced visual motion aftereffect. Sci Rep 2019; 9:19391. [PMID: 31852921] [PMCID: PMC6920416] [DOI: 10.1038/s41598-019-54894-5]
Abstract
Sensory information registered in one modality can influence perception associated with sensory information registered in another modality. The current work focuses on one particularly salient form of such multisensory interaction: audio-visual motion perception. Previous studies have shown that watching visual motion and listening to auditory motion influence each other, but results from those studies are mixed with regard to the nature of the interactions promoting that influence and where within the sequence of information processing those interactions transpire. To address these issues, we investigated (i) whether concurrent audio-visual motion stimulation during an adaptation phase impacts the strength of the visual motion aftereffect (MAE) during a subsequent test phase, and (ii) whether the magnitude of that impact depends on the congruence between auditory and visual motion experienced during adaptation. Results show that a congruent direction of audio-visual motion during adaptation induced a stronger initial impression and a slower decay of the MAE than did the incongruent direction, which is not attributable to differential patterns of eye movements during adaptation. The audio-visual congruency effects measured here imply that visual motion perception emerges from the integration of audio-visual motion information at a sensory neural stage of processing.
Affiliation(s)
- Minsun Park
- Department of Psychology, Korea University, Seoul, 02841, Korea
- Randolph Blake
- Department of Psychology and Vanderbilt Vision Research Center, Vanderbilt University, Nashville, TN, 37240, USA
- Yeseul Kim
- Department of Psychology, Korea University, Seoul, 02841, Korea
- Chai-Youn Kim
- Department of Psychology, Korea University, Seoul, 02841, Korea
9
The functional database of the ARCHI project: Potential and perspectives. Neuroimage 2019; 197:527-543. [PMID: 31063817] [DOI: 10.1016/j.neuroimage.2019.04.056]
Abstract
More than two decades of functional magnetic resonance imaging (fMRI) of the human brain have succeeded in identifying, with a growing level of precision, the neural basis of multiple cognitive skills within various domains (perception, sensorimotor processes, language, emotion and social cognition …). Progress has been made in understanding the functional organization of localized brain areas. However, the long acquisition time required for fMRI limits the number of experimental conditions performed in a single individual. As a consequence, distinct brain localizations have mostly been studied in separate groups of participants, and their functional relationships at the individual level remain poorly understood. To address this issue, we report preliminary results on a database of fMRI data acquired in 78 individuals who each performed a total of 29 experimental conditions, grouped into 4 cross-domain functional localizers. This protocol was designed to efficiently isolate, in a single session, the brain activity associated with language, numerical representation, social perception and reasoning, and premotor and visuomotor representations. Analyses are reported at the group and individual levels to establish the ability of our protocol to selectively capture distinct regions of interest in a very short time. Test-retest reliability was assessed in a subset of participants. The activity evoked by the different contrasts of the protocol is located in distinct brain networks that, individually, largely replicate previous findings and, taken together, cover a large proportion of the cortical surface. We provide detailed analyses of a subset of relevant regions: the left frontal, left temporal and middle frontal cortices. These preliminary analyses highlight how combining such a large set of functional contrasts may contribute to a finer-grained brain atlas of cognitive functions, especially in regions of high functional overlap. Detailed structural images (structural connectivity, micro-structures, axonal diameter) acquired in the same individuals in the context of the ARCHI database provide a promising opportunity to explore functional/structural interdependence. Additionally, this protocol might also be used to establish individual neurofunctional signatures in large cohorts.
10
Schaffert N, Janzen TB, Mattes K, Thaut MH. A Review on the Relationship Between Sound and Movement in Sports and Rehabilitation. Front Psychol 2019; 10:244. [PMID: 30809175] [PMCID: PMC6379478] [DOI: 10.3389/fpsyg.2019.00244]
Abstract
The role of auditory information in perceptual-motor processes has gained increased interest in sports and psychology research in recent years. Numerous neurobiological and behavioral studies have demonstrated the close interaction between auditory and motor areas of the brain, and the importance of auditory information for movement execution, control, and learning. In applied research, artificially produced acoustic information and real-time auditory information have been implemented in sports and rehabilitation to improve motor performance in athletes, healthy individuals, and patients affected by neurological or movement disorders. However, this research is scattered across both time and scientific disciplines. The aim of this paper is to provide an overview of the interaction between movement and sound and to review the current literature regarding the effects of natural movement sounds, movement sonification, and rhythmic auditory information in sports and motor rehabilitation. The focus here is threefold: firstly, we provide an overview of empirical studies using natural movement sounds and movement sonification in sports. Secondly, we review recent clinical and applied studies using rhythmic auditory information and sonification in rehabilitation, addressing in particular studies on Parkinson's disease and stroke. Thirdly, we summarize current evidence regarding the cognitive mechanisms and neural correlates underlying the processing of auditory information during movement execution and its mental representation. The evidence reviewed here supports the feasibility and effectiveness of applying auditory information to improve movement execution, control, and (re)learning in sports and motor rehabilitation. The findings also corroborate the critical role of auditory information in auditory-motor coupling during motor (re)learning and performance, suggesting that this area of clinical and applied research has a large potential that is yet to be fully explored.
Affiliation(s)
- Nina Schaffert
- Department of Movement and Training Science, Institute for Human Movement Science, University of Hamburg, Hamburg, Germany
- Thenille Braun Janzen
- Music and Health Science Research Collaboratory, Faculty of Music, University of Toronto, Toronto, ON, Canada
- Klaus Mattes
- Department of Movement and Training Science, Institute for Human Movement Science, University of Hamburg, Hamburg, Germany
- Michael H. Thaut
- Music and Health Science Research Collaboratory, Faculty of Music, University of Toronto, Toronto, ON, Canada
11
Kafaligonul H, Albright TD, Stoner GR. Auditory modulation of spiking activity and local field potentials in area MT does not appear to underlie an audiovisual temporal illusion. J Neurophysiol 2018; 120:1340-1355. [PMID: 29924710] [DOI: 10.1152/jn.00835.2017]
Abstract
The timing of brief stationary sounds has been shown to alter the perceived speed of visual apparent motion (AM), presumably by altering the perceived timing of the individual frames of the AM stimuli and/or the duration of the interstimulus intervals (ISIs) between those frames. To investigate the neural correlates of this "temporal ventriloquism" illusion, we recorded spiking and local field potential (LFP) activity from the middle temporal area (area MT) in awake, fixating macaques. We found that the spiking activity of most MT neurons (but not the LFP) was tuned for the ISI/speed (these parameters covaried) of our AM stimuli, but that auditory timing had no effect on that tuning. We next asked whether the predicted changes in perceived timing were reflected in the timing of neuronal responses to the individual frames of the AM stimuli. Although spiking dynamics were significantly, if weakly, affected by auditory timing in a minority of neurons, the timing of spike responses did not systematically mirror the predicted perception of stimuli. Conversely, the duration of LFP responses in β- and γ-frequency bands was qualitatively consistent with human perceptual reports. We discovered, however, that LFP responses to auditory stimuli presented alone were robust and that responses to audiovisual stimuli were predicted by the linear sum of responses to auditory and visual stimuli presented individually. In conclusion, we find evidence of auditory input into area MT but not of the nonlinear audiovisual interactions we had hypothesized to underlie the illusion.

NEW & NOTEWORTHY: We utilized a set of audiovisual stimuli that elicit an illusion demonstrating "temporal ventriloquism" in visual motion and that have spatiotemporal intervals for which neurons within the middle temporal area are selective. We found evidence of auditory input into the middle temporal area but not of the nonlinear audiovisual interactions underlying this illusion. Our findings suggest that either the illusion was absent in our nonhuman primate subjects or the neuronal correlates of this illusion lie within other areas.
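The additivity test at the heart of this result has a simple form: compare the measured audiovisual response against the linear sum of the unisensory responses. A toy sketch, with synthetic evoked traces standing in for the LFP data:

```python
import numpy as np

def additivity_index(resp_av, resp_a, resp_v):
    """Residual variance left after the additive prediction AV ~= A + V.

    Values near 0 mean the audiovisual response is explained by the linear
    sum, i.e., no superadditive (nonlinear) interaction.
    """
    resid = resp_av - (resp_a + resp_v)
    return resid.var() / resp_av.var()

rng = np.random.default_rng(1)
t = np.linspace(0, 0.5, 500)
resp_a = np.exp(-((t - 0.10) ** 2) / 0.001)        # toy auditory evoked response
resp_v = np.exp(-((t - 0.15) ** 2) / 0.002)        # toy visual evoked response
resp_av = resp_a + resp_v + 0.05 * rng.standard_normal(t.size)
print(additivity_index(resp_av, resp_a, resp_v))   # small -> additive, as in the paper
```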
Affiliation(s)
- Hulusi Kafaligonul
- National Magnetic Resonance Research Center, Bilkent University, Ankara, Turkey
- Interdisciplinary Neuroscience Program, Bilkent University, Ankara, Turkey
- Thomas D Albright
- Vision Center Laboratory, The Salk Institute for Biological Studies, La Jolla, California
- Gene R Stoner
- Vision Center Laboratory, The Salk Institute for Biological Studies, La Jolla, California
12
Schmitz G, Bergmann J, Effenberg AO, Krewer C, Hwang TH, Müller F. Movement Sonification in Stroke Rehabilitation. Front Neurol 2018; 9:389. [PMID: 29910768] [PMCID: PMC5992267] [DOI: 10.3389/fneur.2018.00389]
Abstract
Stroke often affects arm function and thus impairs patients' daily activities. Recently, several studies have shown that additional movement acoustics can enhance motor perception and motor control. Therefore, a new method has been developed that provides auditory feedback about arm movement trajectories in real time for motor rehabilitation after stroke. The present article describes the study protocol for a randomized, controlled, examiner- and patient-blinded superiority trial (German Clinical Trials Register, www.drks.de, DRKS00011419), in which the method will be applied to 13 subacute stroke patients with hemiparesis during 12 sessions of 30 min each, as additional feedback during regular movement therapy. As the primary outcome, a significant pre-post change in the Box and Block Test is expected that exceeds the performance increase of 13 patients who will be provided with sham acoustics. Possible limitations of the method as well as the study design are discussed.
Affiliation(s)
- Gerd Schmitz
- Institute of Sports Science, Leibniz University Hannover, Hannover, Germany
- Jeannine Bergmann
- Schön Klinik Bad Aibling, Bad Aibling, Germany
- German Center for Vertigo and Balance Disorders, Ludwig-Maximilians University of Munich, Munich, Germany
- Alfred O Effenberg
- Institute of Sports Science, Leibniz University Hannover, Hannover, Germany
- Carmen Krewer
- Schön Klinik Bad Aibling, Bad Aibling, Germany
- Department of Sport and Health Sciences, Technical University Munich, Human Movement Science, Munich, Germany
- Tong-Hun Hwang
- Institute of Sports Science, Leibniz University Hannover, Hannover, Germany
- Institute of Microelectronic Systems, Leibniz University Hannover, Hannover, Germany
- Friedemann Müller
- Schön Klinik Bad Aibling, Bad Aibling, Germany
- German Center for Vertigo and Balance Disorders, Ludwig-Maximilians University of Munich, Munich, Germany
13
Effenberg AO, Schmitz G. Acceleration and deceleration at constant speed: systematic modulation of motion perception by kinematic sonification. Ann N Y Acad Sci 2018; 1425:52-69. [DOI: 10.1111/nyas.13693]
Affiliation(s)
- Gerd Schmitz
- Institute of Sports Science, Leibniz Universität Hannover, Hannover, Germany
14
Ghai S, Schmitz G, Hwang TH, Effenberg AO. Auditory Proprioceptive Integration: Effects of Real-Time Kinematic Auditory Feedback on Knee Proprioception. Front Neurosci 2018; 12:142. [PMID: 29568259] [PMCID: PMC5852112] [DOI: 10.3389/fnins.2018.00142]
Abstract
The purpose of the study was to assess the influence of real-time auditory feedback on knee proprioception. Thirty healthy participants were randomly allocated to a control group (n = 15) and experimental group I (n = 15). The participants performed an active knee-repositioning task using their dominant leg, with or without additional real-time auditory feedback in which the frequency was mapped in a convergent manner to two different target angles (40 and 75°). Statistical analysis revealed significant enhancement in knee-repositioning accuracy for the constant and absolute error with real-time auditory feedback, within and across the groups. Besides this convergent condition, we established a second, divergent condition in which a step-wise transposition of frequency was performed to explore whether a systematic tuning between auditory and proprioceptive repositioning exists. No significant effects were identified in this divergent auditory feedback condition. An additional experimental group II (n = 20) was then included to investigate the influence of a larger magnitude and directional change of the step-wise transposition of the frequency. In a first step, the results confirm the findings of experiment I. Moreover, significant effects on knee auditory-proprioceptive repositioning were evident when divergent auditory feedback was applied: during the step-wise transposition, participants systematically modulated knee movements in the direction opposite to the transposition. We confirm that knee-repositioning accuracy can be enhanced by the concurrent application of real-time auditory feedback and that knee repositioning can be modulated in a goal-directed manner by step-wise transposition of frequency. Clinical implications are discussed with respect to joint position sense in rehabilitation settings.
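A sketch of what a convergent frequency mapping and its step-wise transposition might look like; the linear pitch-distance rule and all values here are hypothetical, chosen only to illustrate the design, not taken from the study:

```python
def knee_feedback_hz(angle_deg, target_deg, f_target=440.0, slope_hz_per_deg=4.0):
    """Convergent mapping: pitch approaches f_target as the knee approaches
    the target angle, from either direction (values are illustrative)."""
    return f_target + slope_hz_per_deg * abs(angle_deg - target_deg)

def transposed_feedback_hz(angle_deg, target_deg, step_hz):
    """Divergent condition: a step-wise transposition shifts the whole mapping,
    so matching the remembered target pitch requires a shifted knee angle."""
    return knee_feedback_hz(angle_deg, target_deg) + step_hz

# pitch converges on 440 Hz as the knee approaches the 40-degree target
for angle in (30.0, 40.0, 50.0, 75.0):
    print(angle, knee_feedback_hz(angle, target_deg=40.0))
```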
Affiliation(s)
- Shashank Ghai
- Institute of Sports Science, Leibniz University Hannover, Hannover, Germany
15
Ghai S, Ghai I, Effenberg AO. "Low road" to rehabilitation: a perspective on subliminal sensory neuroprosthetics. Neuropsychiatr Dis Treat 2018; 14:301-307. [PMID: 29398914] [PMCID: PMC5775748] [DOI: 10.2147/ndt.s153392]
Abstract
Fear can propagate in parallel through both cortical and subcortical pathways. It can instigate memory consolidation habitually and might allow internal simulation of movements independently of cortical structures. This perspective suggests delivering subliminal, aversive, kinematic audiovisual stimuli via neuroprosthetics in patients with neocortical dysfunctions. We suggest possible scenarios by which these stimuli might bypass damaged neocortical structures and possibly assist in motor relearning. Anticipated neurophysiological mechanisms and methodological scenarios are discussed in this perspective. This approach introduces novel perspectives into neuropsychology as to how subcortical pathways might be used to induce motor relearning.
Affiliation(s)
- Shashank Ghai
- Institute of Sports Science, Leibniz University Hannover, Hannover, Germany
- Ishan Ghai
- School of Life Sciences, Jacobs University, Bremen, Germany
16
Tonelli A, Cuturi LF, Gori M. The Influence of Auditory Information on Visual Size Adaptation. Front Neurosci 2017; 11:594. [PMID: 29114201] [PMCID: PMC5660698] [DOI: 10.3389/fnins.2017.00594]
Abstract
Size perception can be influenced by several visual cues, such as spatial (e.g., depth or vergence) and temporal contextual cues (e.g., adaptation to steady visual stimulation). Nevertheless, perception is generally multisensory, and other sensory modalities, such as audition, can contribute to the functional estimation of object size. In this study, we investigated whether auditory stimuli of different pitches can influence visual size perception after visual adaptation. To this aim, we used an adaptation paradigm (Pooresmaeili et al., 2013) in three experimental conditions: visual-only, visual-sound at 100 Hz, and visual-sound at 9,000 Hz. We asked participants to judge the size of a test stimulus in a size discrimination task. First, we obtained a baseline for all conditions; in the visual-sound conditions, the auditory stimulus was concurrent with the test stimulus. Second, we repeated the task by presenting an adapter (twice as big as the reference stimulus) before the test stimulus. We replicated the size aftereffect in the visual-only condition: the test stimulus was perceived as smaller than its physical size. The new finding is that auditory stimuli affected the perceived size of the test stimulus after visual adaptation: the low-frequency sound decreased the effect of visual adaptation, making the stimulus appear bigger than in the visual-only condition, whereas the high-frequency sound had the opposite effect, making the test stimulus appear even smaller.
Affiliation(s)
- Alessia Tonelli
- Unit for Visually Impaired People, Science and Technology for Children and Adults, Istituto Italiano di Tecnologia, Genoa, Italy
- Robotics, Brain and Cognitive Sciences Department, Istituto Italiano di Tecnologia, Genoa, Italy
- Luigi F Cuturi
- Unit for Visually Impaired People, Science and Technology for Children and Adults, Istituto Italiano di Tecnologia, Genoa, Italy
- Monica Gori
- Unit for Visually Impaired People, Science and Technology for Children and Adults, Istituto Italiano di Tecnologia, Genoa, Italy
17
Chauvigné LAS, Belyk M, Brown S. Following during physically-coupled joint action engages motion area MT+/V5. J Integr Neurosci 2017; 16:307-318. [PMID: 28891519] [DOI: 10.3233/jin-170023]
Abstract
Interpersonal coordination during joint action depends on the perception of the partner's movements. In many such situations - for example, while moving furniture together or dancing a tango - there are kinesthetic interactions between the partners due to the forces shared between them that allow them to directly perceive one another's movements. Joint action of this type often involves a contrast between the roles of leader and follower, where the leader imparts forces onto the follower, and the follower has to be responsive to these force-cues during movement. We carried out a novel 2-person functional MRI study with trained couple dancers engaged in bimanual contact with an experimenter standing next to the bore of the magnet, where the two alternated between being the leader and follower of joint improvised movements, all with the eyes closed. One brain area that was unexpectedly more active during following than leading was the region of MT+/V5. While classically described as an area for processing visual motion, it has more recently been shown to be responsive to tactile motion as well. We suggest that MT+/V5 responds to motion based on force-cues during joint haptic interaction, most especially when a follower responds to force-cues coming from a leader's movements.
Affiliation(s)
- Léa A S Chauvigné
- NeuroArts Lab, Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main St. West, Hamilton, ON, L8S 4K1, Canada
- Michel Belyk
- NeuroArts Lab, Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main St. West, Hamilton, ON, L8S 4K1, Canada
- Steven Brown
- NeuroArts Lab, Department of Psychology, Neuroscience & Behaviour, McMaster University, 1280 Main St. West, Hamilton, ON, L8S 4K1, Canada
18
Andric M, Davis B, Hasson U. Visual cortex signals a mismatch between regularity of auditory and visual streams. Neuroimage 2017; 157:648-659. [DOI: 10.1016/j.neuroimage.2017.05.028]
19
Hidaka S, Higuchi S, Teramoto W, Sugita Y. Neural mechanisms underlying sound-induced visual motion perception: An fMRI study. Acta Psychol (Amst) 2017; 178:66-72. [PMID: 28600968] [DOI: 10.1016/j.actpsy.2017.05.013]
Abstract
Studies of crossmodal interactions in motion perception have reported activation in several brain areas, including those related to motion processing and/or sensory association, in response to multimodal (e.g., visual and auditory) stimuli that were both in motion. Recent studies have demonstrated that sounds can trigger illusory visual apparent motion to static visual stimuli (sound-induced visual motion: SIVM): a visual stimulus blinking at a fixed location is perceived to be moving laterally when an alternating left-right sound is also present. Here, we investigated brain activity related to the perception of SIVM using a 7T functional magnetic resonance imaging technique. Specifically, we focused on the patterns of neural activity in SIVM and visually induced visual apparent motion (VIVM). We observed shared activations in the middle occipital area (V5/hMT), which is thought to be involved in visual motion processing, for SIVM and VIVM. Moreover, as compared to VIVM, SIVM resulted in greater activation in the superior temporal area and dominant functional connectivity between the V5/hMT area and the areas related to auditory and crossmodal motion processing. These findings indicate that similar but partially different neural mechanisms could be involved in auditory-induced and visually-induced motion perception, and that neural signals in auditory, visual, and crossmodal motion processing areas closely and directly interact in the perception of SIVM.
20
Schmitz G, Effenberg AO. Schlagmann 2.0 – Bewegungsakustische Dimensionen interpersonaler Koordination im Mannschaftssport [Schlagmann 2.0 – movement-acoustic dimensions of interpersonal coordination in team sports]. Ger J Exerc Sport Res 2017. [DOI: 10.1007/s12662-017-0442-7]
21
Kayser SJ, Philiastides MG, Kayser C. Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations. Neuroimage 2017; 148:31-41. [PMID: 28082107] [PMCID: PMC5349847] [DOI: 10.1016/j.neuroimage.2017.01.010]
Abstract
Sensory discriminations, such as judgements about visual motion, often benefit from multisensory evidence. Despite many reports of enhanced brain activity during multisensory conditions, it remains unclear which dynamic processes implement the multisensory benefit for an upcoming decision in the human brain. Specifically, it remains difficult to attribute perceptual benefits to specific processes, such as early sensory encoding, the transformation of sensory representations into a motor response, or more unspecific processes such as attention. We combined an audio-visual motion discrimination task with the single-trial mapping of dynamic sensory representations in EEG activity to localize when and where multisensory congruency facilitates perceptual accuracy. Our results show that a congruent sound facilitates the encoding of motion direction in occipital sensory - as opposed to parieto-frontal - cortices, and facilitates later - as opposed to early (i.e., below 100 ms) - sensory activations. This multisensory enhancement was visible as an earlier rise of motion-sensitive activity in middle-occipital regions about 350 ms from stimulus onset, which reflected the better discriminability of motion direction from brain activity and correlated with the perceptual benefit provided by congruent multisensory information. This supports a hierarchical model of multisensory integration in which the enhancement of relevant sensory cortical representations is transformed into a more accurate choice. Feature-specific multisensory integration occurs in sensory, not amodal, cortex. Feature-specific integration occurs late, i.e., around 350 ms post stimulus onset. Acoustic and visual representations interact in occipital motion regions.
Affiliation(s)
- Stephanie J Kayser
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
- Christoph Kayser
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
22
Berger CC, Ehrsson HH. Auditory Motion Elicits a Visual Motion Aftereffect. Front Neurosci 2016; 10:559. [PMID: 27994538] [PMCID: PMC5136551] [DOI: 10.3389/fnins.2016.00559]
Abstract
The visual motion aftereffect is a visual illusion in which exposure to continuous motion in one direction leads to a subsequent illusion of visual motion in the opposite direction. Previous findings have been mixed with regard to whether this visual illusion can be induced cross-modally by auditory stimuli. Based on research on multisensory perception demonstrating the profound influence auditory perception can have on the interpretation and perceived motion of visual stimuli, we hypothesized that exposure to auditory stimuli with strong directional motion cues should induce a visual motion aftereffect. Here, we demonstrate that horizontally moving auditory stimuli induced a significant visual motion aftereffect—an effect that was driven primarily by a change in visual motion perception following exposure to leftward moving auditory stimuli. This finding is consistent with the notion that visual and auditory motion perception rely on at least partially overlapping neural substrates.
Affiliation(s)
- H Henrik Ehrsson
- Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
23
Abstract
We asked whether the perceived direction of visual motion and contrast thresholds for motion discrimination are influenced by the concurrent motion of an auditory sound source. Visual motion stimuli were counterphasing Gabor patches, whose net motion energy was manipulated by adjusting the contrast of the leftward-moving and rightward-moving components. The presentation of these visual stimuli was paired with the simultaneous presentation of auditory stimuli, whose apparent motion in 3D auditory space (rightward, leftward, static, no sound) was manipulated using interaural time and intensity differences, and Doppler cues. In experiment 1, observers judged whether the Gabor visual stimulus appeared to move rightward or leftward. In experiment 2, contrast discrimination thresholds for detecting the interval containing unequal (rightward or leftward) visual motion energy were obtained under the same auditory conditions. Experiment 1 showed that the perceived direction of ambiguous visual motion is powerfully influenced by concurrent auditory motion, such that auditory motion 'captured' ambiguous visual motion. Experiment 2 showed that this interaction occurs at a sensory stage of processing as visual contrast discrimination thresholds (a criterion-free measure of sensitivity) were significantly elevated when paired with congruent auditory motion. These results suggest that auditory and visual motion signals are integrated and combined into a supramodal (audiovisual) representation of motion.
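The stimulus construction described here has a compact standard form: a counterphasing grating is the sum of two gratings drifting in opposite directions, and the contrast ratio of the components sets the net motion energy (a textbook decomposition, not necessarily the authors' exact notation):

$$C(x,t) = c_L \sin\big(2\pi(fx + \omega t)\big) + c_R \sin\big(2\pi(fx - \omega t)\big), \qquad m = \frac{c_R - c_L}{c_R + c_L},$$

where f is the spatial frequency, ω the temporal frequency, and the imbalance m runs from -1 (pure leftward motion) through 0 (pure counterphase flicker) to +1 (pure rightward motion). Manipulating c_L and c_R thus adjusts the net motion energy that the concurrent auditory motion then biases or captures.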
24
Scholz DS, Rohde S, Nikmaram N, Brückner HP, Großbach M, Rollnik JD, Altenmüller EO. Sonification of Arm Movements in Stroke Rehabilitation - A Novel Approach in Neurologic Music Therapy. Front Neurol 2016; 7:106. [PMID: 27445970] [PMCID: PMC4928599] [DOI: 10.3389/fneur.2016.00106]
Abstract
Gross motor impairments are common after stroke, but efficient and motivating therapies for these impairments are scarce. We present an innovative musical sonification therapy especially designed to retrain patients' gross motor functions. Sonification should motivate patients and provide additional sensory input informing about relative limb position. Twenty-five stroke patients were included in a clinical pre-post study and took part in the sonification training. The patients' upper extremity functions, their psychological states, and their arm movement smoothness were assessed before and after training. Patients were randomly assigned to one of two groups, which received an average of 10 days (M = 9.88; SD = 2.03; 30 min/day) of either musical sonification therapy [music group (MG)] or a sham sonification movement training [control group (CG)]. The only difference between the two protocols was that in the CG no sound was played back during training. In the beginning, patients explored the acoustic effects of their arm movements in space; at the end of the training, the patients played simple melodies by coordinated arm movements. The 15 patients in the MG showed significantly reduced joint pain (F = 19.96, p < 0.001) in the Fugl-Meyer assessment after training. They also reported a trend toward improved hand function in the stroke impact scale as compared to the CG. Movement smoothness on day 1, day 5, and the last day of the intervention was compared in MG patients and found to be significantly better after the therapy. Taken together, musical sonification may be a promising therapy for motor impairments after stroke, but further research is required since estimated effect sizes point to moderate treatment outcomes.
Affiliation(s)
- Daniel S Scholz
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Hannover, Germany
- Sönke Rohde
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Hannover, Germany
- Nikou Nikmaram
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Hannover, Germany
- Hans-Peter Brückner
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Hannover, Germany
- Michael Großbach
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Hannover, Germany
- Jens D Rollnik
- Institute for Neurorehabilitational Research (InFo), BDH-Clinic Hessisch Oldendorf, Teaching Hospital of Hannover Medical School (MHH), Hessisch Oldendorf, Germany
- Eckart O Altenmüller
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Hannover, Germany
25
Effenberg AO, Fehse U, Schmitz G, Krueger B, Mechling H. Movement Sonification: Effects on Motor Learning beyond Rhythmic Adjustments. Front Neurosci 2016; 10:219. [PMID: 27303255] [PMCID: PMC4883456] [DOI: 10.3389/fnins.2016.00219]
Abstract
Motor learning is based on motor perception and emergent perceptual-motor representations. Much behavioral research has addressed single perceptual modalities, but over the last two decades the contribution of multimodal perception to motor behavior has received growing attention. A growing number of studies indicates an enhanced impact of multimodal stimuli on motor perception, motor control, and motor learning in terms of better precision and higher reliability of the related actions. Behavioral research is supported by neurophysiological data revealing that multisensory integration supports motor control and learning. But the overwhelming part of both research lines is dedicated to basic research. Besides research in the domains of music, dance, and motor rehabilitation, there is almost no evidence for enhanced effectiveness of multisensory information on the learning of gross motor skills. To reduce this gap, movement sonification is used here in applied research on motor learning in sports. Based on current knowledge of the multimodal organization of the perceptual system, we generate additional real-time movement information suitable for integration with perceptual feedback streams of the visual and proprioceptive modalities. With ongoing training, synchronously processed auditory information should be integrated into the emerging internal models, enhancing the efficacy of motor learning. This is achieved by a direct mapping of kinematic and dynamic motion parameters to electronic sounds, resulting in continuous auditory and convergent audiovisual or audio-proprioceptive stimulus arrays. In sharp contrast to other approaches using acoustic information as error feedback in motor learning settings, we try to generate additional movement information suitable for accelerating and enhancing adequate sensorimotor representations and processable below the level of consciousness. In the experimental setting, participants were asked to learn a closed motor skill (technique acquisition in indoor rowing). One group was treated with visual information and two groups with audiovisual information (sonification vs. natural sounds). For all three groups learning became evident and remained stable. Participants treated with additional movement sonification showed better performance than both other groups. These results indicate that movement sonification enhances motor learning of a complex gross motor skill - even beyond the usually expected acoustic rhythmic effects on motor learning.
Affiliation(s)
- Alfred O Effenberg
- Faculty of Humanities, Institute of Sports Science, Leibniz Universität Hannover, Hanover, Germany
- Ursula Fehse
- Faculty of Humanities, Institute of Sports Science, Leibniz Universität Hannover, Hanover, Germany
- Gerd Schmitz
- Faculty of Humanities, Institute of Sports Science, Leibniz Universität Hannover, Hanover, Germany
- Bjoern Krueger
- Computer Science, Faculty of Mathematics and Natural Sciences, Institute of Computer Science II, University of Bonn, Bonn, Germany
- Heinz Mechling
- Institute of Sport Gerontology, German Sport University Cologne, Cologne, Germany
26
Hidaka S, Teramoto W, Sugita Y. Spatiotemporal Processing in Crossmodal Interactions for Perception of the External World: A Review. Front Integr Neurosci 2015; 9:62. [PMID: 26733827] [PMCID: PMC4686600] [DOI: 10.3389/fnint.2015.00062]
Abstract
Research regarding crossmodal interactions has garnered much interest in the last few decades. A variety of studies have demonstrated that multisensory information (vision, audition, tactile sensation, and so on) can perceptually interact in the spatial and temporal domains. Findings regarding crossmodal interactions in the spatiotemporal domain (i.e., motion processing) have also been reported, with updates in the last few years. In this review, we summarize past and recent findings on spatiotemporal processing in crossmodal interactions regarding perception of the external world. A traditional view of crossmodal interactions holds that vision is superior to audition in spatial processing, but audition is dominant over vision in temporal processing. Similarly, vision is considered to have dominant effects over the other sensory modalities (i.e., visual capture) in spatiotemporal processing. However, recent findings demonstrate that sound can have a driving effect on visual motion perception. Moreover, studies of perceptual associative learning have reported that, after an association is established between a sound sequence without spatial information and visual motion information, the sound sequence can trigger visual motion perception. Other sensory information, such as motor action or smell, has also exhibited similar driving effects on visual motion perception. Additionally, recent brain imaging studies demonstrate that similar activation patterns can be observed in several brain areas, including the motion processing areas, between spatiotemporal information from different sensory modalities. Based on these findings, we suggest that multimodal information may mutually interact in spatiotemporal processing in the perception of the external world and that common underlying perceptual and neural mechanisms exist for spatiotemporal processing.
Collapse
Affiliation(s)
- Souta Hidaka
- Department of Psychology, Rikkyo University Saitama, Japan
| | - Wataru Teramoto
- Department of Psychology, Kumamoto University Kumamoto, Japan
| | - Yoichi Sugita
- Department of Psychology, Waseda University Tokyo, Japan
| |
Collapse
|
27
|
Krebber M, Harwood J, Spitzer B, Keil J, Senkowski D. Visuotactile motion congruence enhances gamma-band activity in visual and somatosensory cortices. Neuroimage 2015; 117:160-9. [DOI: 10.1016/j.neuroimage.2015.05.056] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/11/2015] [Revised: 04/15/2015] [Accepted: 05/19/2015] [Indexed: 11/16/2022] Open
|
28
|
Scholz DS, Rhode S, Großbach M, Rollnik J, Altenmüller E. Moving with music for stroke rehabilitation: a sonification feasibility study. Ann N Y Acad Sci 2015; 1337:69-76. [PMID: 25773619 DOI: 10.1111/nyas.12691] [Citation(s) in RCA: 30] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/17/2023]
Abstract
Gross-motor impairments are common after stroke, but efficacious and motivating therapies for these impairments are scarce. We present a novel musical sonification therapy designed specifically to retrain gross-motor functions. Four stroke patients were included in a clinical pre-post feasibility study and underwent our sonification training. Patients' upper-extremity functions and psychological states were assessed before and after training. The four patients were subdivided into two groups, one receiving 9 days of musical sonification therapy (music group, MG) and the other a sham sonification training (control group, CG). The only difference between the two training protocols was that no sound was played back in the CG. During the training, the patients initially explored the acoustic effects of their arm movements, and by the end of the training they played simple melodies by moving their arms. The two patients in the MG improved in nearly all motor function tests after the training. On the stroke impact scale, which assesses well-being, memory, thinking, and social participation, they also reported being less impaired by the stroke. The two patients in the CG benefited less from the movement training. Taken together, musical sonification may be a promising therapy for impairments after stroke.
Collapse
Affiliation(s)
- Daniel S Scholz
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama, and Media, Hannover, Germany
| | | | | | | | | |
Collapse
|
29
|
Kafaligonul H, Oluk C. Audiovisual associations alter the perception of low-level visual motion. Front Integr Neurosci 2015; 9:26. [PMID: 25873869 PMCID: PMC4379893 DOI: 10.3389/fnint.2015.00026] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2014] [Accepted: 03/14/2015] [Indexed: 11/13/2022] Open
Abstract
Motion perception is a pervasive feature of vision and is affected both by the immediate pattern of sensory inputs and by prior experience acquired through associations. Recently, several studies reported that an association can be established quickly between directions of visual motion and static sounds of distinct frequencies. After the association is formed, sounds are able to change the perceived direction of visual motion. To determine whether such rapidly acquired audiovisual associations, and their subsequent influence on visual motion perception, depend on the involvement of higher-order attentive tracking mechanisms, we designed psychophysical experiments using regular and reverse-phi random-dot motion that isolates low-level, pre-attentive motion processing. Our results show that an association between the directions of low-level visual motion and static sounds can be formed, and that this audiovisual association alters the subsequent perception of low-level visual motion. These findings support the view that audiovisual associations are not restricted to the high-level, attention-based motion system and that early-level visual motion processing also plays a role.
Collapse
Affiliation(s)
- Hulusi Kafaligonul
- National Magnetic Resonance Research Center (UMRAM), Bilkent University Ankara, Turkey
| | - Can Oluk
- Department of Psychology, Bilkent University Ankara, Turkey
| |
Collapse
|
30
|
The effect of real-time auditory feedback on learning new characters. Hum Mov Sci 2014; 43:216-28. [PMID: 25533208 DOI: 10.1016/j.humov.2014.12.002] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/20/2013] [Revised: 11/24/2014] [Accepted: 12/03/2014] [Indexed: 11/22/2022]
Abstract
The present study investigated the effect of handwriting sonification on graphomotor learning. Thirty-two adults, distributed across two groups, learned four new characters with their non-dominant hand. The experimental design included a pre-test, a training session, and two post-tests, one just after the training session and another 24 h later. Two characters were learned with and two without real-time auditory feedback (FB). The first group learned the two non-sonified characters first and then the two sonified characters, whereas the reverse order was adopted for the second group. Results revealed that auditory FB improved the speed and fluency of handwriting movements but, in the short term only, reduced the spatial accuracy of the trace. Transforming kinematic variables into sounds allows the writer to perceive his or her movement in addition to the written trace, and this might facilitate handwriting learning. However, for the subjects who first learned the characters with auditory FB, there were no differential effects of auditory FB in either the short or the long term. We hypothesize that the positive effect on handwriting kinematics transferred to the characters learned without FB. This transfer effect of the auditory FB is discussed in light of the Theory of Event Coding.
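As a concrete illustration of "transforming kinematic variables into sounds", here is a minimal, hypothetical Python sketch (not the study's software) that derives pen-tip speed from sampled tablet coordinates and maps it to one pitch value per sample, the quantity a real-time synthesizer would then voice; the frequency range and linear mapping are assumptions.

# Illustrative sketch: pen-tip speed -> pitch, one value per tablet sample.
import numpy as np

def speed_to_pitch(xy, dt, f_min=200.0, f_max=1000.0):
    """xy: (N, 2) pen positions in mm; dt: sampling interval in s."""
    v = np.linalg.norm(np.diff(xy, axis=0), axis=1) / dt  # tangential speed (mm/s)
    v_norm = v / v.max() if v.max() > 0 else v            # normalise to [0, 1]
    return f_min + v_norm * (f_max - f_min)               # assumed linear mapping

# Example: a short synthetic stroke sampled at 200 Hz.
trace = np.column_stack([np.linspace(0.0, 30.0, 200),
                         5.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 200))])
pitches = speed_to_pitch(trace, dt=1.0 / 200.0)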
Collapse
|
31
|
Scholz DS, Wu L, Pirzer J, Schneider J, Rollnik JD, Großbach M, Altenmüller EO. Sonification as a possible stroke rehabilitation strategy. Front Neurosci 2014; 8:332. [PMID: 25368548 PMCID: PMC4202805 DOI: 10.3389/fnins.2014.00332] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2014] [Accepted: 10/01/2014] [Indexed: 11/24/2022] Open
Abstract
Despite cerebral stroke being one of the main causes of acquired impairments of motor skills worldwide, well-established therapies to improve motor functions are sparse. Recently, attempts have been made to improve gross-motor rehabilitation by mapping patient movements to sound, termed sonification. Sonification provides additional sensory input, supplementing impaired proprioception. However, to date no established sonification-supported rehabilitation protocol exists. In order to examine and validate the effectiveness of sonification in stroke rehabilitation, we developed a computer program, termed "SonicPointer": participants' computer mouse movements were sonified in real time with complex tones. Tone characteristics were derived from an invisible parameter mapping overlaid on the computer screen. The parameters were tone pitch and tone brightness; one parameter varied along the x axis, the other along the y axis. The assignment of parameters to axes was balanced in two blocks between subjects, so that each participant performed under both conditions. Subjects were naive to the overlaid parameter mappings and to their change between blocks. In each trial, a target tone was presented, and subjects were instructed to indicate its origin with respect to the overlaid parameter mapping on the screen as quickly and accurately as possible with a mouse click. Twenty-six healthy elderly participants were tested. Response time and two-dimensional accuracy were recorded, and trial durations and learning curves were derived. We hypothesized that subjects would perform better under one of the two parameter-to-axis mappings, indicating the more natural sonification. Generally, subjects' localization performance was better along the pitch axis than along the brightness axis. Furthermore, learning curves were steepest when pitch was mapped onto the vertical axis and brightness onto the horizontal axis. This appears to be the optimal arrangement for this two-dimensional sonification.
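The following Python sketch reconstructs the flavor of such a two-dimensional screen-to-sound mapping. It is a hypothetical reading of the description above, not the SonicPointer code: the frequency range, the use of a harmonic-decay factor for brightness, and the axis assignment are all assumptions.

# Hypothetical 2-D mapping: y -> pitch, x -> brightness (harmonic weighting).
import numpy as np

SR = 44100  # audio sample rate (Hz)

def screen_tone(x, y, dur=0.5):
    """x, y in [0, 1]: screen position normalised to the unit square."""
    f0 = 200.0 * 2.0 ** (2.0 * y)        # pitch: two octaves along y (200-800 Hz)
    decay = 0.2 + 0.75 * x               # brightness: higher x -> stronger harmonics
    t = np.linspace(0.0, dur, int(SR * dur), endpoint=False)
    tone = sum(decay ** (k - 1) * np.sin(2.0 * np.pi * k * f0 * t)
               for k in range(1, 9))     # complex tone with eight harmonics
    return tone / np.max(np.abs(tone))

# A click near the top-right of the screen: high pitch, bright timbre.
samples = screen_tone(x=0.9, y=0.8)

Swapping which coordinate feeds f0 and which feeds decay reproduces the two counterbalanced blocks described in the abstract.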
Collapse
Affiliation(s)
- Daniel S Scholz
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media Hannover, Germany
| | - Liming Wu
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media Hannover, Germany
| | - Jonas Pirzer
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media Hannover, Germany
| | - Johann Schneider
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media Hannover, Germany
| | - Jens D Rollnik
- Institute for Neurorehabilitational Research (InFo), BDH-Clinic Hessisch Oldendorf, Teaching Hospital of Hannover Medical School (MHH) Hessisch Oldendorf, Germany
| | - Michael Großbach
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media Hannover, Germany
| | - Eckart O Altenmüller
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media Hannover, Germany
| |
Collapse
|
32
|
Zilber N, Ciuciu P, Gramfort A, Azizi L, van Wassenhove V. Supramodal processing optimizes visual perceptual learning and plasticity. Neuroimage 2014; 93 Pt 1:32-46. [DOI: 10.1016/j.neuroimage.2014.02.017] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2013] [Revised: 02/05/2014] [Accepted: 02/13/2014] [Indexed: 11/25/2022] Open
|
33
|
Chuang CH, Ko LW, Jung TP, Lin CT. Kinesthesia in a sustained-attention driving task. Neuroimage 2014; 91:187-202. [PMID: 24444995 DOI: 10.1016/j.neuroimage.2014.01.015] [Citation(s) in RCA: 53] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2013] [Revised: 12/04/2013] [Accepted: 01/11/2014] [Indexed: 11/30/2022] Open
Abstract
This study investigated the effects of kinesthetic stimuli on brain activities during a sustained-attention task in an immersive driving simulator. Tonic and phasic brain responses on multiple timescales were analyzed using time-frequency analysis of electroencephalographic (EEG) sources identified by independent component analysis (ICA). Sorting EEG spectra with respect to reaction times (RT) to randomly introduced lane-departure events revealed distinct effects of kinesthetic stimuli on the brain under different performance levels. Experimental results indicated that EEG spectral dynamics highly correlated with performance lapses when driving involved kinesthetic feedback. Furthermore, in the realistic environment involving both visual and kinesthetic feedback, a transitive relationship of power spectra between optimal-, suboptimal-, and poor-performance groups was found predominately across most of the independent components. In contrast to the static environment with visual input only, kinesthetic feedback reduced theta-power augmentation in the central and frontal components when preparing for action and error monitoring, while strengthening alpha suppression in the central component while steering the wheel. In terms of behavior, subjects tended to have a short response time to process unexpected events with the assistance of kinesthesia, yet only when their performance was optimal. Decrease in attentional demand, facilitated by kinesthetic feedback, eventually significantly increased the reaction time in the suboptimal-performance state. Neurophysiological evidence of mutual relationships between behavioral performance and neurocognition in complex task paradigms and experimental environments, presented in this study, might elucidate our understanding of distributed brain dynamics, supporting natural human cognition and complex coordinated, multi-joint naturalistic behavior, and lead to improved understanding of brain-behavior relations in operating environments.
Collapse
Affiliation(s)
- Chun-Hsiang Chuang
- Brain Research Center, National Chiao Tung University, Hsinchu, Taiwan; Institute of Electrical Control Engineering, Department of Electrical and Computer Engineering, National Chiao Tung University, Hsinchu, Taiwan; Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, CA, USA
| | - Li-Wei Ko
- Brain Research Center, National Chiao Tung University, Hsinchu, Taiwan; Department of Biological Science and Technology, National Chiao Tung University, Hsinchu, Taiwan
| | - Tzyy-Ping Jung
- Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California, San Diego, CA, USA; Center for Advanced Neurological Engineering, Institute of Engineering in Medicine, University of California, San Diego, CA, USA.
| | - Chin-Teng Lin
- Brain Research Center, National Chiao Tung University, Hsinchu, Taiwan; Institute of Electrical Control Engineering, Department of Electrical and Computer Engineering, National Chiao Tung University, Hsinchu, Taiwan; Center for Advanced Neurological Engineering, Institute of Engineering in Medicine, University of California, San Diego, CA, USA.
| |
Collapse
|
34
|
Tactile and visual motion direction processing in hMT+/V5. Neuroimage 2014; 84:420-7. [DOI: 10.1016/j.neuroimage.2013.09.004] [Citation(s) in RCA: 48] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/16/2013] [Revised: 08/20/2013] [Accepted: 09/03/2013] [Indexed: 11/18/2022] Open
|
35
|
Gleiss S, Kayser C. Oscillatory mechanisms underlying the enhancement of visual motion perception by multisensory congruency. Neuropsychologia 2014; 53:84-93. [DOI: 10.1016/j.neuropsychologia.2013.11.005] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/12/2013] [Revised: 10/10/2013] [Accepted: 11/11/2013] [Indexed: 12/30/2022]
|
36
|
Dubus G, Bresin R. A systematic review of mapping strategies for the sonification of physical quantities. PLoS One 2013; 8:e82491. [PMID: 24358192 PMCID: PMC3866150 DOI: 10.1371/journal.pone.0082491] [Citation(s) in RCA: 55] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/30/2013] [Accepted: 10/24/2013] [Indexed: 11/23/2022] Open
Abstract
The field of sonification has progressed greatly over the past twenty years and currently constitutes an established area of research. This article aims at exploiting and organizing the knowledge accumulated in previous experimental studies to build a foundation for future sonification works. A systematic review of these studies may reveal trends in sonification design, and therefore support the development of design guidelines. To this end, we have reviewed and analyzed 179 scientific publications related to sonification of physical quantities. Using a bottom-up approach, we set up a list of conceptual dimensions belonging to both physical and auditory domains. Mappings used in the reviewed works were identified, forming a database of 495 entries. Frequency of use was analyzed among these conceptual dimensions as well as higher-level categories. Results confirm two hypotheses formulated in a preliminary study: pitch is by far the most used auditory dimension in sonification applications, and spatial auditory dimensions are almost exclusively used to sonify kinematic quantities. To detect successful as well as unsuccessful sonification strategies, assessment of mapping efficiency conducted in the reviewed works was considered. Results show that a proper evaluation of sonification mappings is performed only in a marginal proportion of publications. Additional aspects of the publication database were investigated: historical distribution of sonification works is presented, projects are classified according to their primary function, and the sonic material used in the auditory display is discussed. Finally, a mapping-based approach for characterizing sonification is proposed.
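The frequency-of-use analysis described above amounts to tallying (physical quantity, auditory dimension) pairs across publications. A minimal Python sketch of that kind of tally follows; the entries are invented examples, not records from the authors' database.

# Illustrative tally (hypothetical entries, not the reviewed database): count
# how often each physical-to-auditory mapping occurs across surveyed works.
from collections import Counter

mappings = [  # (physical quantity, auditory dimension), one entry per mapping
    ("velocity", "pitch"), ("position", "spatialization"),
    ("force", "loudness"), ("velocity", "pitch"), ("angle", "pitch"),
]
by_pair = Counter(mappings)
by_auditory = Counter(aud for _, aud in mappings)
print(by_pair.most_common(3))   # which exact mappings dominate
print(by_auditory)              # e.g. pitch as the most used auditory dimension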
Collapse
Affiliation(s)
- Gaël Dubus
- Department of Speech, Music and Hearing, School of Computer Science and Communication, KTH Royal Institute of Technology, Stockholm, Sweden
| | - Roberto Bresin
- Department of Speech, Music and Hearing, School of Computer Science and Communication, KTH Royal Institute of Technology, Stockholm, Sweden
| |
Collapse
|
37
|
Ogawa A, Macaluso E. Audio-visual interactions for motion perception in depth modulate activity in visual area V3A. Neuroimage 2013; 71:158-67. [PMID: 23333414 DOI: 10.1016/j.neuroimage.2013.01.012] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/16/2012] [Revised: 12/20/2012] [Accepted: 01/09/2013] [Indexed: 11/28/2022] Open
Abstract
Multisensory signals can enhance the spatial perception of objects and events in the environment. Changes of visual size and auditory intensity provide us with the main cues about motion direction in depth. However, frequency changes in audition and binocular disparity in vision also contribute to the perception of motion in depth. Here, we presented subjects with several combinations of auditory and visual depth-cues to investigate multisensory interactions during processing of motion in depth. The task was to discriminate the direction of auditory motion in depth according to increasing or decreasing intensity. Rising or falling auditory frequency provided an additional within-audition cue that matched or did not match the intensity change (i.e. intensity-frequency (IF) "matched vs. unmatched" conditions). In two-thirds of the trials, a task-irrelevant visual stimulus moved either in the same or opposite direction of the auditory target, leading to audio-visual "congruent vs. incongruent" between-modalities depth-cues. Furthermore, these conditions were presented either with or without binocular disparity. Behavioral data showed that the best performance was observed in the audio-visual congruent condition with IF matched. Brain imaging results revealed maximal response in visual area V3A when all cues provided congruent and reliable depth information (i.e. audio-visual congruent, IF-matched condition including disparity cues). Analyses of effective connectivity revealed increased coupling from auditory cortex to V3A specifically in audio-visual congruent trials. We conclude that within- and between-modalities cues jointly contribute to the processing of motion direction in depth, and that they do so via dynamic changes of connectivity between visual and auditory cortices.
Collapse
Affiliation(s)
- Akitoshi Ogawa
- Neuroimaging Laboratory, IRCCS, Santa Lucia Foundation, Via Ardeatina 306, Rome 00179, Italy.
| | | |
Collapse
|
38
|
Schmitz G, Mohammadi B, Hammer A, Heldmann M, Samii A, Münte TF, Effenberg AO. Observation of sonified movements engages a basal ganglia frontocortical network. BMC Neurosci 2013; 14:32. [PMID: 23496827 PMCID: PMC3602090 DOI: 10.1186/1471-2202-14-32] [Citation(s) in RCA: 45] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/28/2012] [Accepted: 03/07/2013] [Indexed: 11/25/2022] Open
Abstract
Background: Producing sounds with a musical instrument can lead to audiomotor coupling, i.e., the joint activation of the auditory and motor systems, even when only one modality is probed. The sonification of otherwise mute movements by sounds based on kinematic parameters of the movement has been shown to improve motor performance and the perception of movements. Results: Here we demonstrate in a group of healthy young non-athletes that congruently (sounds match visual movement kinematics) vs. incongruently (no match) sonified breaststroke movements of a human avatar lead to better perceptual judgement of small differences in movement velocity. Moreover, functional magnetic resonance imaging revealed enhanced activity in superior and medial posterior temporal regions, including the superior temporal sulcus (STS), known as an important multisensory integration site, as well as in the insula bilaterally and the precentral gyrus on the right side. Functional connectivity analysis revealed pronounced connectivity of the STS with the basal ganglia and thalamus as well as with frontal motor regions for the congruent stimuli. This was not seen to the same extent for the incongruent stimuli. Conclusions: We conclude that sonification of movements amplifies the activity of the human action observation system, including subcortical structures of the motor loop. Sonification may thus be an important method to enhance training and therapy effects in sports science and neurological rehabilitation.
Collapse
Affiliation(s)
- Gerd Schmitz
- Institute of Sports Science, University of Hannover, Hannover, Germany
| | | | | | | | | | | | | |
Collapse
|
39
|
Vinken PM, Kröger D, Fehse U, Schmitz G, Brock H, Effenberg AO. Auditory coding of human movement kinematics. Multisens Res 2013; 26:533-52. [PMID: 24800411 DOI: 10.1163/22134808-00002435] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
Abstract
Although visual perception is dominant in motor perception, control, and learning, auditory information can enhance and modulate perceptual as well as motor processes in a multifaceted manner. Over the last decades, new methods of auditory augmentation have been developed, with movement sonification as one of the most recent approaches, extending auditory movement information also to usually silent phases of movement. Despite general evidence for the effectiveness of movement sonification in different fields of applied research, there is almost no empirical evidence on how sonification of gross-motor human movement should be configured to achieve information-rich sound sequences. This lack of evidence concerns (a) the selection of suitable movement features, (b) effective kinematic-acoustical mapping patterns, and (c) the number of dimensions of the sonification. In this study, we explore the informational content of artificial acoustical kinematics, in terms of a kinematic movement sonification, using an intermodal discrimination paradigm. In a repeated-measures design, we analysed discrimination rates of six everyday upper-limb actions to evaluate the effectiveness of seven different kinematic-acoustical mappings as well as short-term learning effects. The kinematics of the upper-limb actions were calculated from inertial motion sensor data and transformed into seven different sonifications. Sound sequences were presented to participants in random order, and discrimination rates as well as confidence of choice were analysed. The data indicate an immediate comprehensibility of the artificial movement acoustics as well as short-term learning effects. No differences between the dimensional encodings became evident, indicating high efficiency of intermodal pattern discrimination for the acoustically coded velocity distribution of the actions. Taken together, movement information related to continuous kinematic parameters can be transformed into the auditory domain. Moreover, pattern-based action discrimination is evidently not restricted to the visual modality. Artificial acoustical kinematics might be used to supplement and/or substitute visual motion perception in sports and motor rehabilitation.
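To illustrate what a multi-dimensional kinematic-acoustical mapping of this kind might look like in code, here is a minimal Python sketch under stated assumptions (it is not the study's pipeline): hand speed is crudely estimated from inertial-sensor accelerations and encoded simultaneously in two auditory dimensions, pitch and loudness.

# Illustrative sketch: inertial data -> speed -> two auditory dimensions.
import numpy as np

def sonification_params(acc, fs):
    """acc: (N, 3) accelerations in m/s^2 (gravity removed); fs: sample rate (Hz)."""
    vel = np.cumsum(acc, axis=0) / fs        # crude velocity estimate (m/s)
    speed = np.linalg.norm(vel, axis=1)
    s = speed / (speed.max() + 1e-9)         # normalise to [0, 1]
    pitch_hz = 150.0 * 2.0 ** (2.0 * s)      # dimension 1: pitch over two octaves
    amplitude = 0.1 + 0.9 * s                # dimension 2: loudness
    return pitch_hz, amplitude

# Example with synthetic data from a 100 Hz motion sensor.
rng = np.random.default_rng(0)
acc = rng.normal(0.0, 0.5, size=(300, 3))
pitch, amp = sonification_params(acc, fs=100.0)

Each of the seven mappings in the study would then correspond to a different choice of which kinematic features feed which, and how many, acoustic dimensions.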
Collapse
|
40
|
Klemen J, Chambers CD. Current perspectives and methods in studying neural mechanisms of multisensory interactions. Neurosci Biobehav Rev 2012; 36:111-33. [PMID: 21569794 DOI: 10.1016/j.neubiorev.2011.04.015] [Citation(s) in RCA: 66] [Impact Index Per Article: 5.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/08/2011] [Accepted: 04/21/2011] [Indexed: 11/22/2022]
Abstract
In the past decade, neuroscience has witnessed major advances in the field of multisensory interactions. A large body of research has revealed several new types of cross-sensory interactions. In addition, multisensory interactions have been reported at temporal and spatial system levels previously thought of as strictly unimodal. We review the findings that have led to the current broad consensus that most, if not all, higher- as well as lower-level neural processes are in some form multisensory. We continue by outlining the progress that has been made in identifying the functional significance of different types of interactions, for example in subserving stimulus binding and enhancing perceptual certainty. Finally, we provide a critical introduction to cutting-edge methods, from Bayes-optimal integration to multivoxel pattern analysis, as applied to multisensory research at different system levels.
Collapse
Affiliation(s)
- Jane Klemen
- School of Psychology, Cardiff University Brain Research Imaging Centre (CUBRIC), Cardiff University, Tower Building, Park Place, Cardiff CF10 3AT, UK.
| | | |
Collapse
|
41
|
Abstract
It is well known that the nervous system combines information from different cues within and across sensory modalities to improve performance on perceptual tasks. In this article, we present results showing that in a visual motion-detection task, concurrent auditory motion stimuli improve accuracy even when they do not provide any useful information for the task. When participants judged which of two stimulus intervals contained visual coherent motion, the addition of identical moving sounds to both intervals improved accuracy. However, this enhancement occurred only with sounds that moved in the same direction as the visual motion. Therefore, it appears that the observed benefit of auditory stimulation is due to auditory-visual interactions at a sensory level. Thus, auditory and visual motion-processing pathways interact at a sensory-representation level in addition to the level at which perceptual estimates are combined.
Collapse
|
42
|
Hearing the speed: visual motion biases the perception of auditory tempo. Exp Brain Res 2011; 214:357-71. [DOI: 10.1007/s00221-011-2835-4] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/12/2011] [Accepted: 08/03/2011] [Indexed: 10/17/2022]
|
43
|
Abstract
Our vision remains stable even though the movements of our eyes, head, and bodies create a motion pattern on the retina. One of the most important, yet basic, feats of the visual system is to correctly determine whether this retinal motion is due to real movement in the world or to our own self-movement. This problem has occupied many great thinkers, such as Descartes and Helmholtz, at least since the time of Alhazen. This theme issue brings together leading researchers from animal neurophysiology, clinical neurology, psychophysics, and cognitive neuroscience to summarize the state of the art in the study of visual stability. Recently, there has been significant progress in understanding the limits of visual stability in humans and in identifying many of the brain circuits involved in maintaining a stable percept of the world. Clinical studies and new experimental methods, such as transcranial magnetic stimulation, now make it possible to test the causal role of different brain regions in creating visual stability and also allow us to measure the consequences when the mechanisms of visual stability break down.
Collapse
Affiliation(s)
- David Melcher
- Faculty of Cognitive Science, University of Trento, Italy.
| |
Collapse
|
44
|
Audiovisual synchrony improves motion discrimination via enhanced connectivity between early visual and auditory areas. J Neurosci 2010; 30:12329-39. [PMID: 20844129 DOI: 10.1523/jneurosci.5745-09.2010] [Citation(s) in RCA: 102] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Audiovisual synchrony enables integration of dynamic visual and auditory signals into a more robust and reliable multisensory percept. In this fMRI study, we investigated the neural mechanisms by which audiovisual synchrony facilitates shape and motion discrimination under degraded visual conditions. Subjects were presented with visual patterns that were rotated by discrete increments at irregular and unpredictable intervals while partially obscured by a dynamic noise mask. On synchronous trials, each rotation coincided with an auditory click. On asynchronous trials, clicks were noncoincident with the rotational movements (but with identical temporal statistics). Subjects discriminated shape or rotational motion profile of the partially hidden visual stimuli. Regardless of task context, synchronous signals increased activations bilaterally in (1) calcarine sulcus (CaS) extending into ventral occipitotemporal cortex and (2) Heschl's gyrus extending into planum temporale (HG/PT) compared with asynchronous signals. Adjacent to these automatic synchrony effects, synchrony-induced activations in lateral occipital (LO) regions were amplified bilaterally during shape discrimination and in the right posterior superior temporal sulcus (pSTS) during motion discrimination. Subjects' synchrony-induced benefits in motion discrimination significantly predicted blood oxygenation level-dependent synchrony effects in V5/hMT+. According to dynamic causal modeling, audiovisual synchrony increased connectivity between CaS and HG/PT bidirectionally, whereas shape and motion tasks increased forwards connectivity from CaS to LO or to pSTS, respectively. To increase the salience of partially obscured moving objects, audiovisual synchrony may amplify visual activations by increasing the connectivity between low level visual and auditory areas. These automatic synchrony-induced response amplifications may then be gated to higher order areas according to behavioral relevance and task context.
Collapse
|
45
|
Shams L, Kim R. Crossmodal influences on visual perception. Phys Life Rev 2010; 7:269-84. [DOI: 10.1016/j.plrev.2010.04.006] [Citation(s) in RCA: 87] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/19/2010] [Revised: 03/25/2010] [Accepted: 03/25/2010] [Indexed: 10/19/2022]
|
46
|
Tubaldi F, Turella L, Pierno AC, Grodd W, Tirindelli R, Castiello U. Smelling odors, understanding actions. Soc Neurosci 2010; 6:31-47. [PMID: 20379900 DOI: 10.1080/17470911003691089] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
Abstract
Previous evidence indicates that we understand others' actions not only by perceiving their visual features but also by hearing the sounds they produce. This raises the possibility that brain regions responsible for action understanding respond to cues from different sensory modalities. Yet no studies to date have examined whether this extends to olfaction. Here we addressed this issue using functional magnetic resonance imaging. We searched for brain activity related to the observation of an action executed towards an object that was smelled rather than seen. The results show that temporal, parietal, and frontal areas were activated when individuals observed a hand grasping a smelled object. This activity differed from that evoked during the observation of a mimed grasp. Furthermore, superadditive activity was revealed when the action target-object was both seen and smelled. Together these findings indicate the influence of olfaction on action understanding and its contribution to multimodal action representations.
Collapse
|