1. Fu X, Smulders FTY, Riecke L. Touch Helps Hearing: Evidence From Continuous Audio-Tactile Stimulation. Ear Hear 2025;46:184-195. [PMID: 39680490] [PMCID: PMC11637573] [DOI: 10.1097/aud.0000000000001566]
Abstract
OBJECTIVES Identifying target sounds in challenging environments is crucial for daily experiences. Notably, this ability can be enhanced by nonauditory stimuli, for example through lip-reading during an ongoing conversation. However, how tactile stimuli affect auditory processing remains relatively unclear. Recent studies have shown that brief tactile stimuli can reliably facilitate auditory perception, whereas studies using longer-lasting audio-tactile stimulation have yielded conflicting results. This study aimed to investigate the impact of ongoing pulsating tactile stimulation on basic auditory processing. DESIGN In experiment 1, the electroencephalogram (EEG) was recorded while 24 participants performed a loudness-discrimination task on a 4-Hz modulated tone-in-noise and received either in-phase, anti-phase, or no 4-Hz electrotactile stimulation over the median nerve. In experiment 2, another 24 participants were presented with the same tactile stimulation as before, but performed a tone-in-noise detection task while their selective auditory attention was manipulated. RESULTS We found that in-phase tactile stimulation enhanced EEG responses to the tone, whereas anti-phase tactile stimulation suppressed these responses. No corresponding tactile effects on loudness-discrimination performance were observed in experiment 1. Using a yes/no paradigm in experiment 2, we found that in-phase tactile stimulation, but not anti-phase tactile stimulation, improved detection thresholds. Selective attention also improved thresholds but did not modulate the observed benefit from in-phase tactile stimulation. CONCLUSIONS Our study highlights that ongoing in-phase tactile input can enhance basic auditory processing, as reflected in scalp EEG and detection thresholds. This might have implications for the development of hearing enhancement technologies and interventions.
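For readers less familiar with the yes/no paradigm mentioned above: detection sensitivity in such tasks is conventionally summarized as d′ = z(hit rate) − z(false-alarm rate). The sketch below is a generic illustration of that computation (with illustrative trial counts), not code or data from the study:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index (d') for a yes/no detection task.

    A log-linear correction keeps perfect hit or false-alarm rates
    from producing infinite z-scores.
    """
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return z(hit_rate) - z(fa_rate)

# Example: 40 hits / 10 misses on tone-present trials,
# 10 false alarms / 40 correct rejections on tone-absent trials
print(round(d_prime(40, 10, 10, 40), 2))  # ≈ 1.64
```

A threshold in this paradigm is then the signal level at which d′ (or percent correct) reaches a criterion value.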
Affiliation(s)
- Xueying Fu
- Faculty of Psychology and Neuroscience, Department of Cognitive Neuroscience, Maastricht University, Maastricht, the Netherlands
- Fren T. Y. Smulders
- Faculty of Psychology and Neuroscience, Department of Cognitive Neuroscience, Maastricht University, Maastricht, the Netherlands
- Lars Riecke
- Faculty of Psychology and Neuroscience, Department of Cognitive Neuroscience, Maastricht University, Maastricht, the Netherlands
2. Răutu IS, De Tiège X, Jousmäki V, Bourguignon M, Bertels J. Speech-derived haptic stimulation enhances speech recognition in a multi-talker background. Sci Rep 2023;13:16621. [PMID: 37789043] [PMCID: PMC10547762] [DOI: 10.1038/s41598-023-43644-3]
Abstract
Speech understanding, while effortless in quiet conditions, is challenging in noisy environments. Previous studies have revealed that a feasible approach to supplement speech-in-noise (SiN) perception consists of presenting speech-derived signals as haptic input. In the current study, we investigated whether the presentation of a vibrotactile signal derived from the speech temporal envelope can improve SiN intelligibility in a multi-talker background for untrained, normal-hearing listeners. We also determined whether vibrotactile sensitivity, evaluated using vibrotactile detection thresholds, modulates the extent of audio-tactile SiN improvement. In practice, we measured participants' speech recognition in a multi-talker noise without (audio-only) and with (audio-tactile) concurrent vibrotactile stimulation delivered in three schemes: to the left palm, to the right palm, or to both. Averaged across the three delivery schemes, the vibrotactile stimulation led to a significant improvement of 0.41 dB in SiN recognition compared to the audio-only condition. Notably, no significant differences were observed between the improvements across these delivery schemes. In addition, the audio-tactile SiN benefit was significantly predicted by participants' vibrotactile threshold levels and unimodal (audio-only) SiN performance. The extent of the improvement afforded by speech-envelope-derived vibrotactile stimulation was in line with previously reported vibrotactile enhancements of SiN perception in untrained listeners with no known hearing impairment. Overall, these results highlight the potential of concurrent vibrotactile stimulation to improve SiN recognition, especially in individuals with poor SiN perception abilities, and tentatively more so with increasing tactile sensitivity. Moreover, they lend support to multimodal accounts of speech perception and to research on tactile speech aid devices.
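The "speech temporal envelope" that drives the vibrotactile signal can be approximated by rectifying the waveform and low-pass filtering it. The sketch below is a generic illustration of that idea, not the authors' actual processing pipeline; an amplitude-modulated tone stands in for speech, and all parameter values are illustrative:

```python
import numpy as np

def temporal_envelope(waveform, fs, cutoff_hz=8.0):
    """Approximate the slow amplitude envelope of a waveform by
    rectification followed by a single-pole low-pass filter.
    (A generic method, not the authors' exact pipeline.)"""
    rectified = np.abs(waveform)
    # One-pole IIR low-pass: y[n] = a*x[n] + (1 - a)*y[n-1]
    a = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / fs)
    env = np.empty_like(rectified)
    acc = 0.0
    for n, x in enumerate(rectified):
        acc = a * x + (1.0 - a) * acc
        env[n] = acc
    return env

fs = 16000
t = np.arange(fs) / fs                             # 1 s of signal
modulator = 0.5 + 0.5 * np.sin(2 * np.pi * 4 * t)  # slow "speech-like" envelope
signal = np.sin(2 * np.pi * 200 * t) * modulator   # modulated 200-Hz carrier
env = temporal_envelope(signal, fs)
```

The resulting `env` tracks the slow modulation while discarding the carrier, which is the kind of low-frequency signal a vibrotactile actuator can render.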
Affiliation(s)
- I Sabina Răutu
- Laboratoire de Neuroanatomie et de Neuroimagerie Translationnelles (LN2T), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium
- Xavier De Tiège
- Laboratoire de Neuroanatomie et de Neuroimagerie Translationnelles (LN2T), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium
- Service de Neuroimagerie Translationnelle, Hôpital Universitaire de Bruxelles (H.U.B.), CUB Hôpital Erasme, Université Libre de Bruxelles (ULB), Brussels, Belgium
- Mathieu Bourguignon
- Laboratoire de Neuroanatomie et de Neuroimagerie Translationnelles (LN2T), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium
- BCBL, Basque Center on Cognition, Brain and Language, 20009, San Sebastián, Spain
- Laboratory of Neurophysiology and Movement Biomechanics, UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium
- Julie Bertels
- Laboratoire de Neuroanatomie et de Neuroimagerie Translationnelles (LN2T), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium
- ULBabylab, Center for Research in Cognition and Neurosciences (CRCN), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium
3. Fu X, Riecke L. Effects of continuous tactile stimulation on auditory-evoked cortical responses depend on the audio-tactile phase. Neuroimage 2023;274:120140. [PMID: 37120042] [DOI: 10.1016/j.neuroimage.2023.120140]
Abstract
Auditory perception can benefit from stimuli in non-auditory sensory modalities, as, for example, in lip-reading. Compared with such visual influences, tactile influences are still poorly understood. It has been shown that single tactile pulses can enhance the perception of auditory stimuli depending on their relative timing, but whether and how such brief auditory enhancements can be stretched in time with more sustained, phase-specific periodic tactile stimulation is still unclear. To address this question, we presented tactile stimulation that fluctuated coherently and continuously at 4 Hz with an auditory noise (either in-phase or anti-phase) and assessed its effect on the cortical processing and perception of an auditory signal embedded in that noise. Scalp-electroencephalography recordings revealed an enhancing effect of in-phase tactile stimulation on cortical responses phase-locked to the noise and a suppressive effect of anti-phase tactile stimulation on responses evoked by the auditory signal. Although these effects appeared to follow well-known principles of multisensory integration of discrete audio-tactile events, they were not accompanied by corresponding effects on behavioral measures of auditory signal perception. Our results indicate that continuous periodic tactile stimulation can enhance cortical processing of acoustically-induced fluctuations and mask cortical responses to an ongoing auditory signal. They further suggest that such sustained cortical effects can be insufficient for inducing sustained bottom-up auditory benefits.
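To make the phase manipulation concrete: at a 4-Hz modulation rate, "anti-phase" means the tactile envelope is shifted by half a cycle (125 ms) relative to the auditory envelope, so tactile peaks coincide with auditory troughs. A schematic sketch with illustrative parameters (not the study's actual stimulus code):

```python
import numpy as np

fs = 1000                       # sample rate (Hz); illustrative value
t = np.arange(2 * fs) / fs      # 2 s of time
f_mod = 4.0                     # 4-Hz modulation rate used in the study

# Auditory noise envelope and the two tactile envelopes:
# in-phase (no shift) and anti-phase (half a cycle = 125 ms at 4 Hz)
audio_env = 0.5 + 0.5 * np.sin(2 * np.pi * f_mod * t)
tactile_in = 0.5 + 0.5 * np.sin(2 * np.pi * f_mod * t)
tactile_anti = 0.5 + 0.5 * np.sin(2 * np.pi * f_mod * t + np.pi)

# Anti-phase peaks align with audio troughs, so the two envelopes sum to 1
```

In the experiment, each envelope would modulate its own carrier (acoustic noise vs. electrotactile pulses); only the relative phase of the slow envelopes is the manipulated variable.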
Affiliation(s)
- Xueying Fu
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
- Lars Riecke
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
4. Franken MK, Liu BC, Ostry DJ. Towards a somatosensory theory of speech perception. J Neurophysiol 2022;128:1683-1695. [PMID: 36416451] [PMCID: PMC9762980] [DOI: 10.1152/jn.00381.2022]
Abstract
Speech perception is known to be a multimodal process, relying not only on auditory input but also on the visual system and possibly on the motor system as well. To date, there has been little work on the potential involvement of the somatosensory system in speech perception. In the present review, we identify the somatosensory system as another contributor to speech perception. First, we argue that evidence in favor of a motor contribution to speech perception can just as easily be interpreted as showing somatosensory involvement. Second, physiological and neuroanatomical evidence for auditory-somatosensory interactions across the auditory hierarchy indicates the availability of a neural infrastructure that supports somatosensory involvement in auditory processing in general. Third, there is accumulating evidence for somatosensory involvement in the context of speech specifically. In particular, tactile stimulation modifies speech perception, and auditory speech input elicits activity in somatosensory cortical areas. Moreover, speech sounds can be decoded from activity in somatosensory cortex; lesions to this region affect perception, and vowels can be identified based on somatic input alone. We suggest that the somatosensory involvement in speech perception derives from the somatosensory-auditory pairing that occurs during speech production and learning. By bringing together findings from a set of studies that have not been previously linked, the present article identifies the somatosensory system as a presently unrecognized contributor to speech perception.
Affiliation(s)
- David J Ostry
- McGill University, Montreal, Quebec, Canada
- Haskins Laboratories, New Haven, Connecticut
5. Lohse M, Zimmer-Harwood P, Dahmen JC, King AJ. Integration of somatosensory and motor-related information in the auditory system. Front Neurosci 2022;16:1010211. [PMID: 36330342] [PMCID: PMC9622781] [DOI: 10.3389/fnins.2022.1010211]
Abstract
An ability to integrate information provided by different sensory modalities is a fundamental feature of neurons in many brain areas. Because visual and auditory inputs often originate from the same external object, which may be located some distance away from the observer, the synthesis of these cues can improve localization accuracy and speed up behavioral responses. By contrast, multisensory interactions occurring close to the body typically involve a combination of tactile stimuli with other sensory modalities. Moreover, most activities involving active touch generate sound, indicating that stimuli in these modalities are frequently experienced together. In this review, we examine the basis for determining sound-source distance and the contribution of auditory inputs to the neural encoding of space around the body. We then consider the perceptual consequences of combining auditory and tactile inputs in humans and discuss recent evidence from animal studies demonstrating how cortical and subcortical areas work together to mediate communication between these senses. This research has shown that somatosensory inputs interface with and modulate sound processing at multiple levels of the auditory pathway, from the cochlear nucleus in the brainstem to the cortex. Circuits involving inputs from the primary somatosensory cortex to the auditory midbrain have been identified that mediate suppressive effects of whisker stimulation on auditory thalamocortical processing, providing a possible basis for prioritizing the processing of tactile cues from nearby objects. Close links also exist between audition and movement, and auditory responses are typically suppressed by locomotion and other actions. These movement-related signals are thought to cancel out self-generated sounds, but they may also affect auditory responses via the associated somatosensory stimulation or as a result of changes in brain state. 
Together, these studies highlight the importance of considering both multisensory context and movement-related activity in order to understand how the auditory cortex operates during natural behaviors, paving the way for future work to investigate auditory-somatosensory interactions in more ecological situations.
6. Ball F, Nentwich A, Noesselt T. Cross-modal perceptual enhancement of unisensory targets is uni-directional and does not affect temporal expectations. Vision Res 2021;190:107962. [PMID: 34757275] [DOI: 10.1016/j.visres.2021.107962]
Abstract
Temporal structures in the environment can shape temporal expectations (TEs); previous studies demonstrated that TEs interact with multisensory interplay (MSI) when multisensory stimuli are presented synchronously. Here, we tested whether other types of MSI - evoked by asynchronous yet temporally flanking irrelevant stimuli - result in similar performance patterns. To this end, we presented sequences of 12 stimuli (10 Hz) which consisted of auditory (A), visual (V) or alternating auditory-visual stimuli (e.g. A-V-A-V-…) with either auditory or visual targets (Exp. 1). Participants discriminated target frequencies (auditory pitch or visual spatial frequency) embedded in these sequences. To test effects of TEs, the proportion of early and late temporal target positions was manipulated run-wise. Performance for unisensory targets was affected by temporally flanking distractors, with auditory temporal flankers selectively improving visual target perception (Exp. 1). However, no effect of temporal expectation was observed. Control experiments (Exp. 2-3) tested whether this lack of a TE effect was due to the higher presentation frequency in Exp. 1 relative to previous experiments. Importantly, even at higher stimulation frequencies, redundant multisensory targets (Exp. 2-3) reliably modulated TEs. Together, our results indicate that visual target detection was enhanced by MSI. However, this cross-modal enhancement - in contrast to the redundant target effect - was still insufficient to generate TEs. We posit that unisensory target representations were either unstable or insufficient for the generation of TEs, whereas less demanding MSI still occurred; this highlights the need for robust stimulus representations when generating temporal expectations.
Affiliation(s)
- Felix Ball
- Department of Biological Psychology, Faculty of Natural Science, Otto-von-Guericke-University Magdeburg, Germany; Center for Behavioral Brain Sciences, Otto-von-Guericke-University Magdeburg, Germany.
- Annika Nentwich
- Department of Biological Psychology, Faculty of Natural Science, Otto-von-Guericke-University Magdeburg, Germany
- Toemme Noesselt
- Department of Biological Psychology, Faculty of Natural Science, Otto-von-Guericke-University Magdeburg, Germany; Center for Behavioral Brain Sciences, Otto-von-Guericke-University Magdeburg, Germany
7. Zioga I, Harrison PMC, Pearce MT, Bhattacharya J, Luft CDB. Auditory but Not Audiovisual Cues Lead to Higher Neural Sensitivity to the Statistical Regularities of an Unfamiliar Musical Style. J Cogn Neurosci 2020;32:2241-2259. [PMID: 32762519] [DOI: 10.1162/jocn_a_01614]
Abstract
It is still a matter of debate whether visual aids improve learning of music. In a multisession study, we investigated the neural signatures of novel music sequence learning with or without aids (auditory-only: AO, audiovisual: AV). During three training sessions on three separate days, participants (nonmusicians) reproduced (note by note on a keyboard) melodic sequences generated by an artificial musical grammar. The AV group (n = 20) had each note color-coded on screen, whereas the AO group (n = 20) had no color indication. We evaluated learning of the statistical regularities of the novel music grammar before and after training by presenting melodies ending on correct or incorrect notes and by asking participants to judge the correctness and surprisal of the final note, while EEG was recorded. We found that participants successfully learned the new grammar. Although the AV group, as compared to the AO group, reproduced longer sequences during training, there was no significant difference in learning between groups. At the neural level, after training, the AO group showed a larger N100 response to low-probability compared with high-probability notes, suggesting an increased neural sensitivity to statistical properties of the grammar; this effect was not observed in the AV group. Our findings indicate that visual aids might improve sequence reproduction while not necessarily promoting better learning, indicating a potential dissociation between sequence reproduction and learning. We suggest that the difficulty induced by auditory-only input during music training might enhance cognitive engagement, thereby improving neural sensitivity to the underlying statistical properties of the learned material.
8. Zumer JM, White TP, Noppeney U. The neural mechanisms of audiotactile binding depend on asynchrony. Eur J Neurosci 2020;52:4709-4731. [PMID: 32725895] [DOI: 10.1111/ejn.14928]
Abstract
Asynchrony is a critical cue informing the brain whether sensory signals are caused by a common source and should be integrated or segregated. This psychophysics-electroencephalography (EEG) study investigated the influence of asynchrony on how the brain binds audiotactile (AT) signals to enable faster responses in a redundant target paradigm. Human participants actively responded (psychophysics) or passively attended (EEG) to noise bursts, "taps-to-the-face" and their AT combinations at seven AT asynchronies: 0, ±20, ±70 and ±500 ms. Behaviourally, observers were faster at detecting AT than unisensory stimuli within a temporal integration window: the redundant target effect was maximal for synchronous stimuli and declined within a ≤70 ms AT asynchrony. EEG revealed a cascade of AT interactions that relied on different neural mechanisms depending on AT asynchrony. At small (≤20 ms) asynchronies, AT interactions arose for event-related potentials (ERPs) at 110 ms and ~400 ms post-stimulus. Selectively at ±70 ms asynchronies, AT interactions were observed for the P200 ERP, theta-band inter-trial coherence (ITC) and power at ~200 ms post-stimulus. In conclusion, AT binding was mediated by distinct neural mechanisms depending on the asynchrony of the AT signals. Early AT interactions in ERPs and theta-band ITC and power were critical for the behavioural response facilitation within a ≤±70 ms temporal integration window.
Affiliation(s)
- Johanna M Zumer
- School of Psychology, University of Birmingham, Birmingham, UK
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Centre for Human Brain Health, University of Birmingham, Birmingham, UK
- School of Life and Health Sciences, Aston University, Birmingham, UK
- Thomas P White
- School of Psychology, University of Birmingham, Birmingham, UK
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Uta Noppeney
- School of Psychology, University of Birmingham, Birmingham, UK
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Centre for Human Brain Health, University of Birmingham, Birmingham, UK
- Donders Institute for Brain, Cognition, and Behaviour, Nijmegen, The Netherlands
9. Coffman BA, Candelaria-Cook FT, Stephen JM. Unisensory and Multisensory Responses in Fetal Alcohol Spectrum Disorders (FASD): Effects of Spatial Congruence. Neuroscience 2020;430:34-46. [PMID: 31982473] [DOI: 10.1016/j.neuroscience.2020.01.013]
Abstract
While it is generally accepted that structural and functional brain deficits underlie the behavioral deficits associated with Fetal Alcohol Spectrum Disorders (FASD), the degree to which these problems are expressed in sensory pathology is unknown. Electrophysiological measures indicate that neural processing is delayed in the visual and auditory domains. Furthermore, multiple reports of white matter deficits due to prenatal alcohol exposure indicate altered cortical connectivity in individuals with FASD. Multisensory integration requires close coordination between disparate cortical areas, leading us to hypothesize that individuals with FASD will have impaired multisensory integration relative to healthy control (HC) participants. Participants' neurophysiological responses were recorded using magnetoencephalography (MEG) during passive presentation of unisensory or simultaneous, spatially congruent or incongruent multisensory auditory and somatosensory stimuli. Source timecourses from evoked responses were estimated using multi-dipole spatiotemporal modeling. Auditory M100 response latency was faster for the multisensory than the unisensory condition, but no group differences were observed. M200 auditory latency to congruent stimuli was earlier, and congruent amplitude was larger, in participants with FASD relative to controls. Somatosensory M100 response latency was faster in the right hemisphere for multisensory relative to unisensory stimulation in both groups. FASD participants' somatosensory M200 responses were delayed by 13 ms, but only for the unisensory presentation of the somatosensory stimulus. The M200 results indicate that unisensory and multisensory processing is altered in FASD; it remains to be seen whether the multisensory response represents a normalization of the unisensory deficits.
Affiliation(s)
- Brian A Coffman
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, 1101 Yale NE, Albuquerque, NM 87106, USA; Department of Psychology, University of New Mexico, MSC03 2220, 1 University of New Mexico, Albuquerque, NM 87131, USA; Department of Psychiatry, University of Pittsburgh School of Medicine, 3501 Forbes Avenue, Pittsburgh, PA 15213, USA
- Felicha T Candelaria-Cook
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, 1101 Yale NE, Albuquerque, NM 87106, USA
- Julia M Stephen
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, 1101 Yale NE, Albuquerque, NM 87106, USA
10. Cieśla K, Wolak T, Lorens A, Heimler B, Skarżyński H, Amedi A. Immediate improvement of speech-in-noise perception through multisensory stimulation via an auditory to tactile sensory substitution. Restor Neurol Neurosci 2019;37:155-166. [PMID: 31006700] [PMCID: PMC6598101] [DOI: 10.3233/rnn-190898]
Abstract
BACKGROUND Hearing loss is a growing social and health problem, and its prevalence in the elderly has reached epidemic levels. The risk of developing hearing loss is also growing among younger people. If left untreated, hearing loss can promote the development of neurodegenerative diseases, including dementia. Despite recent advancements in hearing aid (HA) and cochlear implant (CI) technologies, hearing-impaired users still encounter significant practical and social challenges, with or without aids. In particular, they all struggle with understanding speech in challenging acoustic environments, especially in the presence of a competing speaker. OBJECTIVES In the current proof-of-concept study we tested whether multisensory stimulation pairing audition with a minimal-size touch device would improve intelligibility of speech in noise. METHODS To this aim we developed an audio-to-tactile sensory substitution device (SSD) transforming low-frequency speech signals into tactile vibrations delivered to two fingertips. Based on the inverse effectiveness law, i.e., that multisensory enhancement is strongest when the signal-to-noise ratio is lowest, we embedded non-native language stimuli in speech-like noise and paired them with a low-frequency input conveyed through touch. RESULTS We found an immediate and robust improvement in speech recognition (i.e., in the signal-to-noise ratio at which speech was recognized) in the multisensory condition without any training, at the group level as well as in every participant. The reported group-level improvement of 6 dB was indeed major, considering that an increase of 10 dB represents a doubling of the perceived loudness. CONCLUSIONS These results are especially relevant when compared to previous SSD studies showing behavioral effects only after demanding cognitive training. We discuss the implications of our results for the development of SSDs and of specific rehabilitation programs for the hearing impaired, whether or not they use HAs or CIs. We also discuss the potential application of such a set-up for sense augmentation, such as when learning a new language.
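To unpack the loudness arithmetic in the abstract above: on the common psychoacoustic rule of thumb that a +10 dB level change is perceived as roughly twice as loud (while physically being a tenfold power increase), the reported 6-dB benefit corresponds to about a 1.5-fold gain in perceived loudness. A back-of-envelope sketch (illustrative, not from the paper):

```python
# Rule of thumb: +10 dB ~ twice as loud (perception),
# while +10 dB = 10x the physical power.

def loudness_ratio(delta_db):
    """Approximate perceived-loudness ratio for a level change in dB."""
    return 2.0 ** (delta_db / 10.0)

def power_ratio(delta_db):
    """Physical power ratio for a level change in dB."""
    return 10.0 ** (delta_db / 10.0)

print(round(loudness_ratio(6.0), 2))  # the 6-dB benefit: ~1.52x as loud
print(round(power_ratio(6.0), 2))     # ~3.98x the physical power
```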
Affiliation(s)
- Katarzyna Cieśla
- Institute of Physiology and Pathology of Hearing, World Hearing Center, Warsaw, Poland
- Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- Tomasz Wolak
- Institute of Physiology and Pathology of Hearing, World Hearing Center, Warsaw, Poland
- Artur Lorens
- Institute of Physiology and Pathology of Hearing, World Hearing Center, Warsaw, Poland
- Benedetta Heimler
- Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- Henryk Skarżyński
- Institute of Physiology and Pathology of Hearing, World Hearing Center, Warsaw, Poland
- Amir Amedi
- Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- The Cognitive Science Program, The Hebrew University of Jerusalem, Jerusalem, Israel
11. Starke J, Ball F, Heinze HJ, Noesselt T. The spatio-temporal profile of multisensory integration. Eur J Neurosci 2017;51:1210-1223. [PMID: 29057531] [DOI: 10.1111/ejn.13753]
Abstract
Task-irrelevant visual stimuli can enhance auditory perception. However, while there is some neurophysiological evidence for mechanisms that underlie the phenomenon, the neural basis of visually induced effects on auditory perception remains unknown. Combining fMRI and EEG with psychophysical measurements in two independent studies, we identified the neural underpinnings and temporal dynamics of visually induced auditory enhancement. Lower- and higher-intensity sounds were paired with a non-informative visual stimulus, while participants performed an auditory detection task. Behaviourally, visual co-stimulation enhanced auditory sensitivity. Using fMRI, enhanced BOLD signals were observed in primary auditory cortex for low-intensity audiovisual stimuli which scaled with subject-specific enhancement in perceptual sensitivity. Concordantly, a modulation of event-related potentials could already be observed over frontal electrodes at an early latency (30-80 ms), which again scaled with subject-specific behavioural benefits. Later modulations starting around 280 ms, that is in the time range of the P3, did not fit this pattern of brain-behaviour correspondence. Hence, the latency of the corresponding fMRI-EEG brain-behaviour modulation points at an early interplay of visual and auditory signals in low-level auditory cortex, potentially mediated by crosstalk at the level of the thalamus. However, fMRI signals in primary auditory cortex, auditory thalamus and the P50 for higher-intensity auditory stimuli were also elevated by visual co-stimulation (in the absence of any behavioural effect) suggesting a general, intensity-independent integration mechanism. We propose that this automatic interaction occurs at the level of the thalamus and might signify a first step of audiovisual interplay necessary for visually induced perceptual enhancement of auditory perception.
Affiliation(s)
- Johanna Starke
- Department of Biological Psychology, Faculty of Natural Science, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
- Department of Neurology, Faculty of Medicine, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
- Felix Ball
- Department of Biological Psychology, Faculty of Natural Science, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
- Department of Neurology, Faculty of Medicine, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
- Center for Behavioural Brain Sciences, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
- Hans-Jochen Heinze
- Department of Neurology, Faculty of Medicine, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
- Center for Behavioural Brain Sciences, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
- Toemme Noesselt
- Department of Biological Psychology, Faculty of Natural Science, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
- Center for Behavioural Brain Sciences, Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
12
Oscillatory activity in auditory cortex reflects the perceptual level of audio-tactile integration. Sci Rep 2016; 6:33693. [PMID: 27647158 PMCID: PMC5028762 DOI: 10.1038/srep33693] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/23/2016] [Accepted: 08/31/2016] [Indexed: 12/02/2022] Open
Abstract
Cross-modal interactions between sensory channels have been shown to depend on both the spatial disparity and the perceptual similarity between the presented stimuli. Here we investigate the behavioral and neural integration of auditory and tactile stimulus pairs at different levels of spatial disparity. Additionally, we modulated the amplitudes of both stimuli in either a coherent or non-coherent manner. We found that both auditory and tactile localization performance was biased towards the stimulus in the respective other modality. This bias linearly increases with stimulus disparity and is more pronounced for coherently modulated stimulus pairs. Analyses of electroencephalographic (EEG) activity at temporal–cortical sources revealed enhanced event-related potentials (ERPs) as well as decreased alpha and beta power during bimodal as compared to unimodal stimulation. However, while the observed ERP differences are similar for all stimulus combinations, the extent of oscillatory desynchronization varies with stimulus disparity. Moreover, when both stimuli were subjectively perceived as originating from the same direction, the reduction in alpha and beta power was significantly stronger. These observations suggest that in the EEG the level of perceptual integration is mainly reflected by changes in ongoing oscillatory activity.
13
Pishnamazi M, Nojaba Y, Ganjgahi H, Amousoltani A, Oghabian MA. Neural correlates of audiotactile phonetic processing in early-blind readers: an fMRI study. Exp Brain Res 2015; 234:1263-77. [DOI: 10.1007/s00221-015-4515-2] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/09/2014] [Accepted: 11/30/2015] [Indexed: 10/22/2022]
14
Meredith MA, Allman BL. Single-unit analysis of somatosensory processing in the core auditory cortex of hearing ferrets. Eur J Neurosci 2015; 41:686-98. [PMID: 25728185 DOI: 10.1111/ejn.12828] [Citation(s) in RCA: 36] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2014] [Revised: 11/07/2014] [Accepted: 12/10/2014] [Indexed: 11/29/2022]
Abstract
The recent findings in several species that the primary auditory cortex processes non-auditory information have largely overlooked the possibility of somatosensory effects. Therefore, the present investigation examined the core auditory cortices (anterior auditory field and primary auditory cortex) for tactile responsivity. Multiple single-unit recordings from anesthetised ferret cortex yielded histologically verified neurons (n = 311) tested with electronically controlled auditory, visual and tactile stimuli, and their combinations. Of the auditory neurons tested, a small proportion (17%) was influenced by visual cues, but a somewhat larger number (23%) was affected by tactile stimulation. Tactile effects rarely occurred alone and spiking responses were observed in bimodal auditory-tactile neurons. However, the broadest tactile effect that was observed, which occurred in all neuron types, was that of suppression of the response to a concurrent auditory cue. The presence of tactile effects in the core auditory cortices was supported by a substantial anatomical projection from the rostral suprasylvian sulcal somatosensory area. Collectively, these results demonstrate that crossmodal effects in the auditory cortex are not exclusively visual and that somatosensation plays a significant role in modulation of acoustic processing, and indicate that crossmodal plasticity following deafness may unmask these existing non-auditory functions.
Affiliation(s)
- M Alex Meredith
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, 1101 E. Marshall Street, Sanger Hall Rm-12-067, Richmond, VA, 23298-0709, USA
15
Leonardelli E, Braun C, Weisz N, Lithari C, Occelli V, Zampini M. Prestimulus oscillatory alpha power and connectivity patterns predispose perceptual integration of an audio and a tactile stimulus. Hum Brain Mapp 2015; 36:3486-98. [PMID: 26109518 DOI: 10.1002/hbm.22857] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/18/2014] [Revised: 05/13/2015] [Accepted: 05/14/2015] [Indexed: 11/06/2022] Open
Abstract
To efficiently perceive and respond to the external environment, our brain has to perceptually integrate or segregate stimuli of different modalities. The temporal relationship between the different sensory modalities is therefore essential for the formation of different multisensory percepts. In this magnetoencephalography study, we created a paradigm in which an audio and a tactile stimulus were presented with an ambiguous temporal relationship, so that perception of physically identical audiotactile stimuli could vary between integrated (emanating from the same source) and segregated. This bistable paradigm allowed us to compare identical bimodal stimuli that elicited different percepts, making it possible to directly infer multisensory interaction effects. Local differences in alpha power over bilateral inferior parietal lobules (IPLs) and superior parietal lobules (SPLs) preceded integrated versus segregated percepts of the two stimuli (audio and tactile). Furthermore, differences in long-range cortical functional connectivity seeded in the rIPL (the region of maximum difference) revealed differential patterns that predisposed integrated or segregated percepts, encompassing secondary areas of all modalities and prefrontal cortex. We showed that prestimulus brain states predispose the perception of the audiotactile stimulus in both a global and a local manner. Our findings are in line with a consistent recent body of findings on the importance of prestimulus brain states for perception of an upcoming stimulus. This new perspective on how stimuli originating from different modalities are integrated suggests a non-modality-specific network predisposing multisensory perception.
Affiliation(s)
- Christoph Braun
- Center for Mind/Brain Sciences, University of Trento, Trento, Italy; MEG Center, University of Tübingen, Tübingen, Germany; Werner Reichardt Centre for Integrative Neuroscience (CIN), University of Tübingen, Tübingen, Germany
- Nathan Weisz
- Center for Mind/Brain Sciences, University of Trento, Trento, Italy
- Chrysa Lithari
- Center for Mind/Brain Sciences, University of Trento, Trento, Italy
16
Audio-visual synchrony modulates the ventriloquist illusion and its neural/spatial representation in the auditory cortex. Neuroimage 2014; 98:425-34. [DOI: 10.1016/j.neuroimage.2014.04.077] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2013] [Revised: 04/25/2014] [Accepted: 04/30/2014] [Indexed: 11/20/2022] Open
17
Rogers LJ. Asymmetry of brain and behavior in animals: Its development, function, and human relevance. Genesis 2014; 52:555-71. [DOI: 10.1002/dvg.22741] [Citation(s) in RCA: 96] [Impact Index Per Article: 8.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2013] [Revised: 01/04/2014] [Accepted: 01/06/2014] [Indexed: 01/08/2023]
Affiliation(s)
- Lesley J. Rogers
- Centre for Neuroscience and Animal Behavior, School of Science and Technology, University of New England, Armidale, New South Wales 2450, Australia
18
Henschke JU, Noesselt T, Scheich H, Budinger E. Possible anatomical pathways for short-latency multisensory integration processes in primary sensory cortices. Brain Struct Funct 2014; 220:955-77. [DOI: 10.1007/s00429-013-0694-4] [Citation(s) in RCA: 61] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/19/2013] [Accepted: 12/17/2013] [Indexed: 01/25/2023]
19
Ocklenburg S, Wolf CC, Heed T, Ball A, Cramer H, Röder B, Güntürkün O. Multisensory integration across the menstrual cycle. Front Psychol 2013; 4:666. [PMID: 24069015 PMCID: PMC3781309 DOI: 10.3389/fpsyg.2013.00666] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2013] [Accepted: 09/05/2013] [Indexed: 01/15/2023] Open
Abstract
Evidence suggests that spatial processing changes across time in naturally cycling women, likely due to neuromodulatory effects of steroid hormones. Yet, it is unknown whether crossmodal spatial processes depend on steroid hormones as well. In the present experiment, the crossmodal congruency task was used to assess visuo-tactile interactions in naturally cycling women, women using hormonal contraceptives, and men. Participants adopted either a crossed or uncrossed hands posture. It was tested whether the postural effect of hand crossing on multisensory interactions in the crossmodal congruency task is modulated by women's cycle phase. We found that visuotactile interactions changed according to cycle phase. Naturally cycling women showed a significant difference between the menstrual and the luteal phase for crossed, but not for uncrossed, hands postures. The two control groups showed no test-session effects. Regression analysis revealed a positive relation between estradiol levels and the size of the crossmodal congruency effect (CCE), indicating that estradiol seems to have a neuromodulatory effect on posture processing.
Affiliation(s)
- Sebastian Ocklenburg
- Department of Biopsychology, Institute of Cognitive Neuroscience, Ruhr-University Bochum, Bochum, Germany
20
Abstract
There is a strong interaction between multisensory processing and the neuroplasticity of the human brain. On the one hand, recent research demonstrates that experience and training in various domains modify how information from the different senses is integrated; on the other hand, multisensory training paradigms seem to be particularly effective in driving functional and structural plasticity. Multisensory training affects early sensory processing within separate sensory domains, as well as the functional and structural connectivity between uni- and multisensory brain regions. In this review, we discuss the evidence for interactions of multisensory processes and brain plasticity and give an outlook on promising clinical applications and open questions.