1. Hu J, Vetter P. How the eyes respond to sounds. Ann N Y Acad Sci 2024; 1532:18-36. [PMID: 38152040] [DOI: 10.1111/nyas.15093]
Abstract
Eye movements have been extensively studied with respect to visual stimulation. However, we live in a multisensory world, and how the eyes are driven by other senses has been explored much less. Here, we review the evidence on how audition can trigger and drive different eye responses and which cortical and subcortical neural correlates are involved. We provide an overview of how different types of sounds, from simple tones and noise bursts to spatially localized sounds and complex linguistic stimuli, influence saccades, microsaccades, smooth pursuit, pupil dilation, and eye blinks. The reviewed evidence reveals how the auditory system interacts with the oculomotor system, both behaviorally and neurally, and how this differs from visually driven eye responses. Some evidence points to multisensory interaction, and potential multisensory integration, but the underlying computational and neural mechanisms are still unclear. While there are marked differences in how the eyes respond to auditory compared to visual stimuli, many aspects of auditory-evoked eye responses remain underexplored, and we summarize the key open questions for future research.
Affiliation(s)
- Junchao Hu
- Visual and Cognitive Neuroscience Lab, Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Petra Vetter
- Visual and Cognitive Neuroscience Lab, Department of Psychology, University of Fribourg, Fribourg, Switzerland
2. Olszanowski M, Frankowska N, Tołopiło A. "Rear bias" in spatial auditory perception: Attentional and affective vigilance to sounds occurring outside the visual field. Psychophysiology 2023; 60:e14377. [PMID: 37357967] [DOI: 10.1111/psyp.14377]
Abstract
The studies presented here explored the rear bias phenomenon, that is, the attentional and affective bias to sounds occurring behind the listener. Physiological and psychological reactions (i.e., fEMG, EDA/SCR, Simple Reaction Task-SRT, and self-assessments of affect-related states) were measured in response to tones of different frequencies (Study 1) and emotional vocalizations (Study 2) presented in rear and front spatial locations. Results showed that emotional vocalizations, when located in the back, facilitate reactions related to attention orientation (i.e., auricularis muscle response and simple reaction times) and evoke higher arousal, both physiological (as measured by SCR) and psychological (self-assessment scale). Importantly, the observed asymmetries were larger for negative and threat-related signals (e.g., anger) than for positive/nonthreatening ones (e.g., achievement). By contrast, there were only small differences for the relatively higher frequency tones. The observed relationships are discussed in terms of one of the auditory system's postulated functions: monitoring the environment in order to quickly detect potential threats occurring outside of the visual field (e.g., behind one's back).
Affiliation(s)
- Michal Olszanowski
- Center for Research on Biological Basis of Social Behavior, SWPS University, Warsaw, Poland
- Natalia Frankowska
- Center for Research on Biological Basis of Social Behavior, SWPS University, Warsaw, Poland
- Aleksandra Tołopiło
- Center for Research on Biological Basis of Social Behavior, SWPS University, Warsaw, Poland
3. Auditory distance perception in front and rear space. Hear Res 2022; 417:108468. [DOI: 10.1016/j.heares.2022.108468]
4. Can visual capture of sound separate auditory streams? Exp Brain Res 2022; 240:813-824. [PMID: 35048159] [DOI: 10.1007/s00221-021-06281-8]
Abstract
In noisy contexts, sound discrimination improves when the auditory sources are separated in space. This phenomenon, named Spatial Release from Masking (SRM), arises from the interaction between the auditory information reaching the ear and spatial attention resources. To examine the relative contribution of these two factors, we exploited an audio-visual illusion in a hearing-in-noise task to create conditions in which the initial stimulation to the ears is held constant, while the perceived separation between speech and masker is changed illusorily (visual capture of sound). In two experiments, we asked participants to identify a string of five digits pronounced by a female voice, embedded in either energetic (Experiment 1) or informational (Experiment 2) noise, before reporting the perceived location of the heard digits. Critically, the distance between target digits and masking noise was manipulated both physically (from 22.5 to 75.0 degrees) and illusorily, by pairing target sounds with visual stimuli either at same (audio-visual congruent) or different positions (15 degrees offset, leftward or rightward: audio-visual incongruent). The proportion of correctly reported digits increased with the physical separation between the target and masker, as expected from SRM. However, despite effective visual capture of sounds, performance was not modulated by illusory changes of target sound position. Our results are compatible with a limited role of central factors in the SRM phenomenon, at least in our experimental setting. Moreover, they add to the controversial literature on the limited effects of audio-visual capture in auditory stream separation.
5. Attentional Orienting in Front and Rear Spaces in a Virtual Reality Discrimination Task. Vision (Basel) 2022; 6:3. [PMID: 35076635] [PMCID: PMC8788563] [DOI: 10.3390/vision6010003]
Abstract
Recent studies on covert attention have suggested that the visual processing of information differs depending on whether that information is physically present in front of us or is a reflection of information behind us (mirror information). This difference in processing suggests that distinct processes direct our attention to objects in front of us (front space) and behind us (rear space). In this study, we investigated the effects of attentional orienting in front and rear space following endogenous visual or auditory cues. Twenty-one participants performed a modified version of the Posner paradigm in virtual reality during a spaceship discrimination task. An eye tracker integrated into the virtual reality headset was used to verify that participants did not move their eyes and relied on covert attention. The results show that informative cues produced faster response times than non-informative cues, but no impact on target identification was observed. In addition, we observed faster response times when the target occurred in front space rather than in rear space. These results are consistent with a differentiation of the orienting cognitive process between front and rear space. Several explanations are discussed. No effect was found on participants' eye movements, suggesting that they did not use overt attention to improve task performance.
6. Peripersonal space in the front, rear, left and right directions for audio-tactile multisensory integration. Sci Rep 2021; 11:11303. [PMID: 34050213] [PMCID: PMC8163804] [DOI: 10.1038/s41598-021-90784-5]
Abstract
Peripersonal space (PPS) is important for humans to perform body–environment interactions. However, many previous studies focused only on a specific direction of the PPS, such as the front space, despite suggestions that PPS exists in all directions. We aimed to measure and compare the peri-trunk PPS in four directions (front, rear, left, and right). To measure the PPS, we used a tactile and an audio stimulus, because auditory information is available at any time in all directions. We used approaching and receding task-irrelevant sounds in the experiment. Observers were asked to respond as quickly as possible when a tactile stimulus was applied via a vibrator on their chest. We found that peri-trunk PPS representations exist with an approaching sound, irrespective of its direction.
7. Van der Stoep N, Van der Smagt MJ, Notaro C, Spock Z, Naber M. The additive nature of the human multisensory evoked pupil response. Sci Rep 2021; 11:707. [PMID: 33436889] [PMCID: PMC7803952] [DOI: 10.1038/s41598-020-80286-1]
Abstract
Pupillometry has received increased interest for its usefulness in measuring various sensory processes as an alternative to behavioural assessments. This is also apparent for multisensory investigations. Studies of the multisensory pupil response, however, have produced conflicting results. Some studies observed super-additive multisensory pupil responses, indicative of multisensory integration (MSI). Others observed additive multisensory pupil responses even though reaction time (RT) measures were indicative of MSI. Therefore, in the present study, we investigated the nature of the multisensory pupil response by combining methodological approaches of previous studies while using supra-threshold stimuli only. In two experiments we presented auditory and visual stimuli to observers that evoked a(n) (onset) response (be it constriction or dilation) in a simple detection task and a change detection task. In both experiments, the RT data indicated MSI as shown by race model inequality violation. Still, the multisensory pupil response in both experiments could best be explained by linear summation of the unisensory pupil responses. We conclude that the multisensory pupil response for supra-threshold stimuli is additive in nature and cannot be used as a measure of MSI, as only a departure from additivity can unequivocally demonstrate an interaction between the senses.
Affiliation(s)
- Nathan Van der Stoep, M. J. Van der Smagt, C. Notaro, Z. Spock, M. Naber
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Langeveld Building, Room H0.26, Heidelberglaan 1, 3584 CS, Utrecht, The Netherlands
8. Spence C. Senses of place: architectural design for the multisensory mind. Cogn Res Princ Implic 2020; 5:46. [PMID: 32945978] [PMCID: PMC7501350] [DOI: 10.1186/s41235-020-00243-4]
Abstract
Traditionally, architectural practice has been dominated by the eye/sight. In recent decades, though, architects and designers have increasingly started to consider the other senses, namely sound, touch (including proprioception, kinesthesis, and the vestibular sense), smell, and on rare occasions, even taste in their work. As yet, there has been little recognition of the growing understanding of the multisensory nature of the human mind that has emerged from the field of cognitive neuroscience research. This review therefore provides a summary of the role of the human senses in architectural design practice, both when considered individually and, more importantly, when studied collectively. For it is only by recognizing the fundamentally multisensory nature of perception that one can really hope to explain a number of surprising crossmodal environmental or atmospheric interactions, such as between lighting colour and thermal comfort and between sound and the perceived safety of public space. At the same time, however, the contemporary focus on synaesthetic design needs to be reframed in terms of the crossmodal correspondences and multisensory integration, at least if the most is to be made of multisensory interactions and synergies that have been uncovered in recent years. Looking to the future, the hope is that architectural design practice will increasingly incorporate our growing understanding of the human senses, and how they influence one another. Such a multisensory approach will hopefully lead to the development of buildings and urban spaces that do a better job of promoting our social, cognitive, and emotional development, rather than hindering it, as has too often been the case previously.
Affiliation(s)
- Charles Spence
- Department of Experimental Psychology, Crossmodal Research Laboratory, University of Oxford, Anna Watts Building, Oxford, OX2 6GG, UK
9. Merz S, Jensen A, Burau C, Spence C, Frings C. Higher-Order Cognition Does Not Affect Multisensory Distractor Processing. Multisens Res 2020; 34:351-364. [PMID: 33706263] [DOI: 10.1163/22134808-bja10013]
Abstract
Multisensory processing is required for the perception of the majority of everyday objects and events. In the case of irrelevant stimuli, the multisensory processing of features is widely assumed to be modulated by attention. In the present study, we investigated whether the processing of audiovisual distractors is also modulated by higher-order cognition. Participants fixated a visual distractor viewed via a centrally-placed mirror and responded to a laterally-presented audiovisual target. Critically, a distractor tone was presented from the same location as the mirror, while the visual distractor feature was presented at an occluded location, visible only indirectly via mirror reflection. Consequently, it appeared as though the visual and auditory features were presented from the same location though, in fact, they actually originated from different locations. Nevertheless, the results still revealed that the visual and auditory distractor features were processed together just as in the control condition, in which the audiovisual distractor features were both actually presented from fixation. Taken together, these results suggest that the processing of irrelevant multisensory information is not influenced by higher-order cognition.
Affiliation(s)
- Simon Merz, Anne Jensen, Charlotte Burau, Christian Frings
- Department of Psychology, University of Trier, D-54286 Trier, Germany
- Charles Spence
- Crossmodal Research Laboratory, Department of Experimental Psychology, Anna Watts Building, University of Oxford, Oxford, OX2 6GG, UK
10. Auditory stimuli degrade visual performance in virtual reality. Sci Rep 2020; 10:12363. [PMID: 32703981] [PMCID: PMC7378072] [DOI: 10.1038/s41598-020-69135-3]
Abstract
We report an effect whereby auditory stimuli degrade visual performance in a virtual reality (VR) setting, where the viewing conditions differ significantly from those of previous studies. With the presentation of temporally congruent but spatially incongruent sound, visual performance can be significantly degraded at both the detection and recognition levels. We further show that this effect is robust across different types and locations of both auditory and visual stimuli. We also analyze participants' behavior with an eye tracker to study the underlying cause of the degradation effect. We find that the performance degradation occurs even in the absence of saccades towards the sound source, during normal gaze behavior. This suggests that the effect is not caused by oculomotor phenomena, but rather by neural interactions or attentional shifts.
11. Van der Stoep N, Alais D. Motion Perception: Auditory Motion Encoded in a Visual Motion Area. Curr Biol 2020; 30:R775-R778. [DOI: 10.1016/j.cub.2020.05.010]
12. Carlsen AN, Maslovat D, Kaga K. An unperceived acoustic stimulus decreases reaction time to visual information in a patient with cortical deafness. Sci Rep 2020; 10:5825. [PMID: 32242039] [PMCID: PMC7118083] [DOI: 10.1038/s41598-020-62450-9]
Abstract
Responding to multiple stimuli of different modalities has been shown to reduce reaction time (RT), yet many different processes can potentially contribute to multisensory response enhancement. To investigate the neural circuits involved in voluntary response initiation, an acoustic stimulus of varying intensities (80, 105, or 120 dB) was presented during a visual RT task to a patient with profound bilateral cortical deafness and an intact auditory brainstem response. Despite being unable to consciously perceive sound, RT was reliably shortened (~100 ms) on trials where the unperceived acoustic stimulus was presented, confirming the presence of multisensory response enhancement. Although the exact locus of this enhancement is unclear, these results cannot be attributed to involvement of the auditory cortex. Thus, these data provide new and compelling evidence that activation from subcortical auditory processing circuits can contribute to other cortical or subcortical areas responsible for the initiation of a response, without the need for conscious perception.
Affiliation(s)
- Dana Maslovat
- School of Kinesiology, University of British Columbia, Vancouver, Canada
- Kimitaka Kaga
- National Institute of Sensory Organs, National Tokyo Medical Center, Tokyo, Japan
13.
Abstract
We report two experiments designed to investigate how the implied motion of tactile stimuli influences perceived location. Predicting the location of sensory input is especially important as far as the perception of, and interaction with, the external world is concerned. Using two different experimental approaches, an overall pattern of localization shifts analogous to what has been described previously in the visual and auditory modalities is reported. That is, participants perceive the last location of a dynamic stimulus further along its trajectory than is objectively the case. In Experiment 1, participants judged whether the last vibration in a sequence of three was located closer to the wrist or to the elbow. In Experiment 2, they indicated the last location on a ruler attached to their forearm. We further pinpoint the effects of implied motion on tactile localization by investigating the independent influences of motion direction and perceptual uncertainty. Taken together, these findings underline the importance of dynamic information in localizing tactile stimuli on the skin.