1. Steffens H, Schutte M, Ewert SD. Auditory orientation and distance estimation of sighted humans using virtual echolocation with artificial and self-generated sounds. JASA Express Letters 2022; 2:124403. PMID: 36586958. DOI: 10.1121/10.0016403.
Abstract
Active echolocation by sighted humans was investigated using both predefined synthetic sounds and self-emitted sounds of the kind habitually used by blind individuals. Using virtual acoustics, distance estimation and directional localization of a wall in different rooms were assessed. A virtual source was attached to either the head or the hand, with realistic or increased source directivity. A control condition was tested with a virtual sound source located at the wall. On an individual level, untrained echolocation performance comparable to performance in the control condition was achieved. On average, echolocation performance was considerably lower than in the control condition; however, it benefited from increased directivity.
Affiliation(s)
- Henning Steffens, Medizinische Physik and Cluster of Excellence Hearing4all, Universität Oldenburg, Oldenburg, 26111, Germany
- Michael Schutte, Medizinische Physik and Cluster of Excellence Hearing4all, Universität Oldenburg, Oldenburg, 26111, Germany
- Stephan D Ewert, Medizinische Physik and Cluster of Excellence Hearing4all, Universität Oldenburg, Oldenburg, 26111, Germany
2. Objective Evaluation of Obstacle Perception Using Spontaneous Body Movements of Blind People Evoked by Movements of Acoustic Virtual Wall. Human Behavior and Emerging Technologies 2022. DOI: 10.1155/2022/9475983.
Abstract
Obstacle perception using sound is the ability to detect silent objects, such as walls and poles. It is very important for blind people, who rely on acoustic information to recognize their environment when walking or carrying out daily activities. In this paper, to develop an objective method for evaluating how well obstacle perception has been acquired in the education and rehabilitation of the blind, the authors measured the spontaneous body movements evoked by the approach of an acoustic virtual wall. Ten blind persons with everyday experience of obstacle perception and seven sighted persons with no such experience participated in the experiment. The reciprocal (approaching and receding) movements of the virtual wall were presented using simulated reflected sound, and the spontaneous body movements of the subjects were measured. Eight of the ten blind participants showed large maximum values of the correlation function between the wall's movements and their body movements, whereas six of the seven sighted participants showed small maximum values. These results indicate that body movements can be used for an objective evaluation of obstacle perception. In particular, the maximum value of the correlation function is the most appropriate measure for such an evaluation, because it does not depend on the subject's physique.
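The evaluation metric described above is the peak of the correlation function between the wall's trajectory and the participant's body sway. A minimal sketch of that computation (hypothetical array names; assumes both trajectories were recorded at the same sampling rate, and that the standardization step is what makes the peak insensitive to a subject's physique):

```python
import numpy as np

def peak_normalized_xcorr(wall: np.ndarray, body: np.ndarray) -> float:
    """Peak of the normalized cross-correlation between two equally
    sampled 1-D trajectories (e.g., virtual-wall and body position)."""
    wall = (wall - wall.mean()) / wall.std()  # standardize: amplitude-independent
    body = (body - body.mean()) / body.std()
    xcorr = np.correlate(wall, body, mode="full") / len(wall)
    return float(np.max(xcorr))
```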
3. McKenzie T, Schlecht SJ, Pulkki V. The auditory perceived aperture position of the transition between rooms. The Journal of the Acoustical Society of America 2022; 152:1871. PMID: 36182311. DOI: 10.1121/10.0014178.
Abstract
This exploratory study investigates the phenomenon of the auditory perceived aperture position (APAP): the point at which one feels one is in the boundary between two adjoined spaces, judged using hearing alone. The APAP is likely the combined perception of multiple simultaneous auditory cue changes, such as energy, reverberation time, envelopment, decay slope shape, and the direction, amplitude, and colouration of direct and reverberant sound arrivals. A framework for a rendering-free listening test, conducted in situ to avoid possible inaccuracies from acoustic simulations, impulse response measurements, and auralisation, is presented; it assesses how close the APAP is to the physical aperture position under blindfolded conditions, for multiple source positions and two room pairs. Results indicate that the APAP is generally within ±1 m of the physical aperture position, though reverberation amount, listener orientation, and source position affect precision. Comparison to objective metrics suggests that the APAP generally falls within the period of greatest acoustical change. This study illustrates the non-trivial nature of acoustical room transitions and the detail required for their plausible reproduction in dynamic rendering and game audio engines.
Affiliation(s)
- Thomas McKenzie, Acoustics Lab, Department of Signal Processing and Acoustics, Aalto University, 00076 Espoo, Finland
- Sebastian J Schlecht, Acoustics Lab, Department of Signal Processing and Acoustics, Aalto University, 00076 Espoo, Finland
- Ville Pulkki, Acoustics Lab, Department of Signal Processing and Acoustics, Aalto University, 00076 Espoo, Finland
4. Perceptual Biases as the Side Effect of a Multisensory Adaptive System: Insights from Verticality and Self-Motion Perception. Vision (Basel) 2022; 6(3):53. PMID: 36136746. PMCID: PMC9502132. DOI: 10.3390/vision6030053.
Abstract
Perceptual biases can be interpreted as adverse consequences of optimal processes which otherwise improve system performance. This review investigates inaccuracies in multisensory perception, focusing on the perception of verticality and self-motion, where the vestibular sensory modality has a prominent role. Perception of verticality indicates how the system processes gravity and thus represents an indirect measurement of vestibular perception. Head tilts can lead to biases in perceived verticality, interpreted as the influence of a vestibular prior set at the most common orientation relative to gravity (i.e., upright), useful for improving precision when upright (e.g., for fall avoidance). Studies on the perception of verticality across development and in the presence of blindness show that prior acquisition is mediated by visual experience, unveiling the fundamental role of visuo-vestibular interconnections across development. Such multisensory interactions can be behaviorally tested with cross-modal aftereffect paradigms, which test whether adaptation in one sensory modality induces biases in another, thereby revealing an interconnection between the tested sensory modalities. Such phenomena indicate the presence of multisensory neural mechanisms that constantly function to calibrate self-motion-dedicated sensory modalities with each other as well as with the environment. Thus, biases in vestibular perception reveal how the brain optimally adapts to environmental demands, such as spatial navigation and steady changes in the surroundings.
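The bias-from-a-prior account sketched above is commonly formalized as precision-weighted (Bayesian) fusion of a noisy sensory estimate with a prior peaked at upright; a minimal illustration under that assumption (all numbers are illustrative, not taken from the review):

```python
import numpy as np

def posterior_tilt(sensed_deg, sigma_sensor, prior_deg=0.0, sigma_prior=20.0):
    """Precision-weighted fusion of a noisy tilt estimate with an
    'upright' prior (both assumed Gaussian). The pull toward the prior
    mean is the perceptual bias; it grows as sensory noise (sigma_sensor)
    grows relative to the prior spread (sigma_prior)."""
    w_sensor = 1.0 / sigma_sensor**2
    w_prior = 1.0 / sigma_prior**2
    mean = (w_sensor * sensed_deg + w_prior * prior_deg) / (w_sensor + w_prior)
    sigma = np.sqrt(1.0 / (w_sensor + w_prior))
    return mean, sigma

# A 90-degree tilt sensed with large vestibular noise is biased toward upright:
print(posterior_tilt(90.0, sigma_sensor=15.0))  # mean ~57.6 deg, i.e., underestimated
```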
5. Neidhardt A, Schneiderwind C, Klein F. Perceptual Matching of Room Acoustics for Auditory Augmented Reality in Small Rooms - Literature Review and Theoretical Framework. Trends Hear 2022; 26:23312165221092919. PMID: 35505625. PMCID: PMC9073123. DOI: 10.1177/23312165221092919.
Abstract
For the realization of auditory augmented reality (AAR), it is important that the room acoustical properties of the virtual elements are perceived in agreement with the acoustics of the actual environment. This perceptual matching of room acoustics is the subject of this review. Realizations of AAR that fulfill listeners' expectations have been achieved based on pre-characterization of the room acoustics, for example, by measuring acoustic impulse responses or creating detailed room models for acoustic simulations. For future applications, the goal is to realize online adaptation in (close to) real time. Perfect physical matching is hard to achieve under these practical constraints. For this reason, an understanding of the essential psychoacoustic cues is of interest and will help to explore options for simplification. This paper reviews a broad selection of previous studies and derives a theoretical framework to examine possibilities for psychoacoustical optimization of room acoustical matching.
6. Bloomfield L, Lane E, Mangalam M, Kelty-Stephen DG. Perceiving and remembering speech depend on multifractal nonlinearity in movements producing and exploring speech. J R Soc Interface 2021; 18:20210272. PMID: 34343455. DOI: 10.1098/rsif.2021.0272.
Abstract
Speech perception and memory for speech require active engagement. Gestural theories have mainly emphasized the effect of the speaker's movements on speech perception. They fail to address the effects of listener movement, treating communication as a boundary condition constraining movement among interlocutors. The present work attempts to break new ground by using the multifractal geometry of physical movement as a common currency supporting both sides of the speaker-listener dyad. Participants self-paced their listening to a narrative, after which they completed a memory test querying their narrative comprehension and their ability to recognize words from the story. Multifractal evidence of nonlinear interactions across timescales predicted the fluency of speech perception. The self-pacing movements that enabled listeners to control the presentation of speech sounds constituted a rich exploratory process, and the multifractal nonlinearity of this exploration supported several aspects of memory for the perceived spoken language. These findings extend the role of multifractal geometry in the speaker's movements to the narrative case of speech perception. In addition to posing novel basic research questions, they make a compelling case for calibrating multifractal structure in text-to-speech synthesizers for better perception and memory of speech.
Affiliation(s)
- Elizabeth Lane, Department of Psychology, Grinnell College, Grinnell, IA 50112, USA
- Madhur Mangalam, Department of Physical Therapy, Movement and Rehabilitation Sciences, Northeastern University, Boston, MA 02115, USA
7. Angkananon K, Wald M, Phetkeaw T. Development and Evaluation of Technology Enhanced Interaction Framework Method for Designing Accessible Technologies for Visually Impaired People. Frontiers in Computer Science 2021. DOI: 10.3389/fcomp.2021.671414.
Abstract
This research developed and evaluated a software development support method to help non-expert developers gather and evaluate requirements and design or evaluate digital technology solutions to the accessibility barriers that people with visual impairment encounter. The Technology Enhanced Interaction Framework (TEIF) Visual Impairment (VI) Method was developed through a literature review and interviews with 20 students with visual impairment, 10 adults with visual impairment, and five accessibility experts. It extends the Technology Enhanced Interaction Framework (TEIF) and its "HI-Method", previously developed, validated, and evaluated for hearing impairment; it supports other methods by providing multiple-choice questions that help identify requirements, whose answers lead to technology suggestions supporting the design stage. Four accessibility experts and three developer experts reviewed and validated the TEIF VI-Method. It was experimentally evaluated by 18 developers using the TEIF VI-Method and another 18 developers using their preferred "Other Methods" to identify the requirements and solution for a scenario involving barriers for people with visual impairment. The "Other Methods" group were then shown the TEIF VI-Method, and both groups were asked their opinions of its ease of use. The mean number of correctly selected requirements was significantly higher (p < 0.001) for developers using the TEIF VI-Method (X̄ = 8.83) than for those using Other Methods (X̄ = 6.22). Developers using the TEIF VI-Method also ranked technology solutions closer to the expert rankings than developers using Other Methods (p < 0.05). All developers found the TEIF VI-Method easy to follow. Developers could evaluate requirements and technology solutions to interaction problems involving people with visual impairment better with the TEIF VI-Method than with existing methods, and could benefit from using it when developing such solutions.
8. Andrade R, Waycott J, Baker S, Vetere F. Echolocation as a Means for People with Visual Impairment (PVI) to Acquire Spatial Knowledge of Virtual Space. ACM Transactions on Accessible Computing 2021. DOI: 10.1145/3448273.
Abstract
In virtual environments, spatial information is communicated visually, which prevents people with visual impairment (PVI) from accessing such spaces. In this article, we investigate whether echolocation could be used as a tool to convey spatial information by answering the following research questions: What features of virtual space can be perceived by PVI through the use of echolocation? How does active echolocation support PVI in acquiring spatial knowledge of a virtual space? And what are PVI's opinions regarding the use of echolocation to acquire landmark and survey knowledge of virtual space? To answer these questions, we conducted a two-part within-subjects experiment with 12 people who were blind or had a visual impairment. We found that the size and materials of rooms, as well as 90-degree turns, were detectable through echolocation; that participants preferred echoes derived from footsteps over artificial sound pulses; and that echolocation supported the acquisition of mental maps of a virtual space. Ultimately, we propose that appropriately designed echolocation in virtual environments improves understanding of spatial information and access to digital games for PVI.
Affiliation(s)
- Ronny Andrade, The University of Melbourne, Parkville, VIC, Australia
- Jenny Waycott, The University of Melbourne, Parkville, VIC, Australia
- Steven Baker, The University of Melbourne, Parkville, VIC, Australia
- Frank Vetere, The University of Melbourne, Parkville, VIC, Australia
9. Thaler L, De Vos HPJC, Kish D, Antoniou M, Baker CJ, Hornikx MCJ. Human Click-Based Echolocation of Distance: Superfine Acuity and Dynamic Clicking Behaviour. J Assoc Res Otolaryngol 2019; 20:499-510. PMID: 31286299. PMCID: PMC6797687. DOI: 10.1007/s10162-019-00728-0.
Abstract
Some people who are blind have trained themselves in echolocation using mouth clicks. Here, we provide the first report of psychophysical and clicking data during echolocation of distance from a group of 8 blind people with experience in mouth-click-based echolocation (daily use for more than 3 years). We found that experienced echolocators can detect changes in distance of 3 cm at a reference distance of 50 cm, and of 7 cm at a reference distance of 150 cm, regardless of object size (i.e. a 28.5 cm vs. an 80 cm diameter disk). Participants made more intense mouth clicks, and more of them, for weaker reflectors (i.e. the same object at a farther distance, or a smaller object at the same distance), but the number and intensity of clicks were adjusted independently of one another. The acuity we found is better than previous estimates based on samples of sighted participants without echolocation experience or on single experienced blind echolocators, and it highlights adaptation of the perceptual system in blind human echolocators. Further, the dynamic, adaptive clicking behaviour we observed suggests that the number and intensity of emissions serve separate functions to increase SNR. The data may serve as inspiration for low-cost (i.e. non-array-based) artificial 'cognitive' sonar and radar systems, e.g. in signal design and adaptive pulse repetition rate and intensity. It will also be useful for instruction and guidance of new users of echolocation.
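To put these thresholds in acoustic terms: an echo from a reflector at distance d returns after the two-way travel time 2d/c, so the reported 3 cm change at 50 cm corresponds to a delay change of only about 0.17 ms. A minimal sketch of this arithmetic (speed of sound assumed to be 343 m/s):

```python
SPEED_OF_SOUND = 343.0  # m/s, at roughly 20 degrees C

def echo_delay_ms(distance_m: float) -> float:
    """Two-way travel time of an echo from a reflector at distance_m."""
    return 2.0 * distance_m / SPEED_OF_SOUND * 1000.0

# Delay change for the reported 3 cm threshold at the 50 cm reference:
print(echo_delay_ms(0.53) - echo_delay_ms(0.50))  # ~0.17 ms
# And for the 7 cm threshold at the 150 cm reference:
print(echo_delay_ms(1.57) - echo_delay_ms(1.50))  # ~0.41 ms
```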
Affiliation(s)
- Lore Thaler, Department of Psychology, Durham University, Science Site, South Road, Durham, DH1 3LE, UK
- H P J C De Vos, Eindhoven University of Technology, Eindhoven, The Netherlands
- D Kish, World Access for the Blind, Placentia, CA, USA
- M Antoniou, Department of Electronic Electrical and Systems Engineering, University of Birmingham, Birmingham, UK
- C J Baker, Department of Electronic Electrical and Systems Engineering, University of Birmingham, Birmingham, UK
- M C J Hornikx, Eindhoven University of Technology, Eindhoven, The Netherlands
10. Guigou C, Toupet M, Delemps B, Heuschen S, Aho S, Bozorg Grayeli A. Effect of Rotating Auditory Scene on Postural Control in Normal Subjects, Patients With Bilateral Vestibulopathy, Unilateral, or Bilateral Cochlear Implants. Front Neurol 2018; 9:972. PMID: 30505289. PMCID: PMC6250812. DOI: 10.3389/fneur.2018.00972.
Abstract
Objective: The aim of this study was to investigate the impact of a rotating sound stimulation on postural performance in normal subjects, patients with bilateral vestibulopathy (BVP), and unilateral (UCI) and bilateral (BCI) cochlear implantees. Materials and Methods: Sixty-nine adults (32 women and 37 men) were included in a multicenter prospective study: 37 healthy subjects, 10 BVP, 15 UCI, and 7 BCI patients. The average age was 47 ± 2.0 years (range: 23-82). In addition to a complete audiovestibular workup, dynamic posturography (Multitest, Framiral, Grasse) was conducted in silence and with a rotating cocktail-party sound delivered by headphone. The center-of-pressure excursion surface (COPS), sensory preferences, and fractal, diffusion, and wavelet analyses of stabilometry were collected. Results: The rotating sound seemed to influence balance in all subgroups except controls. COPS increased with sound in the BVP and BCI groups in the eyes-closed, sway-referenced condition, indicating a destabilizing effect, while it decreased in UCI in the same condition, suggesting stabilization (p < 0.05, linear mixed model corrected for age, n = 69). BVP patients had higher proprioceptive preferences, BCI patients had higher vestibular and visual preferences, and UCI patients had only higher vestibular preferences than controls. Sensory preferences were not altered by the rotating sound. Conclusions: The rotating sound destabilized BVP and BCI patients with binaural hearing, while it stabilized UCI patients with monaural hearing and no sound-rotation effect. This difference suggests that binaural auditory cues are exploited by BCI patients for balance.
Affiliation(s)
- Caroline Guigou, Department of Otolaryngology-Head and Neck Surgery, Dijon University Hospital, Dijon, France; Le2i Research Laboratory, CNRS, UMR-6306, Dijon, France
- Michel Toupet, Department of Otolaryngology-Head and Neck Surgery, Dijon University Hospital, Dijon, France; Centre d'Explorations Fonctionnelles Otoneurologiques, Paris, France
- Benoit Delemps, Department of Otolaryngology-Head and Neck Surgery, Dijon University Hospital, Dijon, France; Audika Auditory Rehabilitation Center, Dijon, France
- Serge Aho, Department of Epidemiology, Dijon University Hospital, Dijon, France
- Alexis Bozorg Grayeli, Department of Otolaryngology-Head and Neck Surgery, Dijon University Hospital, Dijon, France; Le2i Research Laboratory, CNRS, UMR-6306, Dijon, France
11.
Abstract
This study investigated the influence of body motion on an echolocation task. We asked a group of blindfolded, novice, sighted participants to walk along a corridor made of plastic sound-reflecting panels. By self-generating mouth clicks, the participants attempted to determine a spatial property of the corridor: whether it ended in a left turn, a right turn, or a dead end. They were asked to explore the corridor and stop whenever they were confident about its shape. Their body motion was captured by a camera system and coded. Most participants were able to accomplish the task with a percentage of correct guesses above chance level. We found a mutual interaction among kinematic variables that can lead to optimal echolocation skills: head motion, accounting for spatial exploration; the motion stop-point of the person; and the number of correct guesses about the spatial structure. The results confirmed that sighted people are able to use self-generated echoes to navigate in a complex environment. The inter-individual variability and the quality of echolocation performance seem to depend on how, and how much, the space is explored.
12. Ton C, Omar A, Szedenko V, Tran VH, Aftab A, Perla F, Bernstein MJ, Yang Y. LIDAR Assist Spatial Sensing for the Visually Impaired and Performance Analysis. IEEE Trans Neural Syst Rehabil Eng 2018; 26:1727-1734. PMID: 30047892. DOI: 10.1109/TNSRE.2018.2859800.
Abstract
Echolocation enables people with impaired or no vision to comprehend surrounding spatial information through reflected sound. However, this technique often requires substantial training, and its accuracy is subject to various conditions. Furthermore, individuals who practice this sensing method must simultaneously generate the sound and process the received audio information. This paper proposes and evaluates a proof-of-concept light detection and ranging (LIDAR) assist spatial sensing (LASS) system, which aims to overcome these restrictions by obtaining the spatial information of the user's surroundings through a LIDAR sensor and translating it into stereo sound of varying pitch. The stereo position and relative pitch of the sound represent an object's angular orientation and horizontal distance, respectively, granting visually impaired users an enhanced spatial perception of their surroundings and of potential obstacles. The work comprised two phases: Phase I engineered the hardware and software of the LASS system, and Phase II focused on a system efficacy study. The study, approved by the Penn State Institutional Review Board, included 18 student volunteers recruited through the Penn State Department of Psychology Subject Pool. The paper demonstrates that blindfolded individuals equipped with the LASS system are able to quantitatively identify surrounding obstacles, differentiate their relative distance, and distinguish the angular location of multiple objects with minimal training.
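A minimal sketch of the kind of angle-and-distance sonification described above (the pitch range and the equal-power panning law are illustrative assumptions, not the LASS system's actual mapping):

```python
import numpy as np

SAMPLE_RATE = 44100

def sonify_point(angle_deg: float, distance_m: float, dur_s: float = 0.2) -> np.ndarray:
    """Map a LIDAR return to a stereo tone: angle -> left/right pan,
    distance -> pitch (nearer objects sound higher). Returns a
    (n_samples, 2) float array in [-1, 1]."""
    # Illustrative mapping: 0.2-4 m onto 1200-200 Hz, decreasing with distance.
    distance_m = np.clip(distance_m, 0.2, 4.0)
    freq = 200.0 + (4.0 - distance_m) / 3.8 * 1000.0
    t = np.arange(int(dur_s * SAMPLE_RATE)) / SAMPLE_RATE
    tone = np.sin(2 * np.pi * freq * t)
    # Equal-power pan: -90 deg = full left, +90 deg = full right.
    pan = np.clip(angle_deg, -90.0, 90.0) / 90.0   # -1 .. 1
    theta = (pan + 1.0) * np.pi / 4.0              # 0 .. pi/2
    return np.column_stack([tone * np.cos(theta), tone * np.sin(theta)])
```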
13. Thaler L, De Vos R, Kish D, Antoniou M, Baker C, Hornikx M. Human echolocators adjust loudness and number of clicks for detection of reflectors at various azimuth angles. Proc Biol Sci 2018; 285:20172735. PMID: 29491173. PMCID: PMC5832709. DOI: 10.1098/rspb.2017.2735.
Abstract
Bats have been shown to adjust their emissions to situational demands. Here we report similar findings for human echolocation. We asked eight blind expert echolocators to detect reflectors positioned at various azimuth angles. The same 17.5 cm diameter circular reflector, placed at 100 cm distance at 0°, 45° or 90° with respect to straight ahead, was detected with 100% accuracy, but performance dropped to approximately 80% when it was placed at 135° (i.e. somewhat behind) and to chance level (50%) when placed at 180° (i.e. directly behind). This can be explained by poorer target ensonification owing to the beam pattern of human mouth clicks. Importantly, analyses of sound recordings show that echolocators increased the loudness and number of clicks for reflectors at farther angles. Echolocators were able to reliably detect reflectors when level differences between echo and emission were as low as -27 dB, which is much lower than expected based on previous work. Increasing the intensity and number of clicks improves the signal-to-noise ratio and in this way compensates for weaker target reflections. Our results are, to our knowledge, the first to show that human echolocation experts adjust their emissions to improve sensory sampling. An implication of our findings is that human echolocators accumulate information from multiple samples.
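The benefit of making more clicks can be read as an averaging gain: if echo snippets from N clicks are combined and the background noise is independent across clicks, the power SNR improves by a factor of N (10 log10 N dB). A small sketch of that reasoning (an idealization; listeners need not combine echoes this literally):

```python
import numpy as np

rng = np.random.default_rng(0)

def snr_db(signal: np.ndarray, noise: np.ndarray) -> float:
    return 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))

echo = 0.05 * np.sin(2 * np.pi * 3000 * np.arange(512) / 44100)  # weak, fixed echo
trials = [echo + rng.normal(0, 0.5, echo.size) for _ in range(10)]  # 10 noisy clicks

one = trials[0] - echo                # noise left in a single click
avg = np.mean(trials, axis=0) - echo  # noise left after averaging 10 clicks
print(snr_db(echo, one), snr_db(echo, avg))  # second is ~10 dB higher
```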
Affiliation(s)
- L Thaler, Department of Psychology, Durham University, Science Site, South Road, Durham DH1 3LE, UK
- R De Vos, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
- D Kish, World Access for the Blind, Placentia 92870, CA, USA
- M Antoniou, Department of Electronic Electrical and Systems Engineering, School of Engineering, University of Birmingham, Edgbaston, Birmingham B15 2TT, UK
- C Baker, Department of Electronic Electrical and Systems Engineering, School of Engineering, University of Birmingham, Edgbaston, Birmingham B15 2TT, UK
- M Hornikx, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
14. Cuturi LF, Gori M. The Effect of Visual Experience on Perceived Haptic Verticality When Tilted in the Roll Plane. Front Neurosci 2017; 11:687. PMID: 29270109. PMCID: PMC5723665. DOI: 10.3389/fnins.2017.00687.
Abstract
The orientation of the body in space can influence the perception of verticality, sometimes leading to biases consistent with priors peaked at the most common head and body orientation, that is, upright. In this study, we investigated haptic perception of verticality in sighted individuals and in early and late blind adults when tilted counterclockwise in the roll plane. Participants performed a stimulus orientation discrimination task with their body tilted 90° to their left side relative to gravity. Stimuli were presented using a motorized haptic bar. To test whether different reference frames relative to the head influenced perception of verticality, we varied the position of the stimulus along the body's longitudinal axis. Depending on stimulus position, sighted participants tended to show biases away from or toward their body tilt. Visually impaired individuals instead showed a different pattern of verticality estimates: a bias toward the head and body tilt (i.e., the Aubert effect) was observed in late blind individuals, whereas, interestingly, no strong biases were observed in early blind individuals. Overall, these results posit visual sensory information as fundamental in shaping the haptic readout of proprioceptive and vestibular information about body orientation relative to gravity. The acquisition of an idiotropic vector signaling the upright might take place through vision during development. In early blind individuals, independent spatial navigation experience, likely enhanced by echolocation behavior, might play a role in such acquisition. In participants with late-onset blindness, early experience of vision might lead them to anchor their visually acquired priors to the haptic modality, with no disambiguation between head and body references as observed in sighted individuals (Fraser et al., 2015). With this study, we investigated haptic perception of gravity direction at unusual body tilts when vision is absent due to visual impairment; our findings thus throw light on the influence of proprioceptive/vestibular sensory information on haptically perceived verticality in blind individuals, showing how this phenomenon is affected by visual experience.
Affiliation(s)
- Luigi F Cuturi, Unit for Visually Impaired People, Science and Technology for Children and Adults, Istituto Italiano di Tecnologia, Genoa, Italy
- Monica Gori, Unit for Visually Impaired People, Science and Technology for Children and Adults, Istituto Italiano di Tecnologia, Genoa, Italy
15. Rowan D, Papadopoulos T, Archer L, Goodhew A, Cozens H, Lopez RG, Edwards D, Holmes H, Allen R. The detection of 'virtual' objects using echoes by humans: Spectral cues. Hear Res 2017; 350:205-216. PMID: 28511103. DOI: 10.1016/j.heares.2017.04.008.
Abstract
Some blind people use echoes to detect discrete, silent objects to support their spatial orientation/navigation, independence, safety and wellbeing. The acoustical features that people use for this are not well understood. Listening to changes in spectral shape due to the presence of an object could be important for object detection and avoidance, especially at short range, although it is currently not known whether it is possible with echolocation-related sounds. Bands of noise were convolved with recordings of binaural impulse responses of objects in an anechoic chamber to create 'virtual objects', which were analysed and played to sighted and blind listeners inexperienced in echolocation. The sounds were also manipulated to remove cues unrelated to spectral shape. Most listeners could accurately detect hard flat objects using changes in spectral shape. The useful spectral changes for object detection occurred above approximately 3 kHz, as with object localisation. However, energy in the sounds below 3 kHz was required to exploit changes in spectral shape for object detection, whereas energy below 3 kHz impaired object localisation. Further recordings showed that the spectral changes were diminished by room reverberation. While good high-frequency hearing is generally important for echolocation, the optimal echo-generating stimulus will probably depend on the task.
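The 'virtual objects' here are made by convolving an excitation sound with measured binaural impulse responses; a minimal sketch of that step (the function and argument names are hypothetical stand-ins for the paper's recordings; the two impulse responses are assumed to have equal length):

```python
import numpy as np
from scipy.signal import fftconvolve

def make_virtual_object(excitation: np.ndarray,
                        brir_left: np.ndarray,
                        brir_right: np.ndarray) -> np.ndarray:
    """Convolve one excitation signal (e.g., a band of noise) with the
    left/right binaural impulse responses measured with the object
    present, yielding a stereo 'virtual object' stimulus."""
    left = fftconvolve(excitation, brir_left)
    right = fftconvolve(excitation, brir_right)
    out = np.column_stack([left, right])
    return out / np.max(np.abs(out))  # normalize to avoid clipping

# Hypothetical usage with a noise band already filtered to the band of interest:
# stim = make_virtual_object(noise_band, brir_l, brir_r)
```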
Affiliation(s)
- Daniel Rowan, Institute of Sound and Vibration Research, University of Southampton, Southampton, Hants, SO17 1BJ, UK
- Timos Papadopoulos, Biodiversity Institute, Department of Zoology, and Machine Learning Research Group, Department of Engineering Science, University of Oxford, Oxford, UK
- David Edwards, Yeovil District Hospital NHS Foundation Trust, Yeovil, UK
- Robert Allen, Institute of Sound and Vibration Research, University of Southampton, Southampton, Hants, SO17 1BJ, UK
16. Kolarik AJ, Scarfe AC, Moore BCJ, Pardhan S. Blindness enhances auditory obstacle circumvention: Assessing echolocation, sensory substitution, and visual-based navigation. PLoS One 2017; 12:e0175750. PMID: 28407000. PMCID: PMC5391114. DOI: 10.1371/journal.pone.0175750.
Abstract
Performance in an obstacle circumvention task was assessed under conditions of visual, auditory-only (using echolocation), and tactile (using a sensory substitution device, SSD) guidance. A Vicon motion capture system was used to measure human movement kinematics objectively. Ten normally sighted participants, 8 blind non-echolocators, and 1 blind expert echolocator navigated around a 0.6 × 2 m obstacle positioned at the participant's midline or 25 cm to the right or left, varied across trials. Although visual guidance was the most effective, participants successfully circumvented the obstacle in the majority of trials under auditory or SSD guidance. Using audition, blind non-echolocators navigated more effectively than blindfolded sighted individuals, with fewer collisions, lower movement times, fewer velocity corrections, and greater obstacle detection ranges. The blind expert echolocator displayed performance similar to or better than that of the other groups using audition, but comparable to that of the other groups using the SSD. The generally better performance of blind than of sighted participants is consistent with the perceptual enhancement hypothesis that individuals with severe visual deficits develop improved auditory abilities to compensate for visual loss, shown here by faster, more fluid, and more accurate navigation around obstacles using sound.
Affiliation(s)
- Andrew J. Kolarik, Vision and Eye Research Unit (VERU), Postgraduate Medical Institute, Anglia Ruskin University, Cambridge, United Kingdom; Department of Psychology, University of Cambridge, Cambridge, United Kingdom; Centre for the Study of the Senses, Institute of Philosophy, University of London, London, United Kingdom
- Amy C. Scarfe, Vision and Eye Research Unit (VERU), Postgraduate Medical Institute, Anglia Ruskin University, Cambridge, United Kingdom; Department of Clinical Engineering, Medical Imaging and Medical Physics Directorate, Sheffield Teaching Hospitals NHS Foundation Trust, Sheffield, United Kingdom
- Brian C. J. Moore, Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Shahina Pardhan, Vision and Eye Research Unit (VERU), Postgraduate Medical Institute, Anglia Ruskin University, Cambridge, United Kingdom
17. Cuturi LF, Aggius-Vella E, Campus C, Parmiggiani A, Gori M. From science to technology: Orientation and mobility in blind children and adults. Neurosci Biobehav Rev 2016; 71:240-251. DOI: 10.1016/j.neubiorev.2016.08.019.
18. Thaler L, Goodale MA. Echolocation in humans: an overview. Wiley Interdisciplinary Reviews: Cognitive Science 2016; 7:382-393. PMID: 27538733. DOI: 10.1002/wcs.1408.
Abstract
Bats and dolphins are known for their ability to use echolocation. They emit bursts of sounds and listen to the echoes that bounce back to detect the objects in their environment. What is not as well-known is that some blind people have learned to do the same thing, making mouth clicks, for example, and using the returning echoes from those clicks to sense obstacles and objects of interest in their surroundings. The current review explores some of the research that has examined human echolocation and the changes that have been observed in the brains of echolocation experts. We also discuss potential applications and assistive technology based on echolocation. Blind echolocation experts can sense small differences in the location of objects, differentiate between objects of various sizes and shapes, and even between objects made of different materials, just by listening to the reflected echoes from mouth clicks. It is clear that echolocation may enable some blind people to do things that are otherwise thought to be impossible without vision, potentially providing them with a high degree of independence in their daily lives and demonstrating that echolocation can serve as an effective mobility strategy in the blind. Neuroimaging has shown that the processing of echoes activates brain regions in blind echolocators that would normally support vision in the sighted brain, and that the patterns of these activations are modulated by the information carried by the echoes. This work is shedding new light on just how plastic the human brain is.
Affiliation(s)
- Lore Thaler, Department of Psychology, Durham University, Durham, UK
- Melvyn A Goodale, The Brain and Mind Institute, Department of Psychology, University of Western Ontario, Ontario, Canada
19. An assessment of auditory-guided locomotion in an obstacle circumvention task. Exp Brain Res 2016; 234:1725-35. PMID: 26879767. PMCID: PMC4851710. DOI: 10.1007/s00221-016-4567-y.
Abstract
This study investigated how effectively audition can be used to guide navigation around an obstacle. Ten blindfolded, normally sighted participants navigated around a 0.6 × 2 m obstacle while producing self-generated mouth-click sounds. Objective movement performance was measured using a Vicon motion capture system. Performance with full vision, without generating sound, was used as a baseline for comparison. The obstacle's location was varied randomly from trial to trial: it was either straight ahead or 25 cm to the left or right relative to the participant. Although audition provided sufficient information to detect the obstacle and guide participants around it without collision in the majority of trials, buffer space (clearance between the shoulder and obstacle), overall movement times, and the number of velocity corrections were significantly (p < 0.05) greater with auditory guidance than with visual guidance. Collisions sometimes occurred under auditory guidance, suggesting that audition did not always provide an accurate estimate of the space between the participant and the obstacle. Unlike under visual guidance, participants did not always walk around the side that afforded the most space during auditory guidance. Mean buffer space was 1.8 times higher under auditory than under visual guidance. The results suggest that sound can be used to generate buffer space when vision is unavailable, allowing navigation around an obstacle without collision in the majority of trials.
20. Wallmeier L, Kish D, Wiegrebe L, Flanagin VL. Aural localization of silent objects by active human biosonar: neural representations of virtual echo-acoustic space. Eur J Neurosci 2015; 41:533-45. PMID: 25728174. DOI: 10.1111/ejn.12843.
Abstract
Some blind humans have developed the remarkable ability to detect and localize objects through the auditory analysis of self-generated tongue clicks. These echolocation experts show a corresponding increase in 'visual' cortex activity when listening to echo-acoustic sounds. Echolocation in real-life settings involves multiple reflections as well as active sound production, neither of which has been systematically addressed. We developed a virtualization technique that allows participants to actively perform such biosonar tasks in virtual echo-acoustic space during magnetic resonance imaging (MRI). Tongue clicks, emitted in the MRI scanner, are picked up by a microphone, convolved in real time with the binaural impulse responses of a virtual space, and presented via headphones as virtual echoes. In this manner, we investigated the brain activity during active echo-acoustic localization tasks. Our data show that, in blind echolocation experts, activations in the calcarine cortex are dramatically enhanced when a single reflector is introduced into otherwise anechoic virtual space. A pattern-classification analysis revealed that, in the blind, calcarine cortex activation patterns could discriminate left-side from right-side reflectors. This was found in both blind experts, but the effect was significant for only one of them. In sighted controls, 'visual' cortex activations were insignificant, but activation patterns in the planum temporale were sufficient to discriminate left-side from right-side reflectors. Our data suggest that blind and echolocation-trained, sighted subjects may recruit different neural substrates for the same active-echolocation task.
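The core of the virtualization described above is real-time convolution of the microphone signal with the binaural impulse responses of the virtual space. A minimal block-based sketch of that idea (offline overlap-add FFT convolution; the block size and names are illustrative, and a real-time system would do the same work per audio callback):

```python
import numpy as np
from scipy.signal import fftconvolve

BLOCK = 1024  # samples per processing block

def virtual_echoes(mic: np.ndarray, brir_l: np.ndarray, brir_r: np.ndarray) -> np.ndarray:
    """Block-wise overlap-add convolution of a (tongue-click) microphone
    stream with left/right binaural impulse responses of a virtual room,
    returning the stereo echo signal for headphone presentation."""
    n = len(mic) + max(len(brir_l), len(brir_r)) - 1
    out = np.zeros((n, 2))
    for start in range(0, len(mic), BLOCK):
        block = mic[start:start + BLOCK]
        # Each block's convolution tail overlaps and adds into the output.
        out[start:start + len(block) + len(brir_l) - 1, 0] += fftconvolve(block, brir_l)
        out[start:start + len(block) + len(brir_r) - 1, 1] += fftconvolve(block, brir_r)
    return out
```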
Affiliation(s)
- Ludwig Wallmeier, Graduate School of Systemic Neuroscience, Ludwig-Maximilians-Universität München, Großhaderner Str. 2, 82152 Planegg-Martinsried, Germany; German Center for Vertigo and Balance Disorders, Ludwig-Maximilians-Universität München, Munich, Germany; Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Großhaderner Str. 2, 82152 Planegg-Martinsried, Germany
21. Wallmeier L, Wiegrebe L. Ranging in human sonar: effects of additional early reflections and exploratory head movements. PLoS One 2014; 9:e115363. PMID: 25551226. PMCID: PMC4281102. DOI: 10.1371/journal.pone.0115363.
Abstract
Many blind people rely on echoes from self-produced sounds to assess their environment. It has been shown that human subjects can use echolocation for directional localization and orientation in a room, but echo-acoustic distance perception (e.g. to determine one's position in a room) has received little scientific attention, and systematic studies on the influence of additional early reflections and exploratory head movements are lacking. This study investigates echo-acoustic distance discrimination in virtual echo-acoustic space, using the impulse responses of a real corridor. Six blindfolded sighted subjects and a blind echolocation expert had to discriminate between two positions in the virtual corridor, which differed in their distance to the front wall, but not to the lateral walls. To solve this task, participants evaluated echoes that were generated in real time from self-produced vocalizations. Across experimental conditions, we systematically varied the restrictions on head rotations, the subjects' orientation in virtual space, and the reference position. Three key results were observed. First, all participants successfully solved the task, with discrimination thresholds below 1 m for all reference distances (0.75-4 m). Performance was best for the smallest reference distance of 0.75 m, with thresholds around 20 cm. Second, distance discrimination performance was relatively robust against additional early reflections, compared with other echolocation tasks like directional localization. Third, free head rotations during echolocation can improve distance discrimination performance in complex environmental settings; however, head movements do not necessarily provide a benefit over static echolocation from an optimal single orientation. These results show that accurate distance discrimination through echolocation is possible over a wide range of reference distances and environmental conditions. This is an important functional benefit of human echolocation, which may also play a major role in the calibration of auditory space representations.
Affiliation(s)
- Ludwig Wallmeier, Graduate School of Systemic Neuroscience, Ludwig-Maximilians-Universität München, Planegg-Martinsried, Germany; Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Planegg-Martinsried, Germany
- Lutz Wiegrebe, Graduate School of Systemic Neuroscience, Ludwig-Maximilians-Universität München, Planegg-Martinsried, Germany; Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Planegg-Martinsried, Germany