1
Jiang P, Rossiter J, Kent C. Auditory and tactile frequency mapping for visual distance perception: A step forward in sensory substitution and augmentation. PLoS One 2025;20:e0318354. PMID: 40029888; PMCID: PMC11875370; DOI: 10.1371/journal.pone.0318354.
Abstract
Vision is crucial for daily tasks and interacting with the environment, but visual impairment can hinder these activities. Many sensory substitution products and studies prioritize providing abundant and accurate information, yet often overlook the inherent relationship between different modalities, potentially preventing users from receiving information intuitively. This study investigated the representation of visual distance using auditory and vibrotactile frequency through a series of psychological cross-modal matching experiments. By establishing mapping functions between auditory/vibrotactile frequency and visual distance, we aim to facilitate the design of sensory substitution devices that take visual distance information (ranging from 1 m to 12 m) and convert it into non-visual information (auditory frequency in the range 47-2764 Hz or vibrotactile frequency in the range 10-99 Hz). The results show distinct patterns in the correlation between visual distance and frequency in both the auditory (auditory frequency-to-visual distance) and vibrotactile (vibrotactile frequency-to-visual distance) domains. The prevailing trend (59% of participants) was a monotonic negative correlation (i.e., higher frequencies are associated with shorter distances), while 24% of participants demonstrated a consistently positive correlation. Additionally, we compare this study with our previous investigations of the reverse cross-modal mappings, visual distance-to-auditory frequency and visual distance-to-vibrotactile frequency. The two studies share common patterns (negative and positive correlations), suggesting a bidirectional mapping between visual distance and frequency in both the auditory and vibrotactile domains, and pointing to new sensory substitution devices for people with visual impairment that integrate the underlying cross-modal mechanisms to enable more intuitive and natural human-machine interaction.
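For device design, the reported majority mapping can be illustrated concretely. The fitted mapping functions are not reproduced in the abstract, so the sketch below assumes a simple geometric (log-linear) interpolation across the stated distance and frequency ranges, following the negative-correlation pattern (higher frequency = shorter distance); all names and constants are illustrative.

```python
import math

# Assumed ranges from the abstract; the interpolation itself is an
# illustrative assumption, not the paper's fitted function.
D_MIN, D_MAX = 1.0, 12.0              # visual distance range (m)
AUDIO_MIN, AUDIO_MAX = 47.0, 2764.0   # auditory frequency range (Hz)
VIBRO_MIN, VIBRO_MAX = 10.0, 99.0     # vibrotactile frequency range (Hz)

def distance_to_frequency(d_m, f_min, f_max):
    """Map a distance in [1, 12] m to a frequency; shorter distance -> higher frequency."""
    d = min(max(d_m, D_MIN), D_MAX)   # clamp to the valid range
    t = (math.log(d) - math.log(D_MIN)) / (math.log(D_MAX) - math.log(D_MIN))
    # geometric interpolation from f_max (near) down to f_min (far)
    return f_max * (f_min / f_max) ** t

print(distance_to_frequency(1.5, AUDIO_MIN, AUDIO_MAX))   # near range -> high pitch
print(distance_to_frequency(10.0, VIBRO_MIN, VIBRO_MAX))  # far range  -> low vibration
```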
Affiliation(s)
- Pingping Jiang
- School of Engineering Mathematics and Technology, University of Bristol, Bristol, United Kingdom
- SoftLab, Bristol Robotics Laboratory, Bristol, United Kingdom
- Jonathan Rossiter
- School of Engineering Mathematics and Technology, University of Bristol, Bristol, United Kingdom
- SoftLab, Bristol Robotics Laboratory, Bristol, United Kingdom
- Christopher Kent
- School of Psychological Science, University of Bristol, Bristol, United Kingdom
2
Lian Y, Liu DE, Ji WZ. Survey and analysis of the current status of research in the field of outdoor navigation for the blind. Disabil Rehabil Assist Technol 2024;19:1657-1675. PMID: 37402242; DOI: 10.1080/17483107.2023.2227224.
Abstract
PURPOSE: Given the diverse types and incomplete functionality of navigation aids for the blind, we comprehensively review the current situation and research on technology related to outdoor travel for blind and visually impaired people (BVIP), to provide a reference for related research in outdoor travel for BVIP and blind navigation.
MATERIALS AND METHODS: We compiled articles related to blind navigation, of which 227 met the search criteria. From this initial set, 179 articles were selected and analysed, from a technical point of view, across five aspects of blind navigation: system equipment, data sources, guidance algorithms, optimization of related methods, and navigation maps.
RESULTS: Wearable assistive devices for the blind are the most heavily researched form, followed by handheld aids. RGB data from vision sensors is the most common source of navigation-environment information. Object detection on image data is likewise well represented among guidance algorithms and associated methods, indicating that computer vision has become an important topic in blind navigation. Research on navigation maps, however, is comparatively scarce.
CONCLUSIONS: The study and development of assistive equipment for BVIP will increasingly prioritize attributes such as lightness, portability, and efficiency. In light of the coming driverless era, research will focus on visual sensors and computer vision technologies that can aid navigation for the blind.
Affiliation(s)
- Yue Lian
- School of Civil Engineering and Mapping, Jiangxi University of Technology, Ganzhou, Jiangxi, China
- De-Er Liu
- School of Civil Engineering and Mapping, Jiangxi University of Technology, Ganzhou, Jiangxi, China
- Wei-Zhen Ji
- State Key Laboratory of Remote Sensing Science, Beijing Normal University, Beijing, China
3
Jiang P, Kent C, Rossiter J. Towards sensory substitution and augmentation: Mapping visual distance to audio and tactile frequency. PLoS One 2024;19:e0299213. PMID: 38530828; DOI: 10.1371/journal.pone.0299213.
Abstract
Multimodal perception is the predominant means by which individuals experience and interact with the world. However, sensory dysfunction or loss can significantly impede this process. In such cases, cross-modality research offers valuable insight into how we can compensate for these sensory deficits through sensory substitution. Although sight and hearing are both used to estimate the distance to an object (e.g., by visual size and sound volume), and the perception of distance is an important element in navigation and guidance, distance is not widely studied in cross-modal research. We investigate the relationship between audio and vibrotactile frequencies (in the ranges 47-2764 Hz and 10-99 Hz, respectively) and distances uniformly distributed in the range 1-12 m. In our experiments, participants mapped a distance (represented by an image of a model at that distance) to a frequency by adjusting a virtual tuning knob. The results revealed that the majority (more than 76%) of participants demonstrated a strong negative monotonic relationship between frequency and distance, across both the vibrotactile domain (well described by a natural log function) and the auditory domain (well described by an exponential function). However, a subgroup of participants showed the opposite, positive linear relationship between frequency and distance. This strong cross-modal sensory correlation could contribute to the development of assistive robotic technologies and devices that augment human perception. The work provides a foundation for future assisted human-robot interaction (HRI) applications where a mapping between distance and frequency is needed, for example for people with vision or hearing loss, drivers with loss of focus or delayed responses, doctors performing teleoperated surgery, and users in augmented reality (AR) or virtual reality (VR) environments.
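The two functional forms named in the abstract (a natural log function for the vibrotactile domain, an exponential function for the auditory domain) can be fitted to matching data as in the minimal sketch below. The arrays are synthetic stand-ins, not the study's data, and the initial guesses are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic stand-in responses (NOT the study's data), chosen only to show
# the shape of each fit: frequency falls as distance grows.
distances = np.array([1, 2, 4, 6, 8, 10, 12], dtype=float)             # m
vibro_hz = np.array([95, 70, 48, 35, 26, 18, 12], dtype=float)         # synthetic
audio_hz = np.array([2600, 1500, 620, 300, 160, 90, 50], dtype=float)  # synthetic

def log_model(d, a, b):   # vibrotactile form: f = a*ln(d) + b
    return a * np.log(d) + b

def exp_model(d, a, b):   # auditory form: f = a*exp(b*d)
    return a * np.exp(b * d)

(v_a, v_b), _ = curve_fit(log_model, distances, vibro_hz)
(a_a, a_b), _ = curve_fit(exp_model, distances, audio_hz, p0=(2700.0, -0.3))
print(f"vibrotactile: f = {v_a:.1f}*ln(d) + {v_b:.1f}")
print(f"auditory:     f = {a_a:.1f}*exp({a_b:.3f}*d)")
```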
Affiliation(s)
- Pingping Jiang
- Department of Engineering Mathematics, University of Bristol, Bristol, United Kingdom
- SoftLab, Bristol Robotics Laboratory, Bristol, United Kingdom
- Christopher Kent
- School of Psychological Science, University of Bristol, Bristol, United Kingdom
- Jonathan Rossiter
- Department of Engineering Mathematics, University of Bristol, Bristol, United Kingdom
- SoftLab, Bristol Robotics Laboratory, Bristol, United Kingdom
4
Lee JW, Yu KH. Wearable Drone Controller: Machine Learning-Based Hand Gesture Recognition and Vibrotactile Feedback. Sensors (Basel) 2023;23:2666. PMID: 36904870; PMCID: PMC10006975; DOI: 10.3390/s23052666.
Abstract
We propose a wearable drone controller with hand gesture recognition and vibrotactile feedback. The intended hand motions of the user are sensed by an inertial measurement unit (IMU) placed on the back of the hand, and the signals are analyzed and classified using machine learning models. The recognized hand gestures control the drone, and obstacle information in the drone's heading direction is fed back to the user by activating a vibration motor attached to the wrist. Simulation experiments for drone operation were performed, and the participants' subjective evaluations of the controller's convenience and effectiveness were collected. Finally, experiments with a real drone were conducted and discussed to validate the proposed controller.
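As a rough illustration of the recognition stage, here is a minimal sketch of classifying windowed IMU signals with a generic machine learning model. The paper's actual features, models, and gesture set are not specified in the abstract, so the windowing, statistics, classifier, and data below are all illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def imu_window_features(window):
    """window: (n_samples, 6) array of accel xyz + gyro xyz readings."""
    # Simple per-axis statistics as features (assumed, not from the paper).
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

rng = np.random.default_rng(0)
# Synthetic stand-in data: 100 windows of 50 IMU samples, 4 gesture classes.
windows = rng.normal(size=(100, 50, 6))
labels = rng.integers(0, 4, size=100)   # e.g. up/down/left/right commands

X = np.array([imu_window_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.predict(X[:3]))   # in practice, map predicted class -> drone command
```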
Affiliation(s)
- Ji-Won Lee
- KEPCO Research Institute, Daejeon 34056, Republic of Korea
- Kee-Ho Yu
- Department of Aerospace Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Future Air Mobility Research Center, Jeonbuk National University, Jeonju 54896, Republic of Korea
5
Adil E, Mikou M, Mouhsen A. A novel algorithm for distance measurement using stereo camera. CAAI Trans Intell Technol 2022. DOI: 10.1049/cit2.12098.
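No abstract is available here, so the paper's specific algorithm is not summarized. For context, stereo distance measurement generally builds on the classical triangulation relation Z = f·B/d; a minimal sketch with illustrative camera parameters, not the paper's method:

```python
# Classical stereo triangulation: depth Z = f * B / d, where f is the focal
# length in pixels, B the baseline between the two cameras in metres, and d
# the disparity in pixels. Parameter values are illustrative assumptions.
def stereo_depth_m(disparity_px, focal_px=700.0, baseline_m=0.12):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

print(stereo_depth_m(42.0))   # ~2.0 m for this focal length and baseline
```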
Affiliation(s)
- Elmehdi Adil
- MISI Laboratory, Faculty of Sciences and Techniques, Hassan First University of Settat, Settat, Morocco
- Mohammed Mikou
- MISI Laboratory, Faculty of Sciences and Techniques, Hassan First University of Settat, Settat, Morocco
- Ahmed Mouhsen
- IMII Laboratory, Faculty of Sciences and Techniques, Hassan First University of Settat, Settat, Morocco
6
Dernayka A, Amorim MA, Leroux R, Bogaert L, Farcy R. Tom Pouce III, an Electronic White Cane for Blind People: Ability to Detect Obstacles and Mobility Performances. Sensors (Basel) 2021;21:6854. PMID: 34696067; PMCID: PMC8539875; DOI: 10.3390/s21206854.
Abstract
We present a protocol for evaluating the efficiency of an electronic white cane in improving the mobility of blind people. The electronic cane used during the test is the Tom Pouce III, which combines LIDAR (light detection and ranging) sensors with tactile feedback. The protocol comprises two parts. The first part, the "detection test", evaluates the efficiency of the Tom Pouce III's sensors in detecting obstacles found in everyday life (thin and large poles, apertures) under different environmental conditions (darkness, sunlight, rain). The second part, the "mobility test", compares blind participants' ability to traverse a 25 m path while avoiding obstacles using the simple white cane versus the electronic cane. The 12 blind participants had between 2 and 20 years of everyday experience with Tom Pouce devices. The results show a significant improvement in the capacity to avoid obstacles with the electronic cane relative to the simple white cane, with no difference in speed. There was no correlation between the results and users' years of experience.
Affiliation(s)
- Aya Dernayka
- Laboratoire Aimé Cotton, Université Paris-Saclay, Centre National de la Recherche Scientifique, 91405 Orsay, France
- Complexité Innovation Activités Motrices et Sportives, Université Paris-Saclay, 91405 Orsay, France
- Complexité Innovation Activités Motrices et Sportives, Université d’Orléans, 45067 Orléans, France
- Michel-Ange Amorim
- Complexité Innovation Activités Motrices et Sportives, Université Paris-Saclay, 91405 Orsay, France
- Complexité Innovation Activités Motrices et Sportives, Université d’Orléans, 45067 Orléans, France
- Roger Leroux
- Laboratoire Aimé Cotton, Université Paris-Saclay, Centre National de la Recherche Scientifique, 91405 Orsay, France
- Lucas Bogaert
- Laboratoire Aimé Cotton, Université Paris-Saclay, Centre National de la Recherche Scientifique, 91405 Orsay, France
- René Farcy
- Laboratoire Aimé Cotton, Université Paris-Saclay, Centre National de la Recherche Scientifique, 91405 Orsay, France
7
El-taher FEZ, Taha A, Courtney J, Mckeever S. A Systematic Review of Urban Navigation Systems for Visually Impaired People. Sensors (Basel) 2021;21:3103. PMID: 33946857; PMCID: PMC8125253; DOI: 10.3390/s21093103.
Abstract
Blind and visually impaired people (BVIP) face a range of practical difficulties when undertaking outdoor journeys as pedestrians. Over the past decade, a variety of assistive devices have been researched and developed to help BVIP navigate more safely and independently. In addition, research in overlapping domains is addressing the problem of automatic environment interpretation using computer vision and machine learning, particularly deep learning, approaches. Our aim in this article is to present a comprehensive review of research directly in, or relevant to, assistive outdoor navigation for BVIP. We break down the navigation area into a series of navigation phases and tasks. We then use this structure for our systematic review of research, analysing articles, methods, datasets and current limitations by task. We also provide an overview of commercial and non-commercial navigation applications targeted at BVIP. Our review contributes to the body of knowledge by providing a comprehensive, structured analysis of work in the domain, including the state of the art, and guidance on future directions. It will support both researchers and other stakeholders in the domain in establishing an informed view of research progress.
Affiliation(s)
- Fatma El-zahraa El-taher
- School of Computer Science, Technological University Dublin, D07EWV4 Dublin, Ireland
- Ayman Taha
- School of Computer Science, Technological University Dublin, D07EWV4 Dublin, Ireland
- Faculty of Computers and Artificial Intelligence, Cairo University, Cairo 12613, Egypt
- Jane Courtney
- School of Computer Science, Technological University Dublin, D07EWV4 Dublin, Ireland
- Susan Mckeever
- School of Computer Science, Technological University Dublin, D07EWV4 Dublin, Ireland
8
Tapu R, Mocanu B, Zaharia T. Wearable assistive devices for visually impaired: A state of the art survey. Pattern Recognit Lett 2020. DOI: 10.1016/j.patrec.2018.10.031.
9
When Ultrasonic Sensors and Computer Vision Join Forces for Efficient Obstacle Detection and Recognition. Sensors (Basel) 2016;16:1807. PMID: 27801834; PMCID: PMC5134466; DOI: 10.3390/s16111807.
Abstract
The most recent report published by the World Health Organization concerning people with visual disabilities highlights that by the year 2020, worldwide, the number of completely blind people will reach 75 million, while the number of visually impaired (VI) people will rise to 250 million. Within this context, the development of dedicated electronic travel aid (ETA) systems, able to increase the safe movement of VI people in indoor/outdoor spaces while providing additional cognition of the environment, becomes of utmost importance. This paper introduces a novel wearable assistive device designed to facilitate the autonomous navigation of blind and VI people in highly dynamic urban scenes. The system exploits two independent sources of information: ultrasonic sensors and the video camera embedded in a regular smartphone. The underlying methodology exploits computer vision and machine learning techniques and makes it possible to accurately identify both static and highly dynamic objects in a scene, regardless of their location, size, or shape. In addition, the proposed system is able to acquire information about the environment, semantically interpret it, and alert users to possible dangerous situations through acoustic feedback. To determine the performance of the proposed methodology, we performed an extensive objective and subjective experimental evaluation with the help of 21 VI subjects from two blind associations. The users reported that our prototype is highly helpful in increasing mobility, while being friendly and easy to learn.
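As a rough illustration of how the two information sources might be combined, here is a minimal decision-logic sketch. The system's actual fusion rules and thresholds are not given in the abstract, so the function, its parameters, and the alert format below are illustrative assumptions.

```python
# Hedged sketch: combine an ultrasonic range reading with camera-based
# detections to decide which acoustic alerts to raise. Thresholds and the
# detection format are assumed, not taken from the paper.
def assess_scene(ultrasonic_range_m, detections, warn_range_m=2.0):
    """detections: list of (label, estimated_distance_m) from a vision model."""
    alerts = []
    if ultrasonic_range_m < warn_range_m:
        alerts.append(f"obstacle ahead at {ultrasonic_range_m:.1f} m")
    for label, dist_m in detections:
        if dist_m < warn_range_m:
            alerts.append(f"{label} approaching at {dist_m:.1f} m")
    return alerts   # in the real system these would drive acoustic feedback

print(assess_scene(1.4, [("pedestrian", 1.8), ("car", 6.0)]))
```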