1
Kutbi M, Li H, Chang Y, Sun B, Li X, Cai C, Agadakos N, Hua G, Mordohai P. Egocentric Computer Vision for Hands-Free Robotic Wheelchair Navigation. J Intell Robot Syst 2023; 107:10. [DOI: 10.1007/s10846-023-01807-4]
Abstract
In this paper, we present an approach for navigating a robotic wheelchair that provides users with multiple levels of autonomy and navigation capabilities to fit their individual needs and preferences. We focus on three main aspects: (i) egocentric computer vision based motion control that provides a natural human-robot interface for wheelchair users with impaired hand usage; (ii) techniques that enable the user to initiate autonomous navigation to a location, object, or person without using the hands; and (iii) a framework that learns to navigate the wheelchair according to its user's, often subjective, criteria and preferences. These contributions are evaluated qualitatively and quantitatively in user studies with several subjects, demonstrating their effectiveness. Although these studies were conducted with healthy subjects, they indicate that clinical tests of the proposed technology can be initiated.
2
de Sá AAR, Morère Y, Naves ELM. Skills assessment metrics of electric powered wheelchair driving in a virtual environment: a survey. Med Biol Eng Comput 2022; 60:323-335. [PMID: 35013870] [DOI: 10.1007/s11517-022-02500-8]
Abstract
The purpose of this review is to present studies on the parameters used to assess the skills of users of electric wheelchair driving simulators in a virtual environment. In addition, this study aims to identify the most widely used and validated parameters for quantifying electric wheelchair driving ability in a virtual environment and to suggest challenges for future research. To carry out this research, the criteria of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) were adopted. Literature searches in English, French, and Portuguese were conducted up to December 2020 in the PubMed, SciELO, Science Direct, World Wide Science, and Scopus databases. The keywords used were electric wheelchair, simulator, performance indicators, performance skills, driving skills, training platform, virtual environment, and virtual reality. We excluded studies involving "real" wheelchairs without a simulator in a virtual environment. We selected a total of 42 articles. In these studies, we identified 32 parameters (3 qualitative and 29 quantitative) used to evaluate the ability to control a powered wheelchair in a virtual environment. Although the amount of research in this area has increased significantly in recent years, additional studies are still needed to provide a more accurate and objective assessment of skills among the target population. A challenge for future work is the increasing application of artificial intelligence techniques and the exploration of biomedical data measurements, which may be a promising alternative to improve the quantification of user competencies.
Affiliation(s)
- Angela A R de Sá: Faculty of Electrical Engineering, Federal University of Uberlândia, Assistive Technologies Group, Av Joao Naves de Avila, 2160 - Bloco 3N, Uberlandia, Brazil
- Yann Morère: LCOMS - Laboratoire de Conception, Optimisation et Modélisation des Systèmes, Université de Lorraine, 7 rue Marconi, 57070 Metz, France
- Eduardo L M Naves: Faculty of Electrical Engineering, Federal University of Uberlândia, Assistive Technologies Group, Av Joao Naves de Avila, 2160 - Bloco 3N, Uberlandia, Brazil
3
Gil Ó, Garrell A, Sanfeliu A. Social Robot Navigation Tasks: Combining Machine Learning Techniques and Social Force Model. Sensors (Basel) 2021; 21:7087. [PMID: 34770395] [DOI: 10.3390/s21217087]
Abstract
Social robot navigation in public spaces, buildings, or private houses is a difficult problem that is not well solved due to environmental constraints (buildings, static objects, etc.), pedestrians, and other mobile vehicles. Moreover, robots have to move in a human-aware manner, that is, robots have to navigate in such a way that people feel safe and comfortable. In this work, we present two navigation tasks, social robot navigation and robot accompaniment, which combine machine learning techniques with the Social Force Model (SFM) to allow human-aware social navigation. The robots in both approaches use data from different sensors to capture knowledge of the environment as well as information about pedestrian motion. The two navigation tasks make use of the SFM, a general framework in which human motion behaviors are expressed through a set of functions depending on the pedestrians' relative and absolute positions and velocities. Additionally, in both social navigation tasks, the robot's motion behavior is learned using machine learning techniques: in the first case using supervised deep learning and, in the second case, using Reinforcement Learning (RL). The machine learning techniques are combined with the SFM to create navigation models that behave in a social manner when the robot is navigating in an environment with pedestrians or accompanying a person. The systems were validated in a large set of simulations and real-life experiments with a new humanoid robot named IVO and with an aerial robot. The experiments show that the combination of SFM and machine learning can solve human-aware robot navigation in complex dynamic environments.
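The SFM mentioned in the abstract models pedestrian motion as a sum of an attractive goal force and exponential repulsive forces from nearby people. A minimal sketch of that idea follows; the parameter values and force shapes are illustrative assumptions, not those used in the cited paper.

```python
import numpy as np

def goal_force(pos, vel, goal, desired_speed=1.0, tau=0.5):
    """Attractive force relaxing the agent's velocity toward its goal direction."""
    direction = (goal - pos) / np.linalg.norm(goal - pos)
    return (desired_speed * direction - vel) / tau

def social_force(pos, other_pos, A=2.0, B=0.3):
    """Repulsive force from a nearby pedestrian, decaying exponentially with distance."""
    diff = pos - other_pos
    dist = np.linalg.norm(diff)
    return A * np.exp(-dist / B) * diff / dist

def total_force(pos, vel, goal, pedestrians):
    """Net SFM force: goal attraction plus repulsion from each pedestrian."""
    pos, vel, goal = (np.asarray(x, float) for x in (pos, vel, goal))
    f = goal_force(pos, vel, goal)
    for p in pedestrians:
        f += social_force(pos, np.asarray(p, float))
    return f

# Agent at the origin heading toward (5, 0), one pedestrian slightly above the path:
# the goal pulls along +x while the pedestrian pushes gently toward -y.
f = total_force([0.0, 0.0], [0.0, 0.0], [5.0, 0.0], [[1.0, 0.5]])
```

In the cited work these hand-tuned force terms are replaced or weighted by learned components; the structure of the sum, however, is the standard SFM formulation.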
4
Xi L, Shino M. Shared Control of an Electric Wheelchair Considering Physical Functions and Driving Motivation. Int J Environ Res Public Health 2020; 17:E5502. [PMID: 32751490] [DOI: 10.3390/ijerph17155502]
Abstract
Individuals with severe physical impairments have difficulty operating electric wheelchairs (EWs), especially in situations that require fine steering. Automatic driving partly solves the problem, although excessive reliance on it is not conducive to maintaining residual physical function and may lead to more serious conditions in the future. The objective of this study was to develop a shared control system that can adapt to different environments by fully utilizing the operating ability of the user while maintaining the user's motivation to drive. The operating characteristics of individuals with severe physical impairments were first analyzed to understand their difficulties when operating EWs. Subsequently, a novel reinforcement learning-based shared control method was proposed to adjust the control weight between the user and the machine, fully exploiting the operating abilities of users while assisting them when necessary. Experimental results showed that the proposed system gradually adjusted the control weights between the user and the machine, providing safe operation of the EW while ensuring full use of the user's control signals. The shared control results were also found to be strongly affected by the type of user.
5
6
7
Abstract
This paper presents the application of a mobile robot designed as an Assistant Personal Robot (APR) as a walk-helper tool. The hypothesis is that the height and weight of this mobile robot can also be used to provide dynamic physical support and guidance to people while they walk. This functionality is presented as a soft walking aid at home, not as a substitute for an assistive cane or walker, which can withstand higher loads and provide better stability during walking. The APR operates as a walk-helper by providing user interaction through the robot's original arms and by using its onboard sensors to avoid obstacles and guide the walk through free areas. The experiments conducted with the walk-helper showed the automatic generation of smooth walking trajectories and a reduction in the number of manual trajectory corrections required to complete a walk.
8
Rabhi Y, Mrabet M, Fnaiech F. Intelligent Control Wheelchair Using a New Visual Joystick. J Healthc Eng 2018; 2018:6083565. [PMID: 29599953] [PMCID: PMC5823424] [DOI: 10.1155/2018/6083565]
Abstract
A new control system for a hand gesture-controlled electric wheelchair (EWC) is proposed. This smart control device is suitable for the large number of patients who cannot manipulate a standard wheelchair joystick. The movement control system uses a camera fixed on the wheelchair. The patient's hand movements are recognized using a visual recognition algorithm and artificial intelligence software, and the derived signals are used to control the EWC in real time. One of the main features of this control technique is that it allows the patient to drive the wheelchair at a variable speed, similar to a standard joystick. The device was built at low cost and has been tested on real patients with good results. Before testing the proposed control device, we created a three-dimensional environment simulator to evaluate its performance in complete safety. These tests were performed on real patients with diverse hand pathologies at the Mohamed Kassab National Institute of Orthopedics, Physical and Functional Rehabilitation Hospital of Tunis, and the validity of this intelligent control system was demonstrated.
Affiliation(s)
- Yassine Rabhi: Laboratoire SIME, Ecole Nationale Supérieure d'Ingénieurs de Tunis (ENSIT), Université de Tunis, 5 Av. Taha Hussein, 1008 Tunis, Tunisia
- Makrem Mrabet: Laboratoire SIME, Ecole Nationale Supérieure d'Ingénieurs de Tunis (ENSIT), Université de Tunis, 5 Av. Taha Hussein, 1008 Tunis, Tunisia
- Farhat Fnaiech: Laboratoire SIME, Ecole Nationale Supérieure d'Ingénieurs de Tunis (ENSIT), Université de Tunis, 5 Av. Taha Hussein, 1008 Tunis, Tunisia
9
Abstract
This paper considers applications where a human agent navigates a semi-autonomous mobile robot in an environment with obstacles. The human input to the robot can be based on a desired navigation objective, which may not be known to the robot. Additionally, the semi-autonomous robot can be programmed to ensure obstacle avoidance as it navigates the environment. A shared control architecture can be used to appropriately fuse the human and autonomy inputs into a net control input that drives the robot. In this paper, an adaptive, near-continuous control allocation function is included in the shared controller, which continuously varies the control effort exerted by the human and the autonomy based on the position of the robot relative to obstacles. The developed control allocation function lets the human freely navigate the robot when away from obstacles, while the autonomy control input progressively dominates as the robot approaches them. A harmonic potential field-based non-linear sliding mode controller is developed to obtain the autonomy control input for obstacle avoidance. In addition, a robust feed-forward term is included in the autonomy control input to maintain stability in the presence of adverse human inputs, which can be critical in applications such as smart wheelchairs, where erroneous human inputs could cause collisions or roll-over. Lyapunov-based stability analysis is presented to guarantee finite-time stability of the developed shared controller, i.e., the autonomy guarantees obstacle avoidance as the human navigates the robot. Experimental results are provided to validate the performance of the developed shared controller.
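The allocation idea described above can be sketched as a convex blend of the human and autonomy inputs, with a weight that depends on the distance to the nearest obstacle. The sigmoid shape and parameter values below are assumptions for illustration; the cited paper uses its own adaptive, near-continuous allocation function.

```python
import numpy as np

def allocation(d_obstacle, d_safe=1.0, k=5.0):
    """Weight on the human input: close to 1 far from obstacles,
    close to 0 as the robot nears them (sigmoid is an assumed shape)."""
    return 1.0 / (1.0 + np.exp(-k * (d_obstacle - d_safe)))

def shared_input(u_human, u_autonomy, d_obstacle):
    """Convex blend of human and autonomy control inputs."""
    a = allocation(d_obstacle)
    return a * np.asarray(u_human, float) + (1 - a) * np.asarray(u_autonomy, float)

# Far from obstacles (3 m): the human command dominates.
u_far = shared_input([1.0, 0.0], [0.0, 1.0], 3.0)
# Near an obstacle (0.1 m): the autonomy's avoidance command dominates.
u_near = shared_input([1.0, 0.0], [0.0, 1.0], 0.1)
```

The blend is continuous in the robot's position, so the handover between human and autonomy avoids abrupt switches in the commanded velocity.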
10
Viswanathan P, Zambalde EP, Foley G, Graham JL, Wang RH, Adhikari B, Mackworth AK, Mihailidis A, Miller WC, Mitchell IM. Intelligent wheelchair control strategies for older adults with cognitive impairment: user attitudes, needs, and preferences. Auton Robots 2017; 41:539-554. [DOI: 10.1007/s10514-016-9568-y]
11
12
Affiliation(s)
- Wenxia Xu: Key Laboratory of Ministry of Education for Image Processing and Intelligent Control, School of Automation, Huazhong University of Science and Technology, Wuhan, China
- Jian Huang: Key Laboratory of Ministry of Education for Image Processing and Intelligent Control, School of Automation, Huazhong University of Science and Technology, Wuhan, China
- Yongji Wang: Key Laboratory of Ministry of Education for Image Processing and Intelligent Control, School of Automation, Huazhong University of Science and Technology, Wuhan, China
- Lei Cheng: School of Information Science and Engineering, Wuhan University of Science and Technology, Wuhan, China
13
14
15
Urdiales C, Pérez EJ, Peinado G, Fdez-Carmona M, Peula JM, Annicchiarico R, Sandoval F, Caltagirone C. On the construction of a skill-based wheelchair navigation profile. IEEE Trans Neural Syst Rehabil Eng 2013; 21:917-927. [PMID: 23475373] [DOI: 10.1109/tnsre.2013.2241454]
Abstract
Assisted wheelchair navigation is of key importance for persons with severe disabilities. The problem has been solved in different ways, usually based on the shared control paradigm, which consists of giving the user more or less control on a need basis. Naturally, these approaches require personalization: each wheelchair user has different skills and needs, and it is hard to know a priori from diagnosis how much assistance must be provided. Furthermore, since there is no such thing as an average user, it can be difficult to quantify the benefits of these systems. This paper proposes a new method to extract a prototype user profile from real traces of more than 70 volunteers with different physical and cognitive skills. These traces are clustered to determine the average behavior that can be expected from a wheelchair user in significant situations. Processed traces provide a prototype user model for comparison purposes, plus a simple method to obtain, without supervision, a skill-based navigation profile for any user while he or she is driving. This profile is useful for benchmarking, but also for determining the situations in which a given user might require more assistance after evaluating how well he or she compares to the benchmark. Profile-based shared control was successfully tested by 18 volunteers affected by left or right brain stroke at Fondazione Santa Lucia in Rome, Italy.
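The trace-clustering step described in the abstract can be illustrated with a minimal k-means sketch over per-trace driving features. The feature choice (mean speed, mean obstacle clearance) and the use of plain k-means are assumptions for illustration, not the clustering method of the cited paper.

```python
import numpy as np

def kmeans(X, k=2, iters=50, seed=0):
    """Minimal k-means: returns cluster centers and a label per trace."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]  # init from data points
    for _ in range(iters):
        # Assign each trace to its nearest center.
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        # Move each center to the mean of its assigned traces.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Each row is one driving trace: [mean speed, mean obstacle clearance]
# (hypothetical values). Cluster centers act as prototype profiles, and a
# new user's trace can be scored by its distance to the nearest prototype.
traces = np.array([[0.9, 0.8], [1.0, 0.9], [0.3, 0.2], [0.2, 0.3]])
centers, labels = kmeans(traces, k=2)
```

A driver whose features sit far from the skilled prototype would, in the spirit of the paper, be flagged for more assistance in the corresponding situations.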