1. Poirier S, Côté-Allard U, Routhier F, Campeau-Lecours A. Efficient Self-Attention Model for Speech Recognition-Based Assistive Robots Control. Sensors (Basel) 2023;23:6056. [PMID: 37447906] [DOI: 10.3390/s23136056]
Abstract
Assistive robots are tools that people living with upper body disabilities can leverage to autonomously perform Activities of Daily Living (ADL). Unfortunately, conventional control methods still rely on low-dimensional, easy-to-implement interfaces such as joysticks that tend to be unintuitive and cumbersome to use. In contrast, vocal commands may represent a viable and intuitive alternative. This work represents an important step toward providing a viable vocal interface for people living with upper limb disabilities by proposing a novel lightweight vocal command recognition system. The proposed model leverages the MobileNetV2 architecture, augmenting it with a novel approach to the self-attention mechanism, achieving new state-of-the-art performance for Keyword Spotting (KWS) on the Google Speech Commands Dataset (GSCD). Moreover, this work presents a new dataset, referred to as the French Speech Commands Dataset (FSCD), comprising 4963 vocal command utterances. Using the GSCD as the source domain, we applied Transfer Learning (TL) to adapt the model to this cross-language task; TL significantly improved the model's performance on the FSCD. The viability of the proposed approach is further demonstrated through real-life control of a robotic arm by four healthy participants using both the proposed vocal interface and a joystick.
Affiliation(s)
- Samuel Poirier
- Université Laval, Quebec City, QC G1V 0A6, Canada
- Centre for Interdisciplinary Research in Rehabilitation and Social Integration, CIUSSS de la Capitale-Nationale, Quebec City, QC G1M 2S8, Canada
- François Routhier
- Université Laval, Quebec City, QC G1V 0A6, Canada
- Centre for Interdisciplinary Research in Rehabilitation and Social Integration, CIUSSS de la Capitale-Nationale, Quebec City, QC G1M 2S8, Canada
- Alexandre Campeau-Lecours
- Université Laval, Quebec City, QC G1V 0A6, Canada
- Centre for Interdisciplinary Research in Rehabilitation and Social Integration, CIUSSS de la Capitale-Nationale, Quebec City, QC G1M 2S8, Canada
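To make the architecture concrete, here is a minimal PyTorch sketch of the general idea the abstract describes: a small convolutional front end (standing in for MobileNetV2 features) followed by a single self-attention layer over the time axis of a log-mel spectrogram. All class names, layer sizes and hyperparameters are illustrative assumptions, not the authors' released model.

```python
import torch
import torch.nn as nn

class AttentionKWS(nn.Module):
    """Hypothetical attention-augmented keyword-spotting net (a sketch)."""

    def __init__(self, n_classes: int = 35, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Small convolutional front end standing in for MobileNetV2 features.
        self.frontend = nn.Sequential(
            nn.Conv2d(1, d_model, kernel_size=3, stride=2, padding=1),
            nn.BatchNorm2d(d_model),
            nn.ReLU6(inplace=True),
        )
        # One multi-head self-attention layer applied along the time axis.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, mel: torch.Tensor) -> torch.Tensor:
        # mel: (batch, 1, n_mels, time) log-mel spectrogram.
        x = self.frontend(mel)                 # -> (B, C, F, T)
        x = x.mean(dim=2).transpose(1, 2)      # pool frequency -> (B, T, C)
        x, _ = self.attn(x, x, x)              # self-attention over time
        return self.classifier(x.mean(dim=1))  # temporal average -> logits

logits = AttentionKWS()(torch.randn(2, 1, 40, 98))  # 40 mel bins, ~1 s clip
```

For the cross-language transfer-learning step, the front end and attention weights would typically be reused and only the classifier re-initialized for the FSCD vocabulary.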
2. Rulik I, Sunny MSH, Sanjuan De Caro JD, Zarif MII, Brahmi B, Ahamed SI, Schultz K, Wang I, Leheng T, Longxiang JP, Rahman MH. Control of a Wheelchair-Mounted 6DOF Assistive Robot With Chin and Finger Joysticks. Front Robot AI 2022;9:885610. [PMID: 35937617] [PMCID: PMC9354078] [DOI: 10.3389/frobt.2022.885610]
Abstract
Throughout the last decade, many assistive robots for people with disabilities have been developed; however, these robotic technologies have not yet been fully leveraged to create genuinely independent living conditions for people with disabilities, particularly in relation to activities of daily living (ADLs). An assistive system can help satisfy the demands of regular ADLs for people with disabilities. With an increasing shortage of caregivers and a growing number of individuals with impairments and older adults, assistive robots can help meet future healthcare demands. One of the critical aspects of designing these assistive devices is to improve functional independence while providing an excellent human–machine interface. People with limited upper limb function due to stroke, spinal cord injury, cerebral palsy, amyotrophic lateral sclerosis, and other conditions find the controls of assistive devices such as power wheelchairs difficult to use. Thus, the objective of this research was to design a multimodal control method for robotic self-assistance that could assist individuals with disabilities in performing self-care tasks on a daily basis. In this research, a control framework with two interchangeable operating modes, a finger joystick and a chin joystick, is developed in which either joystick seamlessly controls both a wheelchair and a wheelchair-mounted robotic arm. Custom circuitry was developed to complete the control architecture. A user study was conducted to test the robotic system. Ten healthy individuals performed three tasks with each joystick (chin and finger), for a total of six tasks with 10 repetitions each. The control method was tested rigorously, maneuvering the robot at different velocities and under varying payload (1–3.5 lb) conditions. The absolute position accuracy was experimentally found to be approximately 5 mm. The observed round-trip command delay while controlling the xArm was 4 ms. The tests showed that the proposed control system allowed individuals to perform ADLs such as picking up and placing items, with a completion time of under 1 min per task and a 100% success rate.
Affiliation(s)
- Ivan Rulik
- Department of Computer Sciences, University of Wisconsin-Milwaukee, Milwaukee, WI, United States
- Correspondence: Ivan Rulik
- Md Samiul Haque Sunny
- Department of Computer Sciences, University of Wisconsin-Milwaukee, Milwaukee, WI, United States
- Brahim Brahmi
- Electrical Engineering Department, Collège Ahuntsic, Montreal, QC, Canada
- Katie Schultz
- Assistive Technology Program, Clement J. Zablocki VA Medical Center, Milwaukee, WI, United States
- Inga Wang
- Department of Rehabilitation Sciences & Technology, University of Wisconsin-Milwaukee, Milwaukee, WI, United States
- Tony Leheng
- UFACTORY Technology Co., Ltd., Shenzhen, China
- Mohammad H. Rahman
- Department of Mechanical Engineering, University of Wisconsin-Milwaukee, Milwaukee, WI, United States
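As a rough sketch of the interchangeable-operating-modes idea described above, the snippet below routes a single joystick deflection either to wheelchair base velocities or to arm end-effector velocities, toggled on the rising edge of a mode button. The wheelchair and arm objects, their methods, and all gains are hypothetical placeholders, not the authors' xArm control stack.

```python
from dataclasses import dataclass

@dataclass
class JoystickSample:
    x: float            # lateral deflection in [-1, 1]
    y: float            # forward deflection in [-1, 1]
    mode_pressed: bool  # state of the mode-switch button

class ModeSwitchController:
    MODES = ("wheelchair", "arm")

    def __init__(self, wheelchair, arm, max_lin=0.5, max_ang=1.0):
        self.wheelchair, self.arm = wheelchair, arm
        self.mode = 0
        self.max_lin, self.max_ang = max_lin, max_ang
        self._was_pressed = False

    def update(self, s: JoystickSample) -> None:
        # A rising edge on the mode button toggles the controlled device.
        if s.mode_pressed and not self._was_pressed:
            self.mode = (self.mode + 1) % len(self.MODES)
        self._was_pressed = s.mode_pressed

        if self.MODES[self.mode] == "wheelchair":
            # y -> linear velocity (m/s), x -> angular velocity (rad/s).
            self.wheelchair.drive(s.y * self.max_lin, -s.x * self.max_ang)
        else:
            # The same deflections re-mapped to planar end-effector motion.
            self.arm.set_ee_velocity(vx=s.y * 0.1, vy=-s.x * 0.1)
```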
3. IMU-Based Hand Gesture Interface Implementing a Sequence-Matching Algorithm for the Control of Assistive Technologies. Signals 2021. [DOI: 10.3390/signals2040043]
Abstract
Assistive technologies (ATs) often have a high dimensionality of possible movements (e.g., an assistive robot with several degrees of freedom or a computer), but users have to control them with low-dimensionality sensors and interfaces (e.g., switches). This paper presents the development of an open-source interface based on a sequence-matching algorithm for the control of ATs. Sequence matching allows the user to input several different commands with low-dimensionality sensors by recognizing not only their output, but also their sequential pattern through time, similarly to Morse code. In this paper, the algorithm is applied to the recognition of hand gestures, inputted using an inertial measurement unit worn by the user. An SVM-based algorithm, designed to be robust with small training sets (e.g., five examples per class), is developed to recognize gestures in real time. Finally, the interface is applied to control a computer's mouse and keyboard. The interface was compared against (and combined with) the head movement-based AssystMouse software. The hand gesture interface showed encouraging results for this application but could also be used with other body parts (e.g., head and feet) and could control various ATs (e.g., an assistive robotic arm or a prosthesis).
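A minimal sketch of the Morse-like sequence matching described in this abstract: recognized gestures are buffered, and a pause longer than a gap threshold closes the sequence, which is then looked up in a command table. The gesture labels, patterns and timing are assumptions for illustration, not the published open-source interface.

```python
import time

COMMANDS = {  # hypothetical sequence -> command table
    ("A",): "mouse_left_click",
    ("A", "A"): "mouse_double_click",
    ("A", "B"): "mouse_right_click",
    ("B", "A", "B"): "open_keyboard",
}

class SequenceMatcher:
    """Buffer recognized gestures and match whole sequences, Morse-style."""

    def __init__(self, gap_s: float = 0.8):
        self.gap_s = gap_s  # a pause this long terminates the sequence
        self.buffer = []
        self.last_t = float("-inf")

    def feed(self, gesture: str, t: float | None = None) -> str | None:
        """Add one gesture; return a command if the prior sequence just ended."""
        t = time.monotonic() if t is None else t
        cmd = None
        if self.buffer and (t - self.last_t) > self.gap_s:
            cmd = COMMANDS.get(tuple(self.buffer))  # close the old sequence
            self.buffer = []
        self.buffer.append(gesture)
        self.last_t = t
        return cmd  # a real loop would also flush on an idle timeout
```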
4. Wu L, Alqasemi R, Dubey R. Development of Smartphone-Based Human-Robot Interfaces for Individuals With Disabilities. IEEE Robot Autom Lett 2020. [DOI: 10.1109/lra.2020.3010453]
5. Controlling a robotic arm for functional tasks using a wireless head-joystick: A case study of a child with congenital absence of upper and lower limbs. PLoS One 2020;15:e0226052. [PMID: 32756553] [PMCID: PMC7406178] [DOI: 10.1371/journal.pone.0226052]
Abstract
Children with movement impairments needing assistive devices for activities of daily living often require novel methods for controlling these devices. Body-machine interfaces, which rely on body movements, are particularly well-suited for children as they are non-invasive and have high signal-to-noise ratios. Here, we examined the use of a head-joystick to enable a child with congenital absence of all four limbs to control a seven degree-of-freedom robotic arm. Head movements were measured with a wireless inertial measurement unit and used to control a robotic arm to perform two functional tasks: a drinking task and a block-stacking task. The child practiced these tasks over multiple sessions; a control participant performed the same tasks with a manual joystick. Our results showed that the child was able to successfully perform both tasks, with movement times decreasing by roughly 40-50% over 6-8 sessions of training. The child's performance with the head-joystick was also comparable to that of the control participant using a manual joystick. These results demonstrate the potential of using head movements for the control of high degree-of-freedom tasks in children with a limited movement repertoire.
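The core of such a head-joystick is a mapping from head orientation to robot velocity. Below is a minimal sketch under assumed gains and limits (not the study's software): roll/pitch angles from the IMU are dead-zoned around neutral and scaled proportionally into planar end-effector velocities.

```python
def head_to_velocity(roll_deg: float, pitch_deg: float,
                     dead_zone: float = 5.0, gain: float = 0.004,
                     v_max: float = 0.15) -> tuple[float, float]:
    """Map head tilt (degrees) to (vx, vy) in m/s with a neutral dead zone."""
    def axis(angle: float) -> float:
        if abs(angle) < dead_zone:           # ignore small postural drift
            return 0.0
        v = gain * (abs(angle) - dead_zone)  # proportional beyond dead zone
        return min(v, v_max) * (1.0 if angle > 0 else -1.0)
    return axis(pitch_deg), axis(roll_deg)   # pitch -> forward, roll -> lateral
```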
6. Poirier S, Routhier F, Campeau-Lecours A. Voice Control Interface Prototype for Assistive Robots for People Living with Upper Limb Disabilities. IEEE Int Conf Rehabil Robot 2019;2019:46-52. [PMID: 31374605] [DOI: 10.1109/icorr.2019.8779524]
Abstract
This paper presents a voice control interface prototype for assistive robots, aiming to help people living with upper limb disabilities perform daily activities autonomously. Assistive robotic devices can help people with upper-body disabilities gain more autonomy in their daily lives. However, it is very difficult or even impossible for certain users to control the robot with conventional control systems (e.g., joystick, sip-and-puff). This paper presents the design and preliminary evaluation of a voice command system prototype for controlling the movements of assistive robotic arms. The work aims to make the control of assistive robots more intuitive and fluid, so that various tasks can be performed in less time and with less effort. The prototype of the voice command interface is first presented, followed by two experiments with five able-bodied subjects to assess the system's performance and guide future development.
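Downstream of the recognizer, a voice interface of this kind needs a layer that turns each recognized word into a robot motion primitive. The sketch below shows one plausible dispatch table; the vocabulary, velocities and arm API are assumptions, not the prototype's actual implementation.

```python
# Hypothetical keyword -> Cartesian velocity primitives (m/s).
PRIMITIVES = {
    "forward": (0.05, 0.0, 0.0),
    "back":    (-0.05, 0.0, 0.0),
    "left":    (0.0, 0.05, 0.0),
    "right":   (0.0, -0.05, 0.0),
    "up":      (0.0, 0.0, 0.05),
    "down":    (0.0, 0.0, -0.05),
    "stop":    (0.0, 0.0, 0.0),   # always available as a safety word
}

def on_keyword(word: str, arm) -> None:
    twist = PRIMITIVES.get(word)
    if twist is None:
        return                        # unknown word: ignore, do not move
    arm.set_linear_velocity(*twist)   # command latches until the next word
```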
7. Rudigkeit N, Gebhard M. AMiCUS-A Head Motion-Based Interface for Control of an Assistive Robot. Sensors (Basel) 2019;19:2836. [PMID: 31242706] [PMCID: PMC6630260] [DOI: 10.3390/s19122836]
Abstract
Within this work we present AMiCUS, a human-robot interface that enables tetraplegics to control a multi-degree-of-freedom robot arm in real time using solely head motion, empowering them to perform simple manipulation tasks independently. The article describes the hardware, software and signal processing of AMiCUS and presents the results of a volunteer study with 13 able-bodied subjects and 6 tetraplegics with severe head motion limitations. As part of the study, the subjects performed two different pick-and-place tasks. Usability was assessed with a questionnaire. The overall performance and the main control elements were evaluated with objective measures such as completion rate and interaction time. The results show that the mapping of head motion onto robot motion is intuitive and the given feedback is useful, enabling smooth, precise and efficient robot control and resulting in high user acceptance. Furthermore, it could be demonstrated that the robot did not move unintentionally, giving a positive prognosis for the safety requirements of a certified product prototype. Moreover, AMiCUS enabled every subject to control the robot arm, independent of prior experience and degree of head motion limitation, making the system available to a wide range of motion-impaired users.
Affiliation(s)
- Nina Rudigkeit
- Group of Sensors and Actuators, Department of Electrical Engineering and Applied Physics, Westphalian University of Applied Sciences, 45877 Gelsenkirchen, Germany.
- Marion Gebhard
- Group of Sensors and Actuators, Department of Electrical Engineering and Applied Physics, Westphalian University of Applied Sciences, 45877 Gelsenkirchen, Germany.
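One recurring building block in head-motion interfaces like AMiCUS is a discrete head gesture (e.g., a quick nod) used to switch which robot degrees of freedom the continuous head motion currently drives. The detector below is a speculative sketch of that idea, not the published signal processing: it looks for a down-then-up pitch-rate peak within a short window.

```python
from collections import deque

class NodDetector:
    """Detect a quick nod as paired pitch-rate peaks (illustrative sketch)."""

    def __init__(self, thresh_dps: float = 60.0, window: int = 20):
        self.thresh = thresh_dps          # pitch-rate peak threshold (deg/s)
        self.hist = deque(maxlen=window)  # ~0.4 s of samples at 50 Hz

    def update(self, pitch_rate_dps: float) -> bool:
        self.hist.append(pitch_rate_dps)
        if min(self.hist) < -self.thresh and max(self.hist) > self.thresh:
            self.hist.clear()             # debounce: one event per nod
            return True
        return False

# In a control loop, a detected nod could cycle the active DOF group:
# if detector.update(rate): active = (active + 1) % len(dof_groups)
```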
8. Campeau-Lecours A, Cote-Allard U, Vu DS, Routhier F, Gosselin B, Gosselin C. Intuitive Adaptive Orientation Control for Enhanced Human–Robot Interaction. IEEE Trans Robot 2019. [DOI: 10.1109/tro.2018.2885464]
9. Fall CL, Quevillon F, Blouin M, Latour S, Campeau-Lecours A, Gosselin C, Gosselin B. A Multimodal Adaptive Wireless Control Interface for People With Upper-Body Disabilities. IEEE Trans Biomed Circuits Syst 2018;12:564-575. [PMID: 29877820] [DOI: 10.1109/tbcas.2018.2810256]
Abstract
This paper describes a multimodal body-machine interface (BoMI) to help individuals with upper-limb disabilities use advanced assistive technologies, such as robotic arms. The proposed system uses a wearable and wireless body sensor network (WBSN) supporting up to six sensor nodes to measure the natural upper-body gestures of the users and translate them into control commands. Natural gestures of the head and upper-body parts, as well as muscular activity, are measured with inertial measurement units (IMUs) and surface electromyography (sEMG) using custom-designed multimodal wireless sensor nodes. An IMU sensing node is attached to a headset worn by the user. It has a size of 2.9 cm × 2.9 cm, a maximum power consumption of 31 mW, and provides an angular precision of 1°. Multimodal patch sensor nodes, including both IMU and sEMG sensing modalities, are placed over the user's able body parts to measure motion and muscular activity. These nodes have a size of 2.5 cm × 4.0 cm and a maximum power consumption of 11 mW. The proposed BoMI runs on a Raspberry Pi. It can adapt to several types of users through different control scenarios using head and shoulder motion as well as muscular activity, and provides a power autonomy of up to 24 h. JACO, a 6-DoF assistive robotic arm, is used as a testbed to evaluate the performance of the proposed BoMI. Ten able-bodied subjects performed ADLs while operating the AT device, using the Test d'Évaluation des Membres Supérieurs de Personnes Âgées to evaluate and compare the proposed BoMI with the conventional joystick controller. It is shown that the users can perform all tasks with the proposed BoMI almost as fast as with the joystick controller, with only 30% time overhead on average, while being potentially more accessible to upper-body disabled users who cannot use the conventional joystick controller. Tests show that control performance with the proposed BoMI improved by up to 17% on average after three trials.
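A plausible reading of the multimodal design is that the sEMG channel can gate the IMU channel, so incidental head motion alone never drives the arm. The sketch below illustrates that clutch pattern; the envelope threshold, gains and function signatures are assumptions, not the WBSN firmware.

```python
def emg_envelope(samples: list[float]) -> float:
    """Mean absolute value, a common sEMG amplitude estimate."""
    return sum(abs(s) for s in samples) / len(samples)

def control_step(imu_roll_deg: float, imu_pitch_deg: float,
                 emg_window: list[float], emg_on: float = 0.12):
    """Return (vx, vy): zero unless the muscle 'clutch' is engaged."""
    if emg_envelope(emg_window) < emg_on:
        return (0.0, 0.0)        # clutch open: robot holds position
    # Clutch closed: head tilt drives planar velocity (illustrative gains).
    return (0.002 * imu_pitch_deg, 0.002 * imu_roll_deg)
```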
10. Jackowski A, Gebhard M, Thietje R. Head Motion and Head Gesture-Based Robot Control: A Usability Study. IEEE Trans Neural Syst Rehabil Eng 2018;26:161-170. [PMID: 29324407] [DOI: 10.1109/tnsre.2017.2765362]
Abstract
The assistive robot system adaptive head motion control for user-friendly support (AMiCUS) has been developed to increase the autonomy of motion-impaired people. The six-degrees-of-freedom robot arm with gripper is controlled with head motion and head gestures only, so tetraplegics in particular benefit from collaboration with AMiCUS. In this paper, a usability study with a total of 30 subjects was conducted to validate the AMiCUS interaction technology and design. 24 able-bodied subjects from demographically diverse groups and 6 tetraplegics participated in the study. All subjects performed different pick-and-place tasks by controlling AMiCUS. The evaluation of the interaction design was carried out subjectively with a questionnaire, as well as objectively by measuring time, completion rate, and the number of trials needed for correct head gesture performance. The influence of several factors such as age, sex, motion impairment, and previous experience on head motion-based human-robot interaction was analyzed. The interaction design proved successful in a laboratory environment and was assessed positively overall by the subjects. The results confirm the usability of the assistive robot AMiCUS. AMiCUS has the potential to benefit tetraplegics by improving their independence in activities of daily living and adapted workplaces.
11. Struijk LNSA, Lontis R. Comparison of tongue interface with keyboard for control of an assistive robotic arm. IEEE Int Conf Rehabil Robot 2017;2017:925-928. [PMID: 28813939] [DOI: 10.1109/icorr.2017.8009367]
Abstract
This paper demonstrates how an assistive 6-DoF robotic arm with a gripper can be controlled manually using a tongue interface. The proposed method suggests that it is possible for a user to manipulate the surroundings with his or her tongue using the inductive tongue control system as deployed in this study. The sensors of an inductive tongue-computer interface were mapped to the Cartesian control of an assistive robotic arm. The resulting control system was tested manually in order to compare control of the robot using a standard keyboard with control using the tongue interface. Two healthy subjects controlled the robotic arm to precisely move a bottle of water from one location to another. The results show that the tongue interface was able to fully control the robotic arm in a similar manner to the standard keyboard, with the same number of successful manipulations and an average increase in task duration of up to 30% compared with the standard keyboard.
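The mapping described, from tongue-interface sensors to Cartesian robot control, can be pictured as binding each sensor coil to one signed axis velocity, much as the keyboard condition binds keys. The sketch below shows that binding; the layout is hypothetical, not the actual inductive pad geometry.

```python
SENSOR_TO_AXIS = {  # hypothetical sensor id -> (axis, sign)
    0: ("x", +1), 1: ("x", -1),
    2: ("y", +1), 3: ("y", -1),
    4: ("z", +1), 5: ("z", -1),
    6: ("grip", +1), 7: ("grip", -1),
}

def sensors_to_command(active: set[int], speed: float = 0.05) -> dict[str, float]:
    """Turn the set of activated sensors into a constant-speed jog command."""
    cmd = {"x": 0.0, "y": 0.0, "z": 0.0, "grip": 0.0}
    for sensor_id in active:
        axis, sign = SENSOR_TO_AXIS.get(sensor_id, (None, 0))
        if axis is not None:
            cmd[axis] += sign * speed
    return cmd
```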
12. Vu DS, Allard UC, Gosselin C, Routhier F, Gosselin B, Campeau-Lecours A. Intuitive adaptive orientation control of assistive robots for people living with upper limb disabilities. IEEE Int Conf Rehabil Robot 2017;2017:795-800. [PMID: 28813917] [DOI: 10.1109/icorr.2017.8009345]
Abstract
Robotic assistive devices enhance the autonomy of individuals living with physical disabilities in their day-to-day lives. Although the first priority for such devices is safety, they must also be intuitive and efficient from an engineering point of view in order to be adopted by a broad range of users. This is especially true for assistive robotic arms, as they are used for the complex control tasks of daily living. One challenge in the control of such assistive robots is the management of the end-effector orientation, which is not always intuitive for the human operator, especially for neophytes. This paper presents a novel orientation control algorithm designed for robotic arms in the context of human-robot interaction. The work aims to make control of the robot's orientation easier and more intuitive for the user, in particular for individuals living with upper limb disabilities. The performance and intuitiveness of the proposed orientation control algorithm are assessed through two experiments with 25 able-bodied subjects and shown to improve significantly on both aspects.