1. Cela A, Oña E, Jardón A. Serious Gaming for Upper Limbs Rehabilitation-Game Controllers Features: A Scoping Review. Games Health J 2025. PMID: 40398964. DOI: 10.1089/g4h.2024.0122.
Abstract
The use of exergames in the rehabilitation of patients with upper limb dysfunctions has increased significantly. This scoping review aimed to investigate the game controllers (GCs) employed in exergame systems used for rehabilitation, offering insights into the platforms, sensors, and techniques used in their development, implementation, and utilization. We conducted a comprehensive search of the Scopus and PubMed databases, encompassing articles published between February 2013 and February 2023. The eligibility criteria included studies on upper limb (UL) rehabilitation using exergames published in English-language journals, resulting in the identification of 175 pertinent articles. Seven key categories were identified: pathology, participants' conditions, dosage of sessions, GCs, sensors, the specific part of the UL rehabilitated, and ergonomics. Stroke (55.4%) and cerebral palsy (6.3%) were the most frequently addressed medical conditions in exergame-based rehabilitation. The number of participants in the reviewed articles ranged from one to several hundred, and three types of participants were identified: patients, specialists, and volunteers. Randomized controlled trial (RCT) studies consistently featured a controlled number of sessions (ranging from 6 to 40) lasting an average of 20 minutes, while non-RCT studies displayed more variability. Commercial platforms were favored, accounting for 74.3% of GCs, with physical controllers (57.1%) outnumbering virtual ones. Cameras were the predominant sensors (50.3%), although a wide array of sensor types, including IMUs, push buttons, and force sensors, were also used. Rehabilitation targeted the general UL in 68% of studies, the hands in 20.6%, the elbows in 4%, and the arms and shoulders in 3.4%. Notably, only 26.3% of the studies considered ergonomics in the rehabilitation system. Although exergame systems are advancing rehabilitation treatments, further development and research on aspects such as ergonomics, controller design, and sensor integration are needed to enhance their suitability for patient use.
Affiliation(s)
- Andrés Cela: Automation and Industrial Control Department, National Polytechnic School, Quito, Ecuador
- Edwin Oña: Systems and Automatics Department, Universidad Carlos III de Madrid, Madrid, Spain
- Alberto Jardón: Systems and Automatics Department, Universidad Carlos III de Madrid, Madrid, Spain
2. Zare S, Beaber SI, Sun Y. NeuroFlex: Feasibility of EEG-Based Motor Imagery Control of a Soft Glove for Hand Rehabilitation. Sensors (Basel) 2025; 25:610. PMID: 39943246. PMCID: PMC11820135. DOI: 10.3390/s25030610.
Abstract
Motor impairments resulting from neurological disorders, such as strokes or spinal cord injuries, often impair hand and finger mobility, restricting a person's ability to grasp and perform fine motor tasks. Brain plasticity refers to the inherent capability of the central nervous system to functionally and structurally reorganize itself in response to stimulation, which underpins rehabilitation from brain injuries or strokes. Linking voluntary cortical activity with corresponding motor execution has been identified as effective in promoting adaptive plasticity. This study introduces NeuroFlex, a motion-intent-controlled soft robotic glove for hand rehabilitation. NeuroFlex utilizes a transformer-based deep learning (DL) architecture to decode motion intent from motor imagery (MI) EEG data and translate it into control inputs for the assistive glove. The glove's soft, lightweight, and flexible design enables users to perform rehabilitation exercises involving fist formation and grasping movements, aligning with natural hand functions for fine motor practice. The results show that the accuracy of decoding the intent to make a fist from MI EEG can reach up to 85.3%, with an average AUC of 0.88. NeuroFlex demonstrates the feasibility of detecting and assisting a patient's attempted movements through motor imagery alone, using a non-intrusive brain-computer interface (BCI). This EEG-based soft glove aims to enhance the effectiveness and user experience of rehabilitation protocols, providing the possibility of extending therapeutic opportunities beyond clinical settings.
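For readers curious what a transformer-based MI-EEG intent decoder of the kind described above might look like, the following is a minimal, illustrative PyTorch sketch. The channel count, window length, layer sizes, and the binary fist-versus-rest target are assumptions chosen for illustration, not the NeuroFlex architecture itself.

```python
# Illustrative sketch (assumed sizes, not the NeuroFlex model): a small transformer
# encoder that classifies a windowed MI-EEG epoch as "fist" vs. "rest".
import torch
import torch.nn as nn

class MIEEGTransformer(nn.Module):
    def __init__(self, n_channels=32, d_model=64, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)          # one token per time sample
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)            # intent logits

    def forward(self, x):                                     # x: (batch, time, channels)
        tokens = self.encoder(self.embed(x))
        return self.head(tokens.mean(dim=1))                  # average-pool over time

# Example: a 2 s epoch at an assumed 128 Hz sampling rate from 32 electrodes.
model = MIEEGTransformer()
logits = model(torch.randn(1, 256, 32))
is_fist = bool(logits.argmax(dim=1))                          # could trigger glove flexion
```

In a real pipeline the per-window decisions would presumably be smoothed over consecutive epochs before being sent as a flexion command to the glove.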
Affiliation(s)
- Soroush Zare: Department of Mechanical and Aerospace Engineering, University of Virginia, Charlottesville, VA 22903, USA
- Sameh I. Beaber: Department of Mechanical and Aerospace Engineering, University of Virginia, Charlottesville, VA 22903, USA
- Ye Sun: Department of Mechanical and Aerospace Engineering, University of Virginia, Charlottesville, VA 22903, USA; Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA 22903, USA
3. Hoyle AC, Stevenson R, Leonhardt M, Gillett T, Martinez-Hernandez U, Gompertz N, Clarke C, Cazzola D, Metcalfe BW. Exploring the 'EarSwitch' concept: a novel ear based control method for assistive technology. J Neuroeng Rehabil 2024; 21:210. PMID: 39623474. PMCID: PMC11613744. DOI: 10.1186/s12984-024-01500-z.
Abstract
BACKGROUND Loss of communication with loved ones and carers is one of the most isolating and debilitating effects of many neurological disorders. Assistive technology (AT) supports individuals with communication, but the acceptability of AT solutions is highly variable. In this paper, a novel ear-based control method for AT, the 'EarSwitch' concept, is presented. This new approach is based on detecting ear rumbling, the voluntary contraction of the tensor tympani muscle (TTM), which results in observable movement of the eardrum and a dull rumbling sound. 'EarSwitch' has the potential to be a discreet method that complements existing AT control methods. However, only a subset of the population can ear rumble, and little is known about the ability to rumble in populations with neurological disorders. METHODS To explore the viability of the 'EarSwitch' concept as an AT control method, we conducted in-depth online surveys with respondents from the general population (N=1853) and respondents with self-declared neurological disorders (N=170), including Motor Neurone Disease (MND) and Multiple Sclerosis (MS). This is the largest study to date to explore ear rumbling and the first to explore whether rumbling is preserved among individuals with neurological disorders. In addition, we validated rumbling, and investigated the usability of the 'EarSwitch' concept as a control input, using in-person otoscopic examination with a subset of participants. RESULTS A significant proportion of the population with neurological disorders could benefit from 'EarSwitch'-controllable AT. The upper-bound prevalence of the ability to rumble without accompanying movements was 55% in the general population and 38% in the neurological population, and 20% of participants with MND (N=95) reported this ability. During the validation procedure, participants achieved high accuracy in self-reporting the ability to rumble (80%) and demonstrated proof of concept for using the 'EarSwitch' method to control a basic interface. DISCUSSION 'EarSwitch' is a potential new AT control method, either by itself or as a supplement to other existing methods. The results demonstrate that self-reported ear rumbling is present among patients with different neurological disorders, including MND. Further research should explore how well the ability to rumble is preserved in different types and stages of neurological disorders.
Affiliation(s)
- Anna C Hoyle: Department of Electronic and Electrical Engineering, University of Bath, Bath, UK; Bath Institute for the Augmented Human (IAH), University of Bath, Bath, UK
- Martin Leonhardt: Department of Electronic and Electrical Engineering, University of Bath, Bath, UK
- Thomas Gillett: School of Engineering and Physical Science, Heriot-Watt University, Edinburgh, UK
- Uriel Martinez-Hernandez: Department of Electronic and Electrical Engineering, University of Bath, Bath, UK; Bath Institute for the Augmented Human (IAH), University of Bath, Bath, UK
- Christopher Clarke: Department of Computer Science, University of Bath, Bath, UK; Centre for Analysis of Motion and Entertainment Research and Application (CAMERA), University of Bath, Bath, UK; Bath Institute for the Augmented Human (IAH), University of Bath, Bath, UK
- Dario Cazzola: Department for Health, University of Bath, Bath, UK; Centre for Analysis of Motion and Entertainment Research and Application (CAMERA), University of Bath, Bath, UK; Bath Institute for the Augmented Human (IAH), University of Bath, Bath, UK
- Benjamin W Metcalfe: Department of Electronic and Electrical Engineering, University of Bath, Bath, UK; Bath Institute for the Augmented Human (IAH), University of Bath, Bath, UK
4. Losanno E, Ceradini M, Agnesi F, Righi G, Del Popolo G, Shokur S, Micera S. A Virtual Reality-Based Protocol to Determine the Preferred Control Strategy for Hand Neuroprostheses in People With Paralysis. IEEE Trans Neural Syst Rehabil Eng 2024; 32:2261-2269. PMID: 38865234. DOI: 10.1109/tnsre.2024.3413192.
Abstract
Hand neuroprostheses restore voluntary movement in people with paralysis through neuromodulation protocols. A variety of strategies exist to control hand neuroprostheses, based on either residual body movements or brain activity. There is no universally superior solution; rather, the best approach may vary from patient to patient. Here, we propose a protocol based on an immersive virtual reality (VR) environment that simulates the use of a hand neuroprosthesis, allowing patients to experience and familiarize themselves with various control schemes in clinically relevant tasks and choose the preferred one. We used our VR environment to compare two alternative control strategies over five days of training in four patients with C6 spinal cord injury: (a) control via the ipsilateral wrist and (b) control via the contralateral shoulder. We did not find a one-size-fits-all solution but rather a subject-specific preference that could not be predicted from a general clinical assessment alone. The main results were that the VR simulation allowed participants to experience the pros and cons of the proposed strategies and make an educated choice, and that performance improved longitudinally over the training sessions. This shows that our VR-based protocol is a useful tool for personalizing and training the control strategy of hand neuroprostheses, which could help promote user comfort and thus acceptance.
5. Mohammadi M, Cardoso ASS, Andreasen Struijk LNS. Using workspace restrictiveness for adaptive velocity adjustment of assistive robots and upper limb exoskeletons. Annu Int Conf IEEE Eng Med Biol Soc 2023; 2023:1-4. PMID: 38082906. DOI: 10.1109/embc40787.2023.10341183.
Abstract
Individuals with severe disabilities can benefit from assistive robotic systems (ARS) for performing activities of daily living. However, few control interfaces are available for individuals who cannot use their hands for control, and most of these interfaces require high effort to perform simple tasks. Therefore, autonomous and intelligent control strategies have been proposed to assist with control in complex tasks. In this paper, we present an autonomous and adaptive method for adjusting an assistive robot's velocity in different regions of its workspace, reducing the robot's velocity where fine control is required. Two participants controlled a JACO assistive robot to grasp and lift a bottle with and without the velocity adjustment method. The task was performed 9.1% faster with velocity adjustment. Furthermore, analysis of the robot trajectory showed that the method recognized highly restrictive regions and reduced the robot's end-effector velocity accordingly. Clinical relevance: The autonomous velocity adjustment method can ease the control of ARSs and improve their usability, leading to a higher quality of life for individuals with severe disabilities who can benefit from ARSs.
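The core idea, scaling down the commanded velocity in restrictive regions of the workspace, can be illustrated with a short sketch. The distance-based restrictiveness measure and linear scaling law below are assumptions for illustration, not the method implemented in the paper.

```python
# Illustrative sketch (assumed restrictiveness measure and scaling law): slow the
# end-effector down as it approaches restrictive regions of the workspace.
import numpy as np

def restrictiveness(position, obstacles, radius=0.15):
    """Crude restrictiveness in [0, 1]: 0 in free space, 1 when touching an obstacle."""
    d_min = min(np.linalg.norm(position - o) for o in obstacles)
    return float(np.clip(1.0 - d_min / radius, 0.0, 1.0))

def adjust_velocity(v_cmd, position, obstacles, min_scale=0.2):
    """Scale the commanded velocity from 1.0 (free space) down to min_scale (restrictive)."""
    r = restrictiveness(position, obstacles)
    return (1.0 - (1.0 - min_scale) * r) * np.asarray(v_cmd)

# Example: the same command is attenuated when the end-effector is near a bottle.
bottle = [np.array([0.40, 0.10, 0.20])]
v_far = adjust_velocity([0.05, 0.0, 0.0], np.array([0.10, 0.10, 0.20]), bottle)   # unchanged
v_near = adjust_velocity([0.05, 0.0, 0.0], np.array([0.38, 0.10, 0.20]), bottle)  # reduced
```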
6. Pinheiro DJLL, Faber J, Micera S, Shokur S. Human-machine interface for two-dimensional steering control with the auricular muscles. Front Neurorobot 2023; 17:1154427. PMID: 37342389. PMCID: PMC10277645. DOI: 10.3389/fnbot.2023.1154427.
Abstract
Human-machine interfaces (HMIs) can be used to decode a user's motor intention to control an external device. People with motor disabilities, such as spinal cord injury, can benefit from the use of these interfaces. While many solutions exist in this direction, there is still room for improvement from the decoding, hardware, and subject motor-learning perspectives. Here we show, in a series of experiments with non-disabled participants, a novel decoding and training paradigm allowing naïve participants to use their auricular muscles (AM) to control two degrees of freedom with a virtual cursor. AMs are particularly interesting because they are vestigial muscles and are often preserved after neurological diseases. Our method relies on surface electromyographic recordings and uses the contraction levels of both AMs to modulate the velocity and direction of a cursor in a two-dimensional paradigm. We used a locking mechanism to fix the current position of each axis separately, enabling the user to stop the cursor at a given location. A five-session training procedure (20-30 min per session) with a 2D center-out task was performed by five volunteers. All participants increased their success rate (initial: 52.78 ± 5.56%; final: 72.22 ± 6.67%; median ± median absolute deviation) and their trajectory performance throughout the training. We implemented a dual task with visual distractors to assess the mental challenge of controlling the cursor while executing another task; our results suggest that the participants could perform the task in cognitively demanding conditions (success rate of 66.67 ± 5.56%). Finally, using the NASA Task Load Index questionnaire, we found that participants reported lower mental demand and effort in the last two sessions. In summary, all subjects learned to control the movement of a cursor with two degrees of freedom using their AMs, with a low impact on cognitive load. Our study is a first step in developing AM-based decoders for HMIs for people with motor disabilities, such as spinal cord injury.
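As a rough illustration of the control scheme described above, the sketch below maps two normalized auricular-muscle EMG envelopes to cursor velocity with a per-axis lock. The axis assignment, gain, and lock triggering are assumptions; the paper's exact decoding is not reproduced here.

```python
# Illustrative sketch (assumed axis mapping, gain, and lock handling): two normalized
# auricular-muscle EMG envelopes drive a 2D cursor, with each axis lockable separately.
import numpy as np

class AMCursor:
    def __init__(self, gain=2.0, dt=0.02):
        self.pos = np.zeros(2)            # cursor (x, y)
        self.locked = [False, False]      # per-axis locks freeze that coordinate
        self.gain, self.dt = gain, dt

    def step(self, left_env, right_env):
        """left_env / right_env: EMG envelopes in [0, 1] for the left / right AM."""
        vel = self.gain * np.array([left_env, right_env])   # assumed: left -> x, right -> y
        for axis in range(2):
            if not self.locked[axis]:
                self.pos[axis] += vel[axis] * self.dt
        return self.pos.copy()

    def toggle_lock(self, axis):
        self.locked[axis] = not self.locked[axis]

cursor = AMCursor()
for _ in range(50):                        # one second of simulated control at 50 Hz
    cursor.step(left_env=0.6, right_env=0.1)
cursor.toggle_lock(0)                      # freeze the x coordinate at its current value
```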
Affiliation(s)
- Daniel J. L. L. Pinheiro: Division of Neuroscience, Department of Neurology and Neurosurgery, Neuroengineering and Neurocognition Laboratory, Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil; Translational Neural Engineering Lab, Institute Neuro X, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Jean Faber: Division of Neuroscience, Department of Neurology and Neurosurgery, Neuroengineering and Neurocognition Laboratory, Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil; Neuroengineering Laboratory, Division of Biomedical Engineering, Instituto de Ciência e Tecnologia, Universidade Federal de São Paulo, São José dos Campos, Brazil
- Silvestro Micera: Translational Neural Engineering Lab, Institute Neuro X, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland; Department of Excellence in Robotics and AI, Institute of BioRobotics Interdisciplinary Health Center, Scuola Superiore Sant'Anna, Pisa, Italy
- Solaiman Shokur: Translational Neural Engineering Lab, Institute Neuro X, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
7. Thøgersen MB, Mohammadi M, Gull MA, Bengtson SH, Kobbelgaard FV, Bentsen B, Khan BYA, Severinsen KE, Bai S, Bak T, Moeslund TB, Kanstrup AM, Andreasen Struijk LNS. User Based Development and Test of the EXOTIC Exoskeleton: Empowering Individuals with Tetraplegia Using a Compact, Versatile, 5-DoF Upper Limb Exoskeleton Controlled through Intelligent Semi-Automated Shared Tongue Control. Sensors (Basel) 2022; 22:6919. PMID: 36146260. PMCID: PMC9502221. DOI: 10.3390/s22186919.
Abstract
This paper presents the EXOTIC, a novel assistive upper limb exoskeleton for individuals with complete functional tetraplegia that provides an unprecedented level of versatility and control. The current literature on exoskeletons mainly focuses on the basic technical aspects of exoskeleton design and control, while the context in which these exoskeletons should function is given little or no priority, even though it poses important technical requirements. We considered all sources of design requirements, from the basic technical functions to the real-world practical application. The EXOTIC features: (1) a compact, safe, wheelchair-mountable, easy-to-don-and-doff exoskeleton capable of facilitating multiple highly desired activities of daily living for individuals with tetraplegia; (2) a semi-automated computer vision guidance system that can be enabled by the user when relevant; and (3) a tongue control interface allowing full, volitional, and continuous control over all possible motions of the exoskeleton. The EXOTIC was tested on ten able-bodied individuals and three users with tetraplegia caused by spinal cord injury. During the tests, the EXOTIC succeeded in fully assisting tasks such as drinking and picking up snacks, even for users with complete functional tetraplegia who required a ventilator. The users confirmed the usability of the EXOTIC.
Affiliation(s)
- Mikkel Berg Thøgersen: Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Mostafa Mohammadi: Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Muhammad Ahsan Gull: Department of Materials and Production Technology, Aalborg University, 9220 Aalborg, Denmark
- Stefan Hein Bengtson: Visual Analysis and Perception (VAP) Lab, Department of Architecture, Design, and Media Technology, Aalborg University, 9000 Aalborg, Denmark
- Bo Bentsen: Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Benjamin Yamin Ali Khan: Spinal Cord Injury Centre of Western Denmark, Viborg Regional Hospital, 8800 Viborg, Denmark
- Kåre Eg Severinsen: Spinal Cord Injury Centre of Western Denmark, Viborg Regional Hospital, 8800 Viborg, Denmark
- Shaoping Bai: Department of Materials and Production Technology, Aalborg University, 9220 Aalborg, Denmark
- Thomas Bak: Department of Electronic Systems, Aalborg University, 9220 Aalborg, Denmark
- Thomas Baltzer Moeslund: Visual Analysis and Perception (VAP) Lab, Department of Architecture, Design, and Media Technology, Aalborg University, 9000 Aalborg, Denmark
- Lotte N. S. Andreasen Struijk: Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
8. Kirtas O, Veltink P, Lontis R, Mohammadi M, Andreasen Struijk LNS. Development of inductive sensors for a robotic interface based on noninvasive tongue control. IEEE Int Conf Rehabil Robot 2022; 2022:1-6. PMID: 36176082. DOI: 10.1109/icorr55369.2022.9896548.
Abstract
Tongue-based robotic interfaces have shown the potential to control assistive robotic devices developed for individuals with severe disabilities due to spinal cord injury. However, current tongue-robotic interfaces require invasive methods, such as piercing, to attach an activation unit (AU) to the tongue. A noninvasive tongue interface concept, which uses a frame-integrated AU instead of a tongue-attached AU, was previously proposed. However, compact one-piece sensor printed circuit boards (PCBs) need to be developed to enable activation of all inductive sensors. In this study, we developed and tested four designs of compact one-piece sensor PCBs incorporating inductive sensors for the design of a noninvasive tongue-robotic interface. We measured the electrical parameters of the developed sensors to detect activation and compared them with a sensor from the current version of the inductive tongue-computer interface (ITCI) by moving AUs with different contact surfaces across the surface of the sensors. The results showed that the newly developed inductive sensors had higher and wider activation than the ITCI sensor, and that the AU with a flat contact surface produced 3.5-4 times higher activation than the AU with a spherical contact surface. Higher sensor activation can result in a higher signal-to-noise ratio and thus a higher AU tracking resolution.
9. Cardoso ASS, Andreasen Struijk LNS, Kaeseler RL, Jochumsen M. Comparing the Usability of Alternative EEG Devices to Traditional Electrode Caps for SSVEP-BCI Controlled Assistive Robots. IEEE Int Conf Rehabil Robot 2022; 2022:1-6. PMID: 36176154. DOI: 10.1109/icorr55369.2022.9896588.
Abstract
Despite having the potential to improve the lives of severely paralyzed users, non-invasive Brain-Computer Interfaces (BCIs) have yet to be integrated into their daily lives. The widespread adoption of BCI-driven assistive technology is hindered by its lack of usability, as end-users and researchers alike find fault with traditional EEG caps. In this paper, we compare the usability of four EEG recording devices for Steady-State Visually Evoked Potential (SSVEP)-BCI applications: an EEG cap (active gel electrodes), two headbands (passive gel or active dry electrodes), and two adhesive electrodes, one placed on each mastoid. Ten able-bodied participants tested each device by completing an 8-target SSVEP paradigm. Setup times were recorded, and participants rated their satisfaction with each device. The EEG cap obtained the best classification accuracies (median = 98.96%), followed by the gel electrode headband (median = 93.75%) and the dry electrode headband (median = 91.14%). The mastoid electrodes obtained classification accuracies close to chance level (median = 29.69%). Unaware of the classification accuracies, participants found the mastoid electrodes to be the most comfortable and discreet. The dry electrode headband obtained the lowest user satisfaction score and was criticized for being too uncomfortable. Participants also noted that the EEG cap was too conspicuous. The gel-based headband provided a good trade-off between BCI performance and user satisfaction.
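The paper compares recording devices rather than classifiers, but for context, a common baseline for classifying an 8-target SSVEP paradigm is canonical correlation analysis (CCA) against sinusoidal reference signals. The sketch below is an illustrative baseline under assumed stimulus frequencies and sampling rate, not the authors' pipeline.

```python
# Illustrative CCA baseline for SSVEP target classification (assumed frequencies and
# sampling rate; not the pipeline used in the paper above).
import numpy as np
from sklearn.cross_decomposition import CCA

def references(freq, fs, n_samples, n_harmonics=2):
    """Sine/cosine reference signals at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    cols = []
    for h in range(1, n_harmonics + 1):
        cols += [np.sin(2 * np.pi * h * freq * t), np.cos(2 * np.pi * h * freq * t)]
    return np.stack(cols, axis=1)                    # (n_samples, 2 * n_harmonics)

def classify_ssvep(eeg, stim_freqs, fs):
    """eeg: (n_samples, n_channels). Returns the index of the best-matching stimulus."""
    scores = []
    for f in stim_freqs:
        refs = references(f, fs, eeg.shape[0])
        cca = CCA(n_components=1)
        cca.fit(eeg, refs)
        u, v = cca.transform(eeg, refs)
        scores.append(abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1]))
    return int(np.argmax(scores))

# Example: an 8-target paradigm at assumed frequencies, 2 s of 8-channel EEG at 250 Hz.
stim_freqs = [8, 9, 10, 11, 12, 13, 14, 15]
target = classify_ssvep(np.random.randn(500, 8), stim_freqs, fs=250)
```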
10. Computer Vision-Based Adaptive Semi-Autonomous Control of an Upper Limb Exoskeleton for Individuals with Tetraplegia. Appl Sci (Basel) 2022. DOI: 10.3390/app12094374.
Abstract
We propose the use of computer vision for adaptive semi-autonomous control of an upper limb exoskeleton to assist users with severe tetraplegia and increase their independence and quality of life. A tongue-based interface was used together with the semi-autonomous control so that individuals with complete tetraplegia could use the system despite being paralyzed from the neck down. The semi-autonomous control uses computer vision to detect nearby objects and estimate how to grasp them, assisting the user in controlling the exoskeleton. Three control schemes were tested: non-autonomous control (i.e., manual control using the tongue), semi-autonomous control with a fixed level of autonomy, and semi-autonomous control with a confidence-based adaptive level of autonomy. Studies were carried out with experimental participants with and without tetraplegia. The control schemes were evaluated both in terms of performance, such as the time and number of commands needed to complete a given task, and in terms of ratings from the users. The studies showed a clear and significant improvement in both performance and user ratings when using either of the semi-autonomous control schemes. The adaptive semi-autonomous control outperformed the fixed version in some scenarios, namely in the more complex tasks and for users with more training in using the system.
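One plausible reading of a confidence-based adaptive level of autonomy is a blending law in which the share of motion taken from the vision system grows with its detection confidence, while the remainder follows the user's manual (tongue) command. The sketch below illustrates that idea; the blending law, confidence source, and parameters are assumptions, not the controller evaluated in the paper.

```python
# Illustrative sketch (assumed blending law, not the evaluated controller): the autonomy
# level grows with the vision system's detection confidence; the rest of the command
# follows the user's manual input.
import numpy as np

def blend_commands(user_cmd, auto_cmd, confidence, max_autonomy=0.9):
    """user_cmd, auto_cmd: desired end-effector velocities; confidence in [0, 1]."""
    alpha = max_autonomy * float(np.clip(confidence, 0.0, 1.0))   # adaptive autonomy level
    return (1.0 - alpha) * np.asarray(user_cmd) + alpha * np.asarray(auto_cmd)

# Example: a confidently detected grasp target dominates the blended command.
cmd = blend_commands(user_cmd=[0.02, 0.0, 0.0],
                     auto_cmd=[0.04, 0.01, -0.02],
                     confidence=0.85)
```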