1. Liu R, Song Q, Ma T, Pan H, Li H, Zhao X. SoftBoMI: a non-invasive wearable body-machine interface for mapping movement of shoulder to commands. J Neural Eng 2024;21:066007. [PMID: 39454612] [DOI: 10.1088/1741-2552/ad8b6e]
Abstract
Objective. Customized human-machine interfaces for controlling assistive devices are vital in improving the self-help ability of upper limb amputees and tetraplegic patients. Given that most of them possess residual shoulder mobility, using it to generate commands to operate assistive devices can serve as a complementary approach to brain-computer interfaces. Approach. We propose a hybrid body-machine interface prototype that integrates soft sensors and an inertial measurement unit. This study introduces both a rule-based data decoding method and a user intent inference-based decoding method to map human shoulder movements into continuous commands. Additionally, by incorporating prior knowledge of the user's operational performance into a shared autonomy framework, we implement an adaptive switching command mapping approach. This approach enables seamless transitions between the two decoding methods, enhancing their adaptability across different tasks. Main results. The proposed method has been validated on individuals with cervical spinal cord injury, individuals with bilateral arm amputation, and healthy subjects through a series of center-out target reaching tasks and a virtual powered wheelchair driving task. The experimental results show that using both the soft sensors and the gyroscope yields the most well-rounded intent-inference performance. Additionally, the rule-based method demonstrates better dynamic performance for wheelchair operation, while the intent inference method is more accurate but has higher latency. The adaptive switching approach offers the best adaptability by seamlessly transitioning between decoding methods for different tasks. Furthermore, we discuss the differences and characteristics among the various types of participants in the experiment. Significance. The proposed method has the potential to be integrated into clothing, enabling non-invasive interaction with assistive devices in daily life, and could serve as a tool for rehabilitation assessment in the future.
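To make the idea of rule-based decoding concrete, here is a minimal, entirely hypothetical sketch (the deadzone, angle range, linear form, and all names are our assumptions, not the paper's implementation) that maps two shoulder angles to a continuous 2D command:

```python
import math

# Assumed parameters (illustrative only, not from the paper)
DEADZONE_DEG = 3.0   # posture drift below this produces no command
RANGE_DEG = 20.0     # shoulder excursion that saturates the command

def angle_to_command(angle_deg: float) -> float:
    """Map one shoulder angle (degrees) to a command in [-1, 1]."""
    if abs(angle_deg) < DEADZONE_DEG:
        return 0.0
    sign = math.copysign(1.0, angle_deg)
    scaled = (abs(angle_deg) - DEADZONE_DEG) / (RANGE_DEG - DEADZONE_DEG)
    return sign * min(scaled, 1.0)

def decode(elevation_deg: float, protraction_deg: float) -> tuple:
    """Map shoulder elevation/protraction to a 2D (x, y) velocity command."""
    return (angle_to_command(protraction_deg), angle_to_command(elevation_deg))
```

An intent-inference decoder would replace the fixed thresholds with a model of the user's goal; the deadzone here is what gives the rule-based approach its fast but less accurate dynamic behaviour.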
Affiliation(s)
- Rongkai Liu
  - Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China
  - University of Science and Technology of China (USTC), Hefei 230026, Anhui, People's Republic of China
- Quanjun Song
  - Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China
- Tingting Ma
  - Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China
  - University of Science and Technology of China (USTC), Hefei 230026, Anhui, People's Republic of China
- Hongqing Pan
  - Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China
- Hao Li
  - Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China
- Xinyan Zhao
  - Hefei Institutes of Physical Science (HFIPS), Chinese Academy of Sciences, Hefei 230031, Anhui, People's Republic of China
  - University of Science and Technology of China (USTC), Hefei 230026, Anhui, People's Republic of China
2. Pierella C, D'Antuono C, Marchesi G, Menotti CE, Casadio M. A Computer Interface Controlled by Upper Limb Muscles: Effects of a Two Weeks Training on Younger and Older Adults. IEEE Trans Neural Syst Rehabil Eng 2023;31:3744-3751. [PMID: 37676798] [DOI: 10.1109/tnsre.2023.3312981]
Abstract
As the population worldwide ages, there is a growing need for assistive technology and effective human-machine interfaces to address the wider range of motor disabilities that older adults may experience. Motor disabilities can make it difficult for individuals to perform basic daily tasks, such as getting dressed, preparing meals, or using a computer. The goal of this study was to investigate the effect of two weeks of training with a myoelectric computer interface (MCI) on motor functions in younger and older adults. Twenty people were recruited for the study: thirteen younger (range: 22-35 years old) and seven older (range: 61-78 years old) adults. Participants completed six training sessions of about 2 hours each, during which the activity of the right and left biceps and trapezius was mapped into a control signal for a computer cursor. Results highlighted significant improvements in cursor control, and therefore in muscle coordination, in both groups. With training, all participants became faster and more accurate, although people in different age ranges learned with different dynamics. Results of the questionnaire on system usability and quality highlighted a general consensus on the system's ease of use and intuitiveness. These findings suggest that the proposed MCI training can be a powerful tool in the framework of assistive technologies for both younger and older adults. Further research is needed to determine the optimal duration and intensity of MCI training for different age groups and to investigate long-term effects of training on physical and cognitive function.
3. Jiang B, Dollahon D, Manoharan S, Oh S, Park H, Kim J. Improving Tongue Command Accuracy: Unlocking the Power of Electrotactile Feedback Training. Annu Int Conf IEEE Eng Med Biol Soc 2023;2023:1-5. [PMID: 38083132] [DOI: 10.1109/embc40787.2023.10340808]
Abstract
People with spinal cord injury or neurological disorders frequently require aid in performing daily tasks. Utilizing hands-free assistive technologies (ATs), particularly tongue-controlled ATs, may offer a feasible solution, as the tongue is controlled by a cranial nerve and remains functional in the presence of spinal cord injury. However, existing intra-oral ATs require a significant amount of training to accurately issue commands. To minimize the training process, we have designed intuitive tongue commands for our Multifunctional intraORal Assistive technology (MORA). Our prior work demonstrated that electrotactile feedback outperformed visual feedback in tasks involving tongue motor learning. In this study, we implement electrical stimulation (E-stim) as electrotactile feedback on the tongue to teach new MORA tongue commands, and quantitatively analyze the efficacy of the electrotactile feedback in terms of command accuracy and precision. A random command task was adopted to evaluate tongue command accuracy with 14 healthy participants. The average number of sensors contacted per trial dropped significantly from 1.57 ± 0.15 to 1.16 ± 0.05 with electrotactile feedback. After training with electrotactile feedback, 83% of the trials were completed with only one command having been activated. These results suggest that E-stim enhanced both the accuracy and precision of subjects' tongue command training. The results of this study pave the way for the implementation of electrotactile feedback as an accurate and precise command training technique for MORA.
4. Ahlawat S, Kanaujia BK, Rambabu K, Peter I, Matekovits L. Circularly polarized differential intra-oral antenna design validation and characterization for tongue drive system. Sci Rep 2023;13:9935. [PMID: 37336931] [DOI: 10.1038/s41598-023-36717-w]
Abstract
Assistive devices are becoming increasingly popular for physically disabled persons suffering from tetraplegia and spinal cord injuries. The intraoral tongue drive system (iTDS) is one of the most feasible non-invasive assistive technologies (ATs); it transfers and infers user intentions through different tongue gestures. Wireless transfer is of prime importance and requires a suitable intra-oral antenna design. In this paper, a compact circularly polarized differential intra-oral antenna is designed, and its performance is analysed within heterogeneous multilayer mouth and head models. It works at 2.4 GHz in the Industrial, Scientific, and Medical (ISM) band. The footprint of the differential antenna prototype is 0.271 λg × 0.271 λg × 0.015 λg. This is achieved using two pairs of spiral segments loaded in diagonal form near the edges of the central rotated square slot and a high dielectric constant substrate. The spiral-slotted geometry further provides the desired swirling and miniaturization in the desired frequency band for both mouth scenarios. Additionally, corner triangular slits on the radiating patch assist in tuning the axial ratio (< 3 dB) in the desired ISM band. To validate the performance of the proposed in-mouth antenna, measurements were carried out using minced pork for the closed-mouth case and a saline solution for the open-mouth case. The measured -10 dB impedance bandwidth and peak gain values are 2.28 to 2.53 GHz (10.39%) and -18.17 dBi in the minced pork, and 2.30 to 2.54 GHz (9.92%) and -15.47 dBi in the saline solution. Further, the specific absorption rate (SAR) is estimated, and the data communication link is computed with and without a balun loss. This confirms that the proposed differential intraoral antenna can establish direct interfacing at the RF front end of the intraoral tongue drive system.
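The fractional bandwidths quoted above follow from the band edges. A quick sanity check, assuming the usual definition BW% = (f_high - f_low) / f_center × 100 with f_center taken as the arithmetic mean of the band edges (the paper may use a slightly different center frequency, so values match only up to rounding):

```python
def fractional_bandwidth(f_low_ghz: float, f_high_ghz: float) -> float:
    """Percentage bandwidth relative to the band-center frequency."""
    f_center = (f_low_ghz + f_high_ghz) / 2.0
    return (f_high_ghz - f_low_ghz) / f_center * 100.0

# Minced pork (closed mouth), 2.28-2.53 GHz: ~10.4% (abstract reports 10.39%)
bw_pork = fractional_bandwidth(2.28, 2.53)
# Saline (open mouth), 2.30-2.54 GHz: ~9.9% (abstract reports 9.92%)
bw_saline = fractional_bandwidth(2.30, 2.54)
```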
Affiliation(s)
- Sarita Ahlawat
  - School of Computational and Integrative Sciences, Jawaharlal Nehru University, New Delhi, 110067, India
- Binod Kumar Kanaujia
  - School of Computational and Integrative Sciences, Jawaharlal Nehru University, New Delhi, 110067, India
  - Dr. Ambedkar National Institute of Technology, Jalandhar, 144011, India
- Karumudi Rambabu
  - Department of Electrical and Computer Engineering, University of Alberta, Edmonton, AB, T6G 2V4, Canada
- Ildiko Peter
  - Department of Industrial Engineering and Management, Faculty of Engineering and Information Technology, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, 540139, Târgu-Mureş, Romania
- Ladislau Matekovits
  - Department of Electronics and Telecommunications, Politecnico di Torino, 10129, Turin, Italy
  - Department of Measurements and Optical Electronics, Politehnica University Timisoara, 300223, Timisoara, Romania
  - Istituto di Elettronica e di Ingegneria dell'Informazione e delle Telecomunicazioni, National Research Council of Italy, 10129, Turin, Italy
5. Zhang X, Li J, Jin L, Zhao J, Huang Q, Song Z, Liu X, Luh DB. Design and Evaluation of the Extended FBS Model Based Gaze-Control Power Wheelchair for Individuals Facing Manual Control Challenges. Sensors (Basel) 2023;23:5571. [PMID: 37420738] [PMCID: PMC10303982] [DOI: 10.3390/s23125571]
Abstract
This study addresses the challenges faced by individuals with upper limb impairments in operating power wheelchair joysticks by utilizing the extended Function-Behavior-Structure (FBS) model to identify design requirements for an alternative wheelchair control system. A gaze-controlled wheelchair system is proposed based on design requirements from the extended FBS model, prioritized using the MoSCoW method. This innovative system relies on the user's natural gaze and comprises three levels: perception, decision making, and execution. The perception layer senses and acquires information from the environment, including user eye movements and driving context. The decision-making layer processes this information to determine the user's intended direction, while the execution layer controls the wheelchair's movement accordingly. The system's effectiveness was validated through indoor field testing, with an average driving drift of less than 20 cm across participants. Additionally, the user experience scale revealed overall positive user experiences and perceptions of the system's usability, ease of use, and satisfaction.
Affiliation(s)
- Xiaochen Zhang
  - Department of Industrial Design, Guangdong University of Technology, Guangzhou 510090, China
  - Guangdong International Center of Advanced Design, Guangdong University of Technology, Guangzhou 510090, China
- Jiazhen Li
  - Department of Industrial Design, Guangdong University of Technology, Guangzhou 510090, China
- Lingling Jin
  - Department of Industrial Design, Guangdong University of Technology, Guangzhou 510090, China
- Jie Zhao
  - Department of Industrial Design, Guangdong University of Technology, Guangzhou 510090, China
- Qianbo Huang
  - Department of Industrial Design, Guangdong University of Technology, Guangzhou 510090, China
- Ziyang Song
  - Department of Industrial Design, Guangdong University of Technology, Guangzhou 510090, China
- Xinyu Liu
  - Department of Industrial Design, Guangdong University of Technology, Guangzhou 510090, China
- Ding-Bang Luh
  - Department of Industrial Design, Guangdong University of Technology, Guangzhou 510090, China
  - Guangdong International Center of Advanced Design, Guangdong University of Technology, Guangzhou 510090, China
6. Assistive Technologies and Quadriplegia: A Map Point on the Development and Spread of the Tongue Barbell Piercing. Healthcare (Basel) 2022;11:101. [PMID: 36611561] [PMCID: PMC9818748] [DOI: 10.3390/healthcare11010101]
Abstract
The barbell piercing can be used as an assistive device that allows people with severe disabilities, such as tetraplegia, to control their environments using the movement of the tongue. The human tongue can move rapidly and accurately, such that the tip can touch every tooth. Lingual control systems allow people with disabilities to take advantage of their residual skills for easier communication and to improve the control of mobility and the surrounding environment. The aim of this study was to conduct a narrative review of the development and dissemination of assistive technologies based on tongue control by means of the barbell piercing. The design of the study was based on: (I) an overview of PubMed complemented with other databases and Web searches (also institutional); (II) an organization according to a standardized checklist for narrative reviews; (III) an arrangement from four different perspectives: trends in the scientific literature; technological evolution and categorization; dominant approaches; and issues of incorporation into the health domain, such as acceptance, safety, and regulations. The results highlighted: (1) that the volume of scientific production, which started in this sector before the smartphone expansion, has not increased; (2) that it is possible to map the technological evolution and categorization; (3) that these assistive technologies have a high degree of acceptance and performance, especially when integrated with mechatronic aid tools; (4) and the complexity of the regulatory framework in this area. From a general point of view, the study highlighted the high potential of these systems, and we suggest investing energy in consensus tools for assistive technologies (ATs), such as health technology assessment studies, comparative assessment analyses, or consensus conferences, which could allow better diffusion and use of ATs, including these systems.
7. Cornelio P, Haggard P, Hornbaek K, Georgiou O, Bergström J, Subramanian S, Obrist M. The sense of agency in emerging technologies for human-computer integration: A review. Front Neurosci 2022;16:949138. [PMID: 36172040] [PMCID: PMC9511170] [DOI: 10.3389/fnins.2022.949138]
Abstract
Human-computer integration is an emerging area in which the boundary between humans and technology is blurred as users and computers work collaboratively and share agency to execute tasks. The sense of agency (SoA) is an experience that arises from the combination of a voluntary motor action and sensory evidence of whether the corresponding body movements have somehow influenced the course of external events. The SoA is a key part of our experiences not only in daily life but also in our interaction with technology, as it gives us the feeling of "I did that" as opposed to "the system did that," thus supporting a feeling of being in control. This feeling becomes critical with human-computer integration, wherein emerging technology directly influences people's bodies, their actions, and the resulting outcomes. In this review, we analyse and classify current integration technologies based on what we currently know about agency in the literature, and propose a distinction between body augmentation, action augmentation, and outcome augmentation. For each category, we describe agency considerations and markers of differentiation that illustrate a relationship between assistance level (low, high), agency delegation (human, technology), and integration type (fusion, symbiosis). We conclude with a reflection on the opportunities and challenges of integrating humans with computers, and finalise with an expanded definition of human-computer integration that includes the agency aspects we consider particularly relevant. The aim of this review is to provide researchers and practitioners with guidelines to situate their work within the integration research agenda and to consider the implications of any technology for SoA, and thus for the overall user experience, when designing future technology.
Affiliation(s)
- Patricia Cornelio
  - Ultraleap Ltd., Bristol, United Kingdom
  - Department of Computer Science, University College London, London, United Kingdom
  - Correspondence: Patricia Cornelio
- Patrick Haggard
  - Department of Computer Science, University College London, London, United Kingdom
- Kasper Hornbaek
  - Department of Computer Science, University of Copenhagen, Copenhagen, Denmark
- Joanna Bergström
  - Department of Computer Science, University of Copenhagen, Copenhagen, Denmark
- Sriram Subramanian
  - Department of Computer Science, University College London, London, United Kingdom
- Marianna Obrist
  - Department of Computer Science, University College London, London, United Kingdom
8. Gantenbein J, Meyer JT, Jager L, Sigrist R, Gassert R, Lambercy O. An Analysis of Intention Detection Strategies to Control Advanced Assistive Technologies at the CYBATHLON. IEEE Int Conf Rehabil Robot 2022;2022:1-6. [PMID: 36176133] [DOI: 10.1109/icorr55369.2022.9896539]
Abstract
With the increasing range of functionalities of advanced assistive technologies (AATs), reliable control and initiation of the desired actions become increasingly challenging for users. In this work, we present an analysis of current practices, user preferences, and usability of AAT intention detection strategies based on a survey among participants with disabilities at the CYBATHLON 2020 Global Edition. We collected data from 35 respondents, using devices in various disciplines and at various levels of technology maturity. We found that conventional, direct inputs such as buttons and joysticks are used by the majority of AATs (71.4%) due to their simplicity and learnability. However, 22 respondents (62.8%) reported a desire for more natural control using muscle or non-invasive brain signals, and 37.1% even reported an openness to invasive strategies for potentially improved control. The usability of the strategies in terms of the explored attributes (reliability, mental effort, required learning) was mainly perceived positively, and no significant differences were observed across intention detection strategies and device types. It can be assumed that the strategies used during the CYBATHLON realistically represent options to control an AAT in a dynamic, physically and mentally demanding environment. Thus, this work underlines the need to carefully consider user needs and preferences when selecting intention detection strategies for contexts of use outside the laboratory.
9. Jiang B, Kim J, Park H. Palatal Electrotactile Display Outperforms Visual Display in Tongue Motor Learning. IEEE Trans Neural Syst Rehabil Eng 2022;30:529-539. [PMID: 35245197] [DOI: 10.1109/tnsre.2022.3156398]
Abstract
Incomplete tongue motor control is a common yet challenging issue among individuals with neurotraumas and neurological disorders. In the development of training protocols, multiple sensory modalities including visual, auditory, and tactile feedback have been employed. However, the effectiveness of each sensory modality in tongue motor learning is still in question. The objective of this study was to test the effectiveness of visual and electrotactile assistance on tongue motor learning. Eight healthy subjects performed a tongue pointing task, in which they were visually instructed to touch a target on the palate with their tongue tip as accurately as possible. Each subject wore a custom-made dental retainer with 12 electrodes distributed over the palatal area. For visual training, a 3×4 LED array on the computer screen, corresponding to the electrode layout, was turned on with different colors according to the tongue contact. For electrotactile training, electrical stimulation was applied to the tongue with frequencies depending on the distance between the tongue contact and the target, along with a small protrusion on the retainer as an indicator of the target. One baseline session, one training session, and three post-training sessions were conducted over a four-day period. Experimental results showed that the error decreased after both visual and electrotactile training, from 3.56±0.11 (mean±STE) to 1.27±0.16 and from 3.97±0.11 to 0.53±0.19, respectively. The results also showed that electrotactile training leads to stronger retention than visual training: at 3 days post-training, 62.68±1.81% of the improvement was retained after electrotactile training versus 36.59±2.24% after visual training.
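The distance-dependent stimulation frequency described above can be sketched as a simple mapping. This is a hypothetical illustration only; the frequency range, distance units, and linear form are our assumptions, not the paper's actual encoding:

```python
# Assumed parameters (illustrative, not from the paper)
F_MIN_HZ = 10.0    # stimulation frequency at maximum distance from the target
F_MAX_HZ = 100.0   # stimulation frequency at the target
MAX_DIST = 4.0     # maximum contact-to-target distance, in electrode-grid units

def stim_frequency(distance: float) -> float:
    """Linearly map contact-to-target distance to stimulation frequency:
    closer contact -> higher frequency, clamped to the usable range."""
    d = min(max(distance, 0.0), MAX_DIST)
    return F_MAX_HZ - (F_MAX_HZ - F_MIN_HZ) * d / MAX_DIST
```

A monotone distance-to-frequency code like this lets the user home in on the target by "climbing" the frequency gradient without any visual feedback.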
10. Mohammadi M, Knoche H, Thøgersen M, Bengtson SH, Gull MA, Bentsen B, Gaihede M, Severinsen KE, Andreasen Struijk LNS. Eyes-Free Tongue Gesture and Tongue Joystick Control of a Five DOF Upper-Limb Exoskeleton for Severely Disabled Individuals. Front Neurosci 2022;15:739279. [PMID: 34975367] [PMCID: PMC8718615] [DOI: 10.3389/fnins.2021.739279]
Abstract
Spinal cord injury can leave the affected individual severely disabled with a low level of independence and quality of life. Assistive upper-limb exoskeletons are one solution that can enable an individual with tetraplegia (paralysis in both arms and legs) to perform simple activities of daily living by mobilizing the arm. Providing an efficient user interface that allows full continuous control of such a device, safely and intuitively, with multiple degrees of freedom (DOFs) still remains a challenge. In this study, a control interface for an assistive upper-limb exoskeleton with five DOFs based on an intraoral tongue-computer interface (ITCI) for individuals with tetraplegia was proposed. Furthermore, we evaluated eyes-free use of the ITCI for the first time and compared two tongue-operated control methods, one based on tongue gestures and the other based on dynamic virtual buttons and joystick-like control. Ten able-bodied participants tongue-controlled the exoskeleton for a drinking task with and without visual feedback on a screen in three experimental sessions. As a baseline, the participants performed the drinking task with a standard gamepad. The results showed that it was possible to control the exoskeleton with the tongue even without visual feedback and to perform the drinking task at 65.1% of the speed achieved with the gamepad. In a clinical case study, an individual with tetraplegia further succeeded in fully controlling the exoskeleton and performed the drinking task only 5.6% slower than the able-bodied group. This study demonstrated the first single-modal control interface that can enable individuals with complete tetraplegia to fully and continuously control a five-DOF upper-limb exoskeleton and perform a drinking task after only 2 h of training. The interface was used both with and without visual feedback.
Affiliation(s)
- Mostafa Mohammadi
  - Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
- Hendrik Knoche
  - Human Machine Interaction, Department of Architecture, Design and Media Technology, Aalborg University, Aalborg, Denmark
- Mikkel Thøgersen
  - Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
- Stefan Hein Bengtson
  - Human Machine Interaction, Department of Architecture, Design and Media Technology, Aalborg University, Aalborg, Denmark
- Muhammad Ahsan Gull
  - Department of Materials and Production, Aalborg University, Aalborg, Denmark
- Bo Bentsen
  - Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
- Michael Gaihede
  - Department of Clinical Medicine, Aalborg University, Aalborg, Denmark
- Lotte N S Andreasen Struijk
  - Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
11. Taheri A, Weissman Z, Sra M. Design and Evaluation of a Hands-Free Video Game Controller for Individuals With Motor Impairments. Front Comput Sci 2021. [DOI: 10.3389/fcomp.2021.751455]
Abstract
Over the past few decades, video gaming has evolved at a tremendous rate, although game input methods have been slower to change. Game input methods continue to rely on two-handed control of the joystick and D-pad, or the keyboard and mouse, for simultaneously controlling player movement and camera actions. Bi-manual input poses a significant impediment to play for those with severe motor impairments. In this work, we propose and evaluate a hands-free game input control method that uses real-time facial expression recognition. Through our novel input method, our goal is to enable and empower individuals with neurological and neuromuscular diseases, who may lack hand muscle control, to independently play video games. To evaluate the usability and acceptance of our system, we conducted a remote user study with eight severely motor-impaired individuals. Our results indicate high user satisfaction and a strong preference for our input system, with participants rating it as easy to learn. With this work, we aim to highlight that facial expression recognition can be a valuable input method.
12. Kim J, Wichmann T, Inan OT, DeWeerth SP. Fitts Law-Based Performance Metrics to Quantify Tremor in Individuals with Essential Tremor. IEEE J Biomed Health Inform 2021;26:2169-2179. [PMID: 34851839] [DOI: 10.1109/jbhi.2021.3129989]
Abstract
Current methods of evaluating essential tremor (ET) either rely on subjective ratings or use limited tremor metrics (i.e., severity/amplitude and frequency). In this study, we explored performance metrics from Fitts' law tasks that replicate and expand existing tremor metrics, to enable low-cost, home-based tremor quantification, and analyzed the cursor movements of individuals using a 3D mouse while performing a collection of drawing tasks. We analyzed the 3D mouse cursor movements of 11 patients with ET and three controls on three computer-based tasks (a spiral navigation (SPN) task, a rectangular track navigation (RTN) task, and a multi-directional tapping/clicking (MDT) task) using several performance metrics (i.e., outside area (OA), throughput (TP, from Fitts' law), path efficiency (PE), and completion time (CT)). Using an accelerometer and scores from the Essential Tremor Rating Assessment Scale (TETRAS), we correlated the proposed performance metrics with the baseline tremor metrics and found that the OA of the SPN and RTN tasks was strongly correlated with baseline tremor severity (R2 = 0.57 and R2 = 0.83). We also found that the TP of the MDT tasks was strongly correlated with tremor frequency (R2 = 0.70). In addition, as the OA of the SPN and RTN tasks was correlated with tremor severity and frequency, it may represent an independent metric that increases the dimensionality of the characterization of an individual's tremor. Thus, this pilot study of individuals with ET-associated tremor performing Fitts' law tasks demonstrates the feasibility of introducing a new tremor metric that can be expanded for repeatable multi-dimensional data analyses.
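The throughput (TP) metric above is presumably of the standard Fitts' law form; the exact formulation used in the paper may differ, but the common Shannon formulation is ID = log2(D/W + 1) bits and TP = ID/MT bits/s, sketched here (function names are ours):

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits:
    distance = movement amplitude to the target, width = target width."""
    return math.log2(distance / width + 1.0)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput in bits/s: difficulty conveyed per unit movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

# e.g. a 240 px movement to a 16 px target completed in 0.8 s:
# ID = log2(240/16 + 1) = log2(16) = 4 bits, so TP = 4 / 0.8 = 5 bits/s
```

Tremor would typically lengthen movement time (and inflate metrics like outside area) without changing the task's ID, which is why TP can separate tremor effects from task difficulty.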
13. Sebkhi N, Bhavsar A, Sahadat MN, Baldwin J, Walling E, Biniker A, Hoefnagel M, Tonuzi G, Osborne R, Anderson D, Inan O. Evaluation of a Head-Tongue Controller for Power Wheelchair Driving by People With Quadriplegia. IEEE Trans Biomed Eng 2021;69:1302-1309. [PMID: 34529559] [DOI: 10.1109/tbme.2021.3113291]
Abstract
The head-tongue controller (HTC) is a multimodal alternative controller designed to give people with quadriplegia access to complex control capabilities by combining tongue and head tracking, offering both discrete and proportional control in a single device. In this human study, 17 patients with quadriplegia who were current users of alternative controllers were asked to perform four trials of either simple driving tasks or advanced maneuvers in a custom-designed course. Completion time and accuracy were compared between their personal alternative controller (PAC) and various combinations of HTC driving modalities. Of the 8 subjects assigned to simple driving, 3 completed their best HTC trial faster than with their PAC for rolling forward and turning around cones, and 5 did so for rolling backward. Across these subjects, the average completion time of the best HTC modality was 23 s for rolling forward, 15 s for rolling backward, and 70 s for turning around cones, compared with 19 s, 17 s, and 45 s with the PAC. For advanced driving, 1 of 9 subjects completed the course faster with the HTC, while the best HTC trials of all subjects were less than 1.3 times their best PAC completion time, averaging 170 s for the HTC and 140 s for the PAC. Qualitative feedback from a post-study questionnaire averaged 7.5 out of 10, indicating the subjects' interest in the HTC and their acknowledgment of its usefulness for this population.
14
Zhang Z, Prilutsky BI, Butler AJ, Shinohara M, Ghovanloo M. Design and Preliminary Evaluation of a Tongue-Operated Exoskeleton System for Upper Limb Rehabilitation. Int J Environ Res Public Health 2021; 18:8708. [PMID: 34444456 PMCID: PMC8393282 DOI: 10.3390/ijerph18168708]
Abstract
Stroke is a devastating condition that may cause upper limb paralysis. Robotic rehabilitation with self-initiated and assisted movements is a promising technology that could help restore upper limb function. Previous studies have established that tongue motion can be used to communicate human intent and to control a rehabilitation robot or assistive device. The goal of this study was to evaluate a tongue-operated exoskeleton system (TDS-KA), which we developed for upper limb rehabilitation. We adopted a tongue-operated assistive technology, called the tongue drive system (TDS), and interfaced it with the KINARM exoskeleton. We also developed arm reaching and tracking tasks, controlled by different tongue operation modes, for training and evaluation of arm motor function. The tasks were tested in 10 healthy participants (seven males and three females, 23-60 years) and two female stroke survivors with upper extremity impairment (32 and 58 years). All healthy participants and both stroke participants performed the tasks successfully. One stroke subject demonstrated a clinically significant improvement in Fugl-Meyer upper extremity score after practicing the tasks in six 3-h sessions. We conclude that the TDS-KA system can accurately translate tongue commands into exoskeleton arm movements, quantify arm function, and deliver rehabilitation training.
Affiliation(s)
- Zhenxuan Zhang
- School of Electrical & Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30308, USA
- Boris I. Prilutsky
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Andrew J. Butler
- School of Health Professions, The University of Alabama at Birmingham, Birmingham, AL 35294, USA
- Minoru Shinohara
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
15
A Bibliometric Analysis of Human-Machine Interaction Methodology for Electric-Powered Wheelchairs Driving from 1998 to 2020. Int J Environ Res Public Health 2021; 18:7567. [PMID: 34300017 PMCID: PMC8304937 DOI: 10.3390/ijerph18147567]
Abstract
Electric power wheelchairs (EPWs) enhance the mobility of the elderly and the disabled, while the human-machine interface (HMI) determines how precisely human intention is delivered and how efficiently human-machine cooperation is conducted. A bibliometric quantitative analysis of 1154 publications related to this research field, published between 1998 and 2020, was conducted. We identified the development status, contributors, hot topics, and potential future research directions of this field. We believe that the combination of intelligence and humanization in EPW HMI systems, based on human-machine collaboration, is an emerging trend in EPW HMI methodology research. Particular attention should be paid to evaluating the applicability and benefits of EPW HMI methodologies for users, as well as their contribution to society. This study offers researchers a comprehensive understanding of EPW HMI studies over the past 22 years, along with the latest trends and forward-looking insights for future research.
16
Mohammadi M, Knoche H, Struijk LNSA. Continuous Tongue Robot Mapping for Paralyzed Individuals Improves the Functional Performance of Tongue-Based Robotic Assistance. IEEE Trans Biomed Eng 2021; 68:2552-2562. [PMID: 33513095 DOI: 10.1109/tbme.2021.3055250]
Abstract
Individuals with tetraplegia face a challenging life due to a lack of independence and autonomy. Assistive robots have the potential to assist with activities of daily living and thus improve quality of life. However, an efficient and reliable control interface for severely disabled individuals is still missing. An intraoral tongue-computer interface (ITCI) for people with tetraplegia has previously been introduced and tested for controlling a robotic manipulator in a study deploying discrete tongue-robot mapping. To improve the efficiency of the interface, the current study proposed the use of virtual buttons based on the ITCI and evaluated them in combination with a joystick-like control implementation, enabling continuous control commands. Twelve able-bodied volunteers participated in a three-day experiment. They controlled an assistive robotic manipulator with the tongue to perform two tasks: pouring water into a cup (PW) and picking up a roll of tape (PUT). Four different tongue-robot mapping methods were compared. The results showed that using continuous commands reduced task completion time by 16% and the number of commands in the PUT test by 20% compared with discrete commands. The highest success rates were 77.8% for the PUT test and 100% for the PW test, both achieved by the control methods with continuous commands. Thus, the study demonstrated that incorporating continuous commands can improve the performance of the ITCI system for controlling robotic manipulators.
17
Shared control methodology based on head positioning and vector fields for people with quadriplegia. Robotica 2021. [DOI: 10.1017/s0263574721000606]
Abstract
Mobile robotic systems are used in a wide range of applications. In the assistive field especially, they can enhance the mobility of elderly and disabled people. Modern robotic technologies have been implemented in wheelchairs to give them intelligence: by equipping wheelchairs with intelligent algorithms, controllers, and sensors, it is possible to share control between the user and the autonomous system. The present research proposes a methodology for intelligent wheelchairs based on head movements and vector fields. In this work, the user indicates where to go, and the system performs obstacle avoidance and planning. The focus is on developing an assistive technology for people with quadriplegia who retain partial movement, such as of the shoulder and neck musculature. The developed system uses shared velocity control. It employs a depth camera to recognize obstacles in the environment and an inertial measurement unit (IMU) to recognize the desired movement pattern by measuring the inclination of the user's head. The proposed methodology computes a repulsive vector field and works to increase maneuverability and safety; thus, global localization and mapping are unnecessary. The results were evaluated in simulated models and practical tests using a Pioneer P3-DX differential robot to show the system's applicability.
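The repulsive-vector-field idea described in this abstract can be illustrated with a generic potential-field sketch. This is not the paper's implementation; the obstacle positions, gain, influence radius, and blending weight below are all assumptions.

```python
import math

def repulsive_field(position, obstacles, influence_radius=1.5, gain=1.0):
    """Sum of repulsive vectors pushing the wheelchair away from obstacles
    within an influence radius (classic potential-field formulation)."""
    fx, fy = 0.0, 0.0
    for ox, oy in obstacles:
        dx, dy = position[0] - ox, position[1] - oy
        d = math.hypot(dx, dy)
        if 1e-9 < d < influence_radius:
            # Repulsion grows steeply as the obstacle gets closer
            mag = gain * (1.0 / d - 1.0 / influence_radius) / d ** 2
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy

def shared_velocity(user_cmd, field, alpha=0.5):
    """Shared control: blend the user's head-commanded velocity with the field."""
    return (user_cmd[0] + alpha * field[0], user_cmd[1] + alpha * field[1])

# Obstacle 1 m ahead: the field opposes the user's forward command
fx, fy = repulsive_field((0.0, 0.0), [(1.0, 0.0)])
vx, vy = shared_velocity((0.3, 0.0), (fx, fy))
```

Because the field depends only on locally sensed obstacle positions, no global map is required, which matches the abstract's claim that localization and mapping are unnecessary.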
18
Kutbi M, Du X, Chang Y, Sun B, Agadakos N, Li H, Hua G, Mordohai P. Usability Studies of an Egocentric Vision-Based Robotic Wheelchair. ACM Trans Hum Robot Interact 2021. [DOI: 10.1145/3399434]
Abstract
Motivated by the need to improve the quality of life of elderly and disabled individuals who rely on wheelchairs for mobility and who may have limited or no hand functionality, we propose an egocentric-computer-vision-based co-robot wheelchair that enhances mobility without hand usage. The robot is built on a commercially available powered wheelchair modified to be controlled by head motion. Head motion is measured by tracking an egocentric camera mounted on the user's head and facing outward. Compared with previous approaches to hands-free mobility, our system provides a more natural human-robot interface because it enables the user to control the speed and direction of motion continuously, as opposed to issuing a small number of discrete commands. This article presents three usability studies conducted on 37 subjects. The first two compare the proposed control method with existing solutions, while the third assesses the effectiveness of training subjects to operate the wheelchair over several sessions. A limitation of our studies is that they were conducted with healthy participants. Our findings, however, pave the way for further studies with subjects with disabilities.
Affiliation(s)
- Xiaoxue Du
- Teachers College, Columbia University, New York, NY, USA
- Yizhe Chang
- California State Polytechnic University, Pomona, California, USA
- Bo Sun
- Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ, USA
- Gang Hua
- Wormpex AI Research, Bellevue, WA, USA
- Philippos Mordohai
- Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ, USA
19
Pierella C, Galofaro E, De Luca A, Losio L, Gamba S, Massone A, Mussa-Ivaldi FA, Casadio M. Recovery of Distal Arm Movements in Spinal Cord Injured Patients with a Body-Machine Interface: A Proof-of-Concept Study. Sensors (Basel) 2021; 21:2243. [PMID: 33807007 PMCID: PMC8004832 DOI: 10.3390/s21062243]
Abstract
BACKGROUND The recovery of upper limb mobility and function is essential for people with cervical spinal cord injuries (cSCI) to maximize independence in daily activities and ensure a successful return to normality. The rehabilitative path should include a thorough neuromotor evaluation and personalized treatments aimed at recovering motor functions. Body-machine interfaces (BoMI) have been proven capable of harnessing residual joint motions to control objects like computer cursors and virtual or physical wheelchairs, and to promote motor recovery. However, their therapeutic application has so far been limited to shoulder movements. Here, we expanded the use of BoMI to promote mobility of the whole arm, with a special focus on elbow movements. We also developed an instrumented evaluation test and a set of kinematic indicators for assessing residual abilities and recovery. METHODS Five inpatient cSCI subjects (four acute, one chronic) participated in a BoMI treatment complementary to their standard rehabilitative routine. The subjects wore a BoMI with sensors placed on both proximal and distal arm segments and practiced for 5 weeks. The BoMI was programmed to promote symmetry between right and left arm use and forearm mobility while playing games. To evaluate the effectiveness of the treatment, the subjects' kinematics were recorded while they performed an evaluation test involving functional bilateral arm movements before, at the end of, and three months after training. RESULTS At the end of the training, all subjects had learned to use the interface efficiently, despite being compelled by it to engage their most impaired movements. The subjects completed the training with bilateral symmetry in body recruitment, already present at the end of the familiarization, and they increased forearm activity. The instrumental evaluation confirmed this: the elbow motion's angular amplitude improved for all subjects, and other kinematic parameters showed a trend towards the normality range. CONCLUSION These outcomes are preliminary evidence supporting the efficacy of the proposed BoMI as a rehabilitation tool to be considered for clinical practice. The study also suggests an instrumental evaluation protocol and a set of indicators to assess motor impairment and recovery in cSCI.
Affiliation(s)
- Camilla Pierella
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), University of Genova, 16132 Genoa, Italy
- Department of Informatics, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genoa, 16145 Genoa, Italy
- Department of Physiology, Northwestern University, Chicago, IL 60611, USA
- Shirley Ryan Ability Lab, Chicago, IL 60611, USA
- Elisa Galofaro
- Department of Informatics, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genoa, 16145 Genoa, Italy
- Assistive Robotics and Interactive Exosuits (ARIES) Lab, Institute of Computer Engineering (ZITI), University of Heidelberg, 69117 Heidelberg, Germany
- Alice De Luca
- Department of Informatics, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genoa, 16145 Genoa, Italy
- Movendo Technology, 16128 Genoa, Italy
- Recovery and Functional Reeducation Unit, Santa Corona Hospital, ASL2 Savonese, 17027 Pietra Ligure, Italy
- Luca Losio
- S.C. Unità Spinale Unipolare, Santa Corona Hospital, ASL2 Savonese, 17027 Pietra Ligure, Italy
- Italian Spinal Cord Laboratory (SCIL), 17027 Pietra Ligure, Italy
- Simona Gamba
- S.C. Unità Spinale Unipolare, Santa Corona Hospital, ASL2 Savonese, 17027 Pietra Ligure, Italy
- Italian Spinal Cord Laboratory (SCIL), 17027 Pietra Ligure, Italy
- Antonino Massone
- S.C. Unità Spinale Unipolare, Santa Corona Hospital, ASL2 Savonese, 17027 Pietra Ligure, Italy
- Italian Spinal Cord Laboratory (SCIL), 17027 Pietra Ligure, Italy
- Ferdinando A. Mussa-Ivaldi
- Department of Physiology, Northwestern University, Chicago, IL 60611, USA
- Shirley Ryan Ability Lab, Chicago, IL 60611, USA
- Department of Physical Medicine and Rehabilitation, Northwestern University, Evanston, IL 60208, USA
- Department of Biomedical Engineering, Northwestern University, Evanston, IL 60208, USA
- Maura Casadio
- Department of Informatics, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genoa, 16145 Genoa, Italy
- Department of Physiology, Northwestern University, Chicago, IL 60611, USA
- Italian Spinal Cord Laboratory (SCIL), 17027 Pietra Ligure, Italy
20
Wang J, Yu J, Wang T, Li C, Wei Y, Deng X, Chen X. Emerging intraoral biosensors. J Mater Chem B 2021; 8:3341-3356. [PMID: 31904075 DOI: 10.1039/c9tb02352f]
Abstract
Biomedical devices that enable continuous, real-time health-care monitoring have drawn much attention in modern medicine, of which skin electronics and implantable devices are the most widely investigated. Skin electronics are characterized by their non-invasive access to physiological signals, while implantable devices are superior at integrating diagnosis and therapy. Despite the significant progress achieved, many gaps remain to be explored to provide a more comprehensive overview of human health. As the connecting point between the outer environment and human systems, the oral cavity contains many unique biomarkers that are absent in skin or inner organs and hence could become a promising alternative locus for designing health-care monitoring devices. In this review, we outline the status of the oral cavity in the communication between the environment and human systems and compare intraoral devices with skin electronics and implantable devices from biophysical and biochemical perspectives. We further summarize the established diagnostic databases and technologies that could be adopted to design intraoral biosensors. Finally, the challenges and potential opportunities for intraoral biosensors are discussed. Intraoral biosensors could become an important complement to existing biomedical devices, constituting a more reliable health-care monitoring system.
Affiliation(s)
- Jianwu Wang
- Innovative Centre for Flexible Devices (iFLEX), School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, 639798, Singapore.
21
Controlling a robotic arm for functional tasks using a wireless head-joystick: A case study of a child with congenital absence of upper and lower limbs. PLoS One 2020; 15:e0226052. [PMID: 32756553 PMCID: PMC7406178 DOI: 10.1371/journal.pone.0226052]
Abstract
Children with movement impairments who need assistive devices for activities of daily living often require novel methods for controlling these devices. Body-machine interfaces, which rely on body movements, are particularly well suited for children, as they are non-invasive and have high signal-to-noise ratios. Here, we examined the use of a head-joystick to enable a child with congenital absence of all four limbs to control a seven-degree-of-freedom robotic arm. Head movements were measured with a wireless inertial measurement unit and used to control a robotic arm in two functional tasks: a drinking task and a block-stacking task. The child practiced these tasks over multiple sessions; a control participant performed the same tasks with a manual joystick. Our results showed that the child was able to perform both tasks successfully, with movement times decreasing by ~40-50% over 6-8 sessions of training. The child's performance with the head-joystick was also comparable to that of the control participant using a manual joystick. These results demonstrate the potential of using head movements for the control of high-degree-of-freedom tasks in children with a limited movement repertoire.
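A minimal sketch of how IMU head angles can be mapped to continuous velocity commands, in the spirit of the head-joysticks described in these studies. The deadzone, saturation angle, and speed limit below are illustrative assumptions, not values from the study.

```python
def head_to_velocity(pitch_deg, roll_deg, deadzone_deg=5.0,
                     saturation_deg=30.0, max_speed=0.25):
    """Map head pitch/roll (degrees from a calibrated neutral pose) to
    forward and lateral velocity commands. A deadzone suppresses small
    involuntary motions; output saturates at the maximum speed."""
    def axis(angle):
        if abs(angle) < deadzone_deg:
            return 0.0
        # Scale the remaining range linearly up to the saturation angle
        span = saturation_deg - deadzone_deg
        scaled = min((abs(angle) - deadzone_deg) / span, 1.0)
        return max_speed * scaled * (1.0 if angle > 0 else -1.0)
    return axis(pitch_deg), axis(roll_deg)

# A slight nod stays inside the deadzone; a strong lean saturates
idle = head_to_velocity(2.0, 0.0)
moving = head_to_velocity(30.0, -17.5)
```

The deadzone-plus-saturation shape is a common design choice for body-machine interfaces because it tolerates postural drift while still giving proportional control.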
22
Elliott MA, Malvar H, Maassel LL, Campbell J, Kulkarni H, Spiridonova I, Sophy N, Beavers J, Paradiso A, Needham C, Rifley J, Duffield M, Crawford J, Wood B, Cox EJ, Scanlan JM. Eye-controlled, power wheelchair performs well for ALS patients. Muscle Nerve 2019; 60:513-519. [PMID: 31397910 PMCID: PMC6851551 DOI: 10.1002/mus.26655]
Abstract
BACKGROUND Our pilot study tested the feasibility and performance of an eye-controlled power wheelchair for amyotrophic lateral sclerosis (ALS) patients. METHODS In this prospective pilot study, participants drove the wheelchair three times around an indoor course. We assessed the time to complete the course; starting and stopping on command; turning 90, 135, and 180 degrees; time to back up; and obstacle negotiation. After using the wheelchair, subjects were given a questionnaire to assess user experience. RESULTS Twelve patients participated, and all were able to complete three trials without difficulty. Eight participants completed all of the individual tasks (e.g., turning, stopping) without any errors. Overall performance ratings were high across all participants (4.6/5, excellent). CONCLUSIONS Our eye-controlled power wheelchair prototype is feasible and offers a very favorable user experience. This system has the potential to improve the mobility and independence of ALS patients and other groups with motor impairments.
Affiliation(s)
- Becky Wood
- Swedish Neuroscience Institute, Seattle, Washington
- Emily J. Cox
- Providence Medical Research Center, Spokane, Washington
23
Kong F, Sahadat MN, Ghovanloo M, Durgin GD. A Stand-Alone Intraoral Tongue-Controlled Computer Interface for People With Tetraplegia. IEEE Trans Biomed Circuits Syst 2019; 13:848-857. [PMID: 31283486 DOI: 10.1109/tbcas.2019.2926755]
Abstract
The intraoral Tongue Drive System (iTDS) is an embedded wireless tongue-operated assistive technology developed to give people with tetraplegia a higher level of independence in daily living tasks, such as accessing computers and smartphones and driving wheelchairs. The iTDS is built as an arch-shaped dental retainer, hermetically sealed and placed in the buccal shelf area of the mouth, completely hidden from sight. To provide a high level of comfort, the iTDS is customized to the user's oral anatomy so that it fixes stably onto the lower teeth. We present a standalone version of the iTDS, capable of recognizing tongue gestures/commands by processing raw magnetic sensor data with a built-in pattern-recognition algorithm in real time. The iTDS then sends the commands in 10-bit packets through a custom-designed high-gain intraoral antenna at 2.4 GHz to an external receiver. To evaluate the standalone iTDS performance, four subjects performed a computer access task by issuing random tongue commands over five sessions. Subjects completed 99.2% of the commands and achieved an information transfer rate of 150.1 bits/min. Moreover, a new typing method designed specifically for the iTDS resulted in a typing rate of 3.76 words/min with an error rate of 2.23%.
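Information transfer rates like the one reported above are conventionally computed with the Wolpaw formulation for discrete command interfaces. A sketch of that formula follows; the command-set size and selection rate used in the example are hypothetical, not the study's figures.

```python
import math

def itr_bits_per_min(n_commands, accuracy, selections_per_min):
    """Wolpaw information transfer rate: bits carried per selection
    times the selection rate. accuracy = P(correct command)."""
    if accuracy >= 1.0:
        bits = math.log2(n_commands)
    elif accuracy <= 0.0:
        bits = 0.0
    else:
        bits = (math.log2(n_commands)
                + accuracy * math.log2(accuracy)
                + (1 - accuracy) * math.log2((1 - accuracy) / (n_commands - 1)))
    return bits * selections_per_min

# Hypothetical: 6 tongue commands at 99.2% accuracy, 40 selections per minute
rate = itr_bits_per_min(6, 0.992, 40)
```

The formula shows why both accuracy and selection speed matter: a larger command set raises the ceiling (log2 N), but errors rapidly discount the bits each selection carries.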
24
Hildebrand M, Bonde F, Kobborg RVN, Andersen C, Norman AF, Thogersen M, Bengtson SH, Dosen S, Struijk NSLA. Semi-Autonomous Tongue Control of an Assistive Robotic Arm for Individuals with Quadriplegia. IEEE Int Conf Rehabil Robot 2019; 2019:157-162. [PMID: 31374623 DOI: 10.1109/icorr.2019.8779457]
Abstract
Individuals suffering from quadriplegia can achieve increased independence by using an assistive robotic manipulator (ARM). However, due to their disability, the interfaces that can be used to operate such devices are limited. A versatile intraoral tongue control interface (ITCI) has previously been developed for this user group, as the tongue is usually spared from disability. A previous study showed that the ITCI can provide direct and continuous control of 6-7 degrees of freedom (DoF) of an ARM, owing to the high number of inputs it provides (18). In the present pilot study, we investigated whether semi-automation might further improve the efficiency of the ITCI when controlling an ARM. This was achieved by adding a camera to the end effector of the ARM and using computer vision algorithms to guide the ARM to grasp a target object. Three ITCI control schemes and one joystick control scheme were tested and compared: 1) manual Cartesian control with a base-frame reference point, 2) manual Cartesian control with an end-effector reference point, 3) manual Cartesian control with an end-effector reference point and an autonomous grasp function, and 4) regular JACO2 joystick control. The results indicated that end-effector control was superior to base-frame control in total task time, number of commands issued, and path efficiency. The addition of the automatic grasp function did not improve performance but resulted in fewer collisions/displacements of the target object when grasping.
25
Mohammadi M, Knoche H, Gaihede M, Bentsen B, Andreasen Struijk LNS. A high-resolution tongue-based joystick to enable robot control for individuals with severe disabilities. IEEE Int Conf Rehabil Robot 2019; 2019:1043-1048. [PMID: 31374767 DOI: 10.1109/icorr.2019.8779434]
Abstract
Assistive robotic arms have shown the potential to improve the quality of life of people with severe disabilities. However, a high-performance, intuitive control interface for robots with 6-7 DOFs is still missing for these individuals. An inductive tongue-computer interface (ITCI) was recently tested for robot control, and the study illustrated potential in this field. This paper investigates the possibility of developing a high-performance tongue-based joystick-like controller for robots through two studies. The first compared different methods for mapping the 18 sensor signals to a 2D coordinate, as on a touchpad. The second evaluated the performance of a novel approach for emulating an analog joystick with the ITCI, based on the ISO 9241-411 standard. Two subjects performed a multi-directional tapping test using a standard analog joystick, the ITCI held in one hand and operated by the other, and finally the ITCI operated by the tongue when mounted inside the mouth. Throughput was measured as the evaluation parameter. The results show that contact on the touchpads can be localized with an accuracy of almost 1 mm. The effective throughput of the ITCI in the multi-directional tapping test was 2.03 bps when held in the hand and 1.31 bps when used inside the mouth.
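The effective throughput reported here follows the ISO 9241-411 convention of adjusting target width by the observed endpoint spread (We = 4.133 × SD). A simplified single-condition sketch, not the paper's analysis code; real analyses group trials by distance/width condition, and the inputs below are hypothetical:

```python
import math

def effective_throughput(distances, movement_times, endpoint_deviations):
    """ISO 9241-411-style effective throughput (bits/s) for one condition:
    We = 4.133 * SD of endpoint deviations, IDe = log2(De/We + 1),
    TP = mean over trials of IDe / MT."""
    n = len(endpoint_deviations)
    mean_dev = sum(endpoint_deviations) / n
    # Sample standard deviation of the endpoint scatter
    sd = math.sqrt(sum((d - mean_dev) ** 2 for d in endpoint_deviations) / (n - 1))
    effective_width = 4.133 * sd
    per_trial = [math.log2(de / effective_width + 1) / mt
                 for de, mt in zip(distances, movement_times)]
    return sum(per_trial) / len(per_trial)

# Hypothetical trials: 100 mm movements, ~1 s each, small endpoint scatter
tp = effective_throughput([100.0, 100.0], [1.0, 1.0], [0.0, 1.0, -1.0, 2.0, -2.0])
```

Using the effective width makes throughput comparable across devices with different precision, which is why the standard is used when comparing the tongue joystick to a hand-held one.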
26
Chanthaphun S, Heck SL, Winstein CJ, Baker L. Development of a training paradigm for voluntary control of the peri-auricular muscles: a feasibility study. J Neuroeng Rehabil 2019; 16:75. [PMID: 31200729 PMCID: PMC6570846 DOI: 10.1186/s12984-019-0540-x]
Abstract
BACKGROUND Spinal cord injury (SCI) can lead to severe and permanent functional deficits. In humans, the peri-auricular muscles (PAMs) serve no physiological function, yet their innervation is preserved even in high-level SCI. Auricular control systems provide a good example of leveraging contemporary technologies (e.g., sEMG-controlled computer games) to enable those with disabilities. Our primary objective was to develop and test the effectiveness of an auricular muscle training protocol for facilitating isolated and coordinated bilateral voluntary control that could be used in individuals without volitional control of the vestigial PAMs. METHODS Seventeen non-disabled persons were screened; 13 were eligible and 10 completed the entire protocol. The facilitation phase included one session of sub-motor-threshold sensory electrical stimulation, followed by neuromuscular electrical stimulation paired with ear-movement feedback for up to 8 additional sessions. Participants then progressed to the skill acquisition phase, where they donned an auricular control device that used sEMG signals to control the movements of a cursor through three levels of computer games, each requiring increasingly complex PAM coordination. RESULTS The 10 who completed the protocol finished the facilitation phase in 3 to 9 sessions and achieved some level of voluntary auricle movement, ranging between 1 and 5 mm. Qualitative analysis of longitudinal post-session auricular movement revealed two subgroups of learners. Six successfully completed all 3 games (the "Learners"); two were partially successful in game completion and two were unable to complete a single game (the "Poor/Non-Learners"). Quantitative analysis revealed a significant group difference in auricular amplitude for both the facilitation and skill phases (p < .05), and a significant relationship between performance in the two phases (R2 = 0.84, p = 0.004).
CONCLUSION Sixty percent of those who completed the facilitation phase were able to learn and demonstrate functional voluntary control of the vestigial PAMs. Those who progressed fastest through facilitation were also the most proficient in skill acquisition with the device. There was considerable variability in progression through the two-phase protocol, with 20% deemed Poor/Non-Learners, unable to complete even the most basic game following training. There were no serious adverse events. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT02358915, first posted February 9, 2015.
Affiliation(s)
- Siwaphorn Chanthaphun
- Department of Biokinesiology and Physical Therapy, Health Sciences Campus, Herman Ostrow School of Dentistry, University of Southern California, Los Angeles, California, 90089, USA
- Carolee J Winstein
- Department of Biokinesiology and Physical Therapy, Health Science Campus, Herman Ostrow School of Dentistry, Department of Neurology, Keck School of Medicine, University of Southern California, Los Angeles, California, 90089, USA
- Lucinda Baker
- Department of Biokinesiology and Physical Therapy, Health Sciences Campus, Herman Ostrow School of Dentistry, University of Southern California, Los Angeles, California, 90089, USA
27
Letaief M, Rezzoug N, Gorce P. Comparison between joystick- and gaze-controlled electric wheelchair during narrow doorway crossing: Feasibility study and movement analysis. Assist Technol 2019; 33:26-37. [PMID: 30945980] [DOI: 10.1080/10400435.2019.1586011]
Abstract
Due to motor deficiencies inducing low force capabilities or tremor, many persons have great difficulty using joystick-operated wheelchairs. To alleviate such difficulties, alternative interfaces using vocal, gaze, or brain signals are now becoming available. While promising, these systems still need to be evaluated thoroughly. In this framework, the aim of this study was to analyze and evaluate the behavior of 11 able-bodied subjects during a navigation task, involving a doorway crossing, executed with a gaze- or joystick-operated electric wheelchair. An electric wheelchair was equipped with retroreflective markers, and their movements were recorded with an optoelectronic system. The gaze commands were detected using an eye-tracking device. Apart from the classical forward, backward, stop, left, and right commands, the chosen screen-based interface integrated forward-right and forward-left commands. The global success rate with the gaze-based control was 80.3%. The path optimality ratio was 0.97, and the subjects adopted similar trajectories with both systems. The results for gaze control are promising and highlight the heavy use of the forward-left and forward-right commands (25% of all issued commands), which may explain the similarity between the trajectories using the two interfaces.
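The path optimality ratio reported above is, in the usual definition, the ratio of the shortest possible path length to the length actually travelled, so 1.0 means a perfectly direct path. The paper does not reproduce its exact formula here, so the sketch below assumes that common definition; the function names and sample trajectories are illustrative, not the authors'.

```python
import math

def path_length(points):
    """Total Euclidean length of a 2-D trajectory given as (x, y) points."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def path_optimality_ratio(trajectory, start, goal):
    """Straight-line (optimal) distance divided by the distance actually
    travelled; 1.0 is a perfectly direct path, lower values mean detours."""
    travelled = path_length(trajectory)
    optimal = math.dist(start, goal)
    return optimal / travelled if travelled else 0.0

# A direct path scores 1.0; a detour lowers the ratio.
direct = [(0, 0), (5, 0), (10, 0)]
detour = [(0, 0), (5, 5), (10, 0)]
```

With this definition, the 0.97 reported above would mean the driven paths were only about 3% longer than the straight line between start and goal.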
Affiliation(s)
- Manel Letaief
- Laboratoire de Biomodélisation et d'Ingénierie des Handicaps, Université de Toulon, La Garde, France
- Nasser Rezzoug
- Laboratoire de Biomodélisation et d'Ingénierie des Handicaps, Université de Toulon, La Garde, France
- Philippe Gorce
- Laboratoire de Biomodélisation et d'Ingénierie des Handicaps, Université de Toulon, La Garde, France
28
Campeau-Lecours A, Cote-Allard U, Vu DS, Routhier F, Gosselin B, Gosselin C. Intuitive Adaptive Orientation Control for Enhanced Human–Robot Interaction. IEEE T ROBOT 2019. [DOI: 10.1109/tro.2018.2885464]
29
Shahin MK, Tharwat A, Gaber T, Hassanien AE. A Wheelchair Control System Using Human-Machine Interaction: Single-Modal and Multimodal Approaches. JOURNAL OF INTELLIGENT SYSTEMS 2019. [DOI: 10.1515/jisys-2017-0085]
Abstract
Recent research studies have shown that brain-controlled systems/devices are a breakthrough technology. Such devices can provide disabled people with the power to control the movement of a wheelchair using different signals (e.g., EEG signals, head movements, and facial expressions). With this technology, disabled people can remotely steer a wheelchair, a computer, or a tablet. This paper introduces a simple, low-cost human-machine interface system to help wheelchair users control their wheelchair using several control sources. To achieve this aim, a laptop was installed on a wheelchair in front of the seated person, and the 14-electrode Emotiv EPOC headset was used to collect the person's head signals from the skull surface. The superficially picked-up signals, containing brain activity, head gestures, and facial expressions, were electrically encoded and then wirelessly sent to a personal computer to be interpreted and translated into useful control instructions. Using these signals, two wheelchair control modes were proposed: automatic (using single-modal and multimodal approaches) and manual control. The automatic-mode controller was implemented in software (Arduino), whereas a simple hardware controller was used for the manual mode. The proposed solution was built from a wheelchair, an Emotiv EPOC EEG headset, an Arduino microcontroller, and the Processing language. It was then tested by fully wheelchair-dependent volunteers over trajectories of different difficulty levels. The results showed that a person's thoughts can be used to seamlessly control his/her wheelchair and that the proposed system can be configured to suit many levels and degrees of disability.
30
Kong F, Zada M, Yoo H, Ghovanloo M. Adaptive Matching Transmitter With Dual-Band Antenna for Intraoral Tongue Drive System. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2018; 12:1279-1288. [PMID: 30605083] [DOI: 10.1109/tbcas.2018.2866960]
Abstract
The intraoral Tongue Drive System (iTDS) is a wireless assistive technology that detects users' voluntary tongue gestures and converts them to user-defined commands, enabling them to access computers and navigate powered wheelchairs. In this paper, we present a transmitter (Tx) with adaptive matching and three bands (27, 433, and 915 MHz) to create a robust wireless link between the iTDS and an external receiver (Rx) by addressing the effects of external RF interference and impedance variations of the Tx antenna in the dynamic mouth environment. The upper two Tx bands share a dual-band antenna, while the lower band drives a coil. The Tx antenna was simulated in a simplified human mouth model in HFSS as well as a real human head model. The adaptive triple-band Tx chip was fabricated in a 0.35-μm 4P2M standard CMOS process. The Tx chip and antenna were characterized in a human subject as part of an iTDS prototype under open- and closed-mouth scenarios, exhibiting peak gains of -24.4 and -15.63 dBi at 433 and 915 MHz, respectively. Two adaptive matching networks for these bands compensate for variations of the Tx antenna impedance via a feedback mechanism. The measured S11 tuning range of the proposed network can cover up to 60 and 75 jΩ at 433 and 915 MHz, respectively.
31
Chu FJ, Xu R, Zhang Z, Vela PA, Ghovanloo M. The Helping Hand: An Assistive Manipulation Framework Using Augmented Reality and Tongue-Drive Interfaces. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY 2018; 2018:2158-2161. [PMID: 30440831] [DOI: 10.1109/embc.2018.8512668]
Abstract
A human-in-the-loop system is proposed to enable collaborative manipulation tasks for persons with physical disabilities. Studies show that the cognitive burden on the subject decreases with increased autonomy of the assistive system. Our framework obtains high-level intent from the user to specify manipulation tasks. The system processes sensor input to interpret the user's environment. Augmented-reality glasses provide egocentric visual feedback of the interpretation and summarize robot affordances on a menu. A tongue-drive system serves as the input modality for triggering a robotic arm to execute the tasks. Assistance experiments compare the system to Cartesian control and to state-of-the-art approaches. Our system achieves competitive results with faster completion times by simplifying manipulation tasks.
32
Rabhi Y, Mrabet M, Fnaiech F. A facial expression controlled wheelchair for people with disabilities. COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE 2018; 165:89-105. [PMID: 30337084] [DOI: 10.1016/j.cmpb.2018.08.013]
Abstract
BACKGROUND AND OBJECTIVES To improve assistive technologies for people with reduced mobility, this paper develops a new intelligent real-time emotion detection system to control equipment such as electric wheelchairs (EWC) or robotic assistance vehicles. Every year, degenerative diseases and traumas prevent thousands of people from easily controlling the joystick of their wheelchairs with their hands. Most current technologies are considered invasive and uncomfortable, such as those requiring the user to wear body sensors to control the wheelchair. METHODS In this work, the proposed Human Machine Interface (HMI) provides an efficient hands-free option that does not require sensors or objects attached to the user's body. It allows users to drive the wheelchair using facial expressions, which can be flexibly updated. This intelligent solution is based on a combination of neural networks (NN) and specific image preprocessing steps. First, the Viola-Jones detector is used to locate the user's face in a video stream. Subsequently, a neural network is used to classify the emotions displayed on the face. This solution, called "The Mathematics Behind Emotion," is capable of classifying many facial expressions in real time, such as smiles and raised eyebrows, which are translated into signals for wheelchair control. On the hardware side, this solution requires only a smartphone and a Raspberry Pi board that can be easily mounted on the wheelchair. RESULTS Many experiments were conducted to evaluate the efficiency of the control acquisition process and the user experience of driving a wheelchair through facial expressions. The classification accuracy reaches 98.6%, with an average recall rate of 97.1%. These experiments demonstrate that the proposed system is able to accurately recognize user commands in real time.
Indeed, the obtained results indicate that the suggested system is more comfortable and better adapted to severely disabled people in their daily lives than conventional methods. Among the advantages of this system is its real-time ability to identify facial emotions from different angles. CONCLUSIONS The proposed system takes into account the patient's pathology. It is intuitive, modern, requires no physical effort, and can be integrated into a smartphone or tablet. The results highlight the efficiency and reliability of this system, which ensures safe navigation for the disabled patient.
Affiliation(s)
- Yassine Rabhi
- University of Tunis, National Higher School of Engineers of Tunis, Laboratory of Signal Image and Energy Mastery (SIME), 5 Avenue Taha Hussein, P.O. Box 56, Tunis 1008, Tunisia
- Makrem Mrabet
- University of Tunis, National Higher School of Engineers of Tunis, Laboratory of Signal Image and Energy Mastery (SIME), 5 Avenue Taha Hussein, P.O. Box 56, Tunis 1008, Tunisia
- Farhat Fnaiech
- University of Tunis, National Higher School of Engineers of Tunis, Laboratory of Signal Image and Energy Mastery (SIME), 5 Avenue Taha Hussein, P.O. Box 56, Tunis 1008, Tunisia
33
Sahadat MN, Alreja A, Mikail N, Ghovanloo M. Comparing the Use of Single vs. Multiple Combined Abilities in Conducting Complex Computer Tasks Hands-free. IEEE Trans Neural Syst Rehabil Eng 2018; 26:1868-1877. [PMID: 30106683] [DOI: 10.1109/tnsre.2018.2864120]
Abstract
OBJECTIVE Assistive technologies often focus on a single remaining ability of their users, particularly those with physical disabilities, e.g., tetraplegia, to facilitate computer access. We hypothesized that by combining multiple remaining abilities of the end users in an intuitive fashion, it is possible to improve the quality of computer access. In this study, 15 able-bodied subjects completed four computer access tasks without using their hands: center-out tapping, on-screen maze navigation, playing a game, and sending an email. They used the multimodal Tongue Drive System (mTDS), which simultaneously offers proportional cursor control via head motion, discrete clicks via tongue gestures, and typing via speech recognition. Their performance was compared against unimodal tongue gestures (TDS) and the keyboard-and-mouse combination (KnM) as the gold standard. RESULTS Average center-out tapping throughputs using mTDS and TDS were 0.84 bps and 0.94 bps, which were 21% and 22.4% of the throughput using the mouse, respectively, while the average error rate and missed targets using mTDS were 4.1% and 25.5% less than with TDS. Maze-navigation throughputs using mTDS and TDS were 0.35 bps and 0.46 bps, which were 16.6% and 21.8% of the throughput using the mouse, respectively. Participants achieved a 72.32% higher score using mTDS than TDS when playing a simple game. Average email-generation time with mTDS was ~2x longer than with KnM, with a mean typing accuracy of 78.1%. CONCLUSION Engaging multimodal abilities helped participants perform considerably better in complex tasks, such as sending an email, compared to a unimodal system (TDS). Performance was similar for simpler tasks, while multimodal inputs improved interaction accuracy. Cursor navigation with head motion led to higher scores in less constrained tasks, such as the game, than in the highly constrained maze task.
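The throughput figures above (in bps) are typically computed from Fitts' law: each center-out reach carries an index of difficulty ID = log2(D/W + 1) bits (the Shannon formulation) for target distance D and width W, and throughput is ID divided by movement time. A minimal sketch under that standard formulation; the paper may use a variant such as effective target width, so this is illustrative only.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput_bps(distance, width, movement_time_s):
    """Throughput for a single reach: index of difficulty / movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

# A target 7 widths away carries log2(8) = 3 bits of difficulty;
# reaching it in about 3.6 s would yield roughly the 0.84 bps
# reported for mTDS (illustrative numbers, not the study's).
```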
34
Rabhi Y, Mrabet M, Fnaiech F, Gorce P, Miri I, Dziri C. Intelligent Touchscreen Joystick for Controlling Electric Wheelchair. Ing Rech Biomed 2018. [DOI: 10.1016/j.irbm.2018.04.003]
35
Puanhvuan D, Khemmachotikun S, Wechakarn P, Wijarn B, Wongsawat Y. Navigation-synchronized multimodal control wheelchair from brain to alternative assistive technologies for persons with severe disabilities. Cogn Neurodyn 2017; 11:117-134. [PMID: 28348644] [PMCID: PMC5350091] [DOI: 10.1007/s11571-017-9424-6]
Abstract
Currently, electric wheelchairs are commonly used to improve mobility in disabled people. In severe cases, the user is unable to control the wheelchair because his/her motor functions are impaired. To restore mobility, a brain-controlled wheelchair (BCW) is a promising system that would allow the patient to control the wheelchair by thought. P300, a component of visual event-related potentials (ERPs), is a reliable brain electrical signal that can be used for interpreting user commands. This research aimed to propose a prototype BCW that allows patients with severe motor disabilities to practically control a wheelchair in their home environment. Users were able to select from 9 possible destination commands in the automatic mode and from 4 directional commands (forward, backward, turn left, and turn right) in the shared-control mode. These commands were selected via the designed P300 processing system. The wheelchair was steered to the desired location by the implemented navigation system. User safety during wheelchair navigation was ensured by the included obstacle detection and avoidance features. A combination of P300 and EOG was used as a hybrid BCW system. The user could fully operate the system (enabling the P300 detection system, mode shifting, and the stop/cancel command) by performing different sequences of consecutive blinks to generate eye-blinking patterns. The results revealed that the prototype BCW could be operated in either of the proposed modes. With the new design of the LED-based P300 stimulator, the average accuracies of the P300 detection algorithm in the shared-control and automatic modes were 95.31% and 83.42%, with 3.09 and 3.79 bits/min, respectively. The P300 classification error was acceptable, as the user could cancel an incorrect command by blinking twice. Moreover, the proposed navigation system had a flexible design that could be interfaced with other assistive technologies.
This research also developed 3 alternative input modules: an eye-tracker module and chin and hand controller modules. The user could select the most suitable assistive technology based on his/her level of disability. Other existing assistive technologies could also be connected to the proposed system in the future using the same protocol.
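The bits/min figures above are information transfer rates. A common way to compute them is Wolpaw's formula, which combines the number of selectable commands N, the classification accuracy P, and the selection rate. The sketch below assumes that formula; the authors' exact calculation may differ.

```python
import math

def bits_per_selection(n_choices, accuracy):
    """Wolpaw information per selection:
    log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))."""
    p = accuracy
    if p <= 0.0:
        return 0.0
    if p >= 1.0:
        return math.log2(n_choices)
    return (math.log2(n_choices)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n_choices - 1)))

def bits_per_minute(n_choices, accuracy, selections_per_minute):
    """Information transfer rate: bits per selection times selection rate."""
    return bits_per_selection(n_choices, accuracy) * selections_per_minute
```

For example, a perfectly accurate 4-command interface carries 2 bits per selection, so its transfer rate in bits/min is simply twice the number of selections made per minute.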
Affiliation(s)
- Dilok Puanhvuan
- Department of Biomedical Engineering, Faculty of Engineering, Mahidol University, 25/25, Putthamonthol 4 Road, Salaya, Putthamonthol, Nakhon Pathom 73170, Thailand
- Sarawin Khemmachotikun
- Department of Biomedical Engineering, Faculty of Engineering, Mahidol University, 25/25, Putthamonthol 4 Road, Salaya, Putthamonthol, Nakhon Pathom 73170, Thailand
- Pongsakorn Wechakarn
- Department of Biomedical Engineering, Faculty of Engineering, Mahidol University, 25/25, Putthamonthol 4 Road, Salaya, Putthamonthol, Nakhon Pathom 73170, Thailand
- Boonyanuch Wijarn
- Department of Biomedical Engineering, Faculty of Engineering, Mahidol University, 25/25, Putthamonthol 4 Road, Salaya, Putthamonthol, Nakhon Pathom 73170, Thailand
- Yodchanan Wongsawat
- Department of Biomedical Engineering, Faculty of Engineering, Mahidol University, 25/25, Putthamonthol 4 Road, Salaya, Putthamonthol, Nakhon Pathom 73170, Thailand
36
Sargolzaei S, Elahi H, Sokoloff A, Ghovanloo M. A Dual-Mode Magnetic-Acoustic System for Monitoring Fluid Intake Behavior in Animals. IEEE Trans Biomed Eng 2016; 64:2090-2097. [PMID: 27992324] [DOI: 10.1109/tbme.2016.2638545]
Abstract
We have developed an unobtrusive magnetic-acoustic fluid intake monitoring (MAFIM) system using a conventional stainless-steel roller-ball nipple to measure licking and drinking behavior in animals. Movements of a small permanent magnetic tracer attached to stainless-steel roller balls that operate as a tongue-actuated valve are sensed by a pair of three-axial magnetometers, and transformed into a time-series indicating the status of the ball (up or down), using a Gaussian mixture model based data-driven classifier. The sounds produced by the rise and fall of the roller balls are also recorded and classified to substantiate the magnetic data by an independent modality for a more robust solution. The operation of the magnetic and acoustic sensors is controlled by an embedded system, communicating via Universal Serial Bus (USB) with a custom-designed user interface, running on a PC. The MAFIM system has been tested in vivo with minipigs, accurately measuring various drinking parameters and licking patterns without constraints imposed by current lick monitoring systems, such as nipple access, animal-nipple contact, animal training, and complex parameter settings.
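The abstract describes a Gaussian mixture model classifier that turns magnetometer readings into a binary ball state (up or down), but does not reproduce its features or training. As a hedged illustration of the decision rule only, the sketch below labels a 1-D signal by comparing its likelihood under one Gaussian component per state; in practice the component means and deviations would come from fitting the mixture to recorded data.

```python
import math

def gaussian_pdf(x, mean, std):
    """Probability density of a univariate Gaussian at x."""
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2 * math.pi))

def classify_ball_state(signal, up_params, down_params):
    """Label each sample 'up' or 'down' by which Gaussian component
    (mean, std) assigns it the higher likelihood."""
    labels = []
    for x in signal:
        p_up = gaussian_pdf(x, *up_params)
        p_down = gaussian_pdf(x, *down_params)
        labels.append("up" if p_up >= p_down else "down")
    return labels
```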
37
Lee MH, Ranganathan R, Kagerer FA, Mukherjee R. Body-machine interface for control of a screen cursor for a child with congenital absence of upper and lower limbs: a case report. J Neuroeng Rehabil 2016; 13:34. [PMID: 27009334] [PMCID: PMC4806473] [DOI: 10.1186/s12984-016-0139-4]
Abstract
Background There has been a recent interest in the development of body-machine interfaces which allow individuals with motor impairments to control assistive devices using body movements. Methods In this case study, we report findings in the context of the development of such an interface for a 10-year-old child with congenital absence of upper and lower limbs. The interface consisted of 4 wireless inertial measurement units (IMUs), which we used to map movements of the upper body to the position of a cursor on a screen. We examined the learning of a task in which the child had to move the cursor to specified targets on the screen as quickly as possible. In addition, we also determined the robustness of the interface by evaluating the child's performance in two different body postures. Results We found that the child was not only able to learn the task rapidly, but also showed superior performance when compared to typically developing children in the same age range. Moreover, task performance was comparable for the two different body postures, suggesting that the child was able to control the device in different postures without the need for interface recalibration. Conclusions These results clearly establish the viability and robustness of the proposed non-invasive body-machine interface for pediatric populations with severe motor limitations.
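The case report does not detail the forward map from the IMU signals to the cursor, but body-machine interfaces of this kind commonly calibrate a linear map with principal component analysis: record free upper-body movements, then let the top two principal components drive the cursor's x and y. A minimal sketch under that assumption; the function names are illustrative, not the authors'.

```python
import numpy as np

def fit_bmi_map(calibration, n_dims=2):
    """Fit a linear body-to-cursor map by PCA on calibration signals.

    calibration: (samples, channels) array of body-sensor readings
    recorded while the user moves freely. Returns (mean, components)
    so the top principal components drive the cursor axes.
    """
    mean = calibration.mean(axis=0)
    centered = calibration - mean
    # SVD yields principal directions without forming a covariance matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_dims]

def body_to_cursor(sample, mean, components, gain=1.0):
    """Project one body-sensor sample onto the cursor plane."""
    return gain * components @ (sample - mean)
```

A practical advantage of this scheme, consistent with the posture result above, is that recalibration amounts to refitting `mean` and `components` from a short movement recording.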
Affiliation(s)
- Mei-Hua Lee
- Department of Kinesiology, Michigan State University, 308 W Circle Dr Rm 201, East Lansing, MI, 48824, USA
- Rajiv Ranganathan
- Department of Kinesiology, Michigan State University, 308 W Circle Dr Rm 201, East Lansing, MI, 48824, USA
- Department of Mechanical Engineering, Michigan State University, East Lansing, MI, USA
- Florian A Kagerer
- Department of Kinesiology, Michigan State University, 308 W Circle Dr Rm 201, East Lansing, MI, 48824, USA
- Ranjan Mukherjee
- Department of Mechanical Engineering, Michigan State University, East Lansing, MI, USA
38
Huang X, Liu Y, Kong GW, Seo JH, Ma Y, Jang KI, Fan JA, Mao S, Chen Q, Li D, Liu H, Wang C, Patnaik D, Tian L, Salvatore GA, Feng X, Ma Z, Huang Y, Rogers JA. Epidermal radio frequency electronics for wireless power transfer. MICROSYSTEMS & NANOENGINEERING 2016; 2:16052. [PMID: 31057838] [PMCID: PMC6444737] [DOI: 10.1038/micronano.2016.52]
Abstract
Epidermal electronic systems feature physical properties that approximate those of the skin, to enable intimate, long-lived skin interfaces for physiological measurements, human-machine interfaces and other applications that cannot be addressed by wearable hardware that is commercially available today. A primary challenge is power supply; the physical bulk, large mass and high mechanical modulus associated with conventional battery technologies can hinder efforts to achieve epidermal characteristics, and near-field power transfer schemes offer only a limited operating distance. Here we introduce an epidermal, far-field radio frequency (RF) power harvester built using a modularized collection of ultrathin antennas, rectifiers and voltage doublers. These components, separately fabricated and tested, can be integrated together via methods involving soft contact lamination. Systematic studies of the individual components and the overall performance in various dielectric environments highlight the key operational features of these systems and strategies for their optimization. The results suggest robust capabilities for battery-free RF power, with relevance to many emerging epidermal technologies.
Affiliation(s)
- Xian Huang
- Department of Biomedical Engineering, School of Precision Instrument and Opto-electronics Engineering, Tianjin University, Tianjin 300072, China
- Yuhao Liu
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Gil Woo Kong
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Jung Hun Seo
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI 53706, USA
- Yinji Ma
- Department of Civil and Environmental Engineering, Northwestern University, Evanston, IL 60208, USA
- Department of Engineering Mechanics, Center for Mechanics and Materials, Tsinghua University, Beijing 100084, China
- Department of Mechanical Engineering, Northwestern University, Evanston, IL 60208, USA
- Kyung-In Jang
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology, Daegu 42988, Republic of Korea
- Jonathan A. Fan
- Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology, Daegu 42988, Republic of Korea
- Department of Electrical Engineering, Stanford University, Stanford, CA 94305, USA
- Shimin Mao
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Qiwen Chen
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Daizhen Li
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Hank Liu
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Chuxuan Wang
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Dwipayan Patnaik
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Limei Tian
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Giovanni A. Salvatore
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
- Xue Feng
- Department of Engineering Mechanics, Center for Mechanics and Materials, Tsinghua University, Beijing 100084, China
- Zhenqiang Ma
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI 53706, USA
- Yonggang Huang
- Department of Civil and Environmental Engineering, Northwestern University, Evanston, IL 60208, USA
- Department of Mechanical Engineering, Northwestern University, Evanston, IL 60208, USA
- John A. Rogers
- Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
39
Schmalfuß L, Rupp R, Tuga MR, Kogut A, Hewitt M, Meincke J, Klinker F, Duttenhoefer W, Eck U, Mikut R, Reischl M, Liebetanz D. Steer by ear: Myoelectric auricular control of powered wheelchairs for individuals with spinal cord injury. Restor Neurol Neurosci 2015; 34:79-95. [PMID: 26599475] [DOI: 10.3233/rnn-150579]
Abstract
PURPOSE Providing mobility solutions for individuals with tetraplegia remains challenging. Existing control devices have shortcomings such as varying or poor signal quality or interference with communication. To overcome these limitations, we present a novel myoelectric auricular control system (ACS) based on bilateral activation of the posterior auricular muscles (PAMs). METHODS Ten able-bodied subjects and two individuals with tetraplegia practiced PAM activation over 4 days using visual feedback and software-based training for 1 h/day. Initially, half of these subjects were not able to voluntarily activate their PAMs. This ability was tested with regard to 8 parameters such as contraction rate, lateralized activation, wheelchair speed and path length in a virtual obstacle course. In session 5, all subjects steered an electric wheelchair with the ACS. RESULTS Performance of all subjects in controlling their PAMs improved steadily over the training period. By day 5, all subjects successfully generated basic steering commands using the ACS in a powered wheelchair, and subjects with tetraplegia completed a complex real-world obstacle course. This study demonstrates that the ability to activate PAM on both sides together or unilaterally can be learned and used intuitively to steer a wheelchair. CONCLUSIONS With the ACS we can exploit the untapped potential of the PAMs by assigning them a new, complex function. The inherent advantages of the ACS, such as not interfering with oral communication, robustness, stability over time and proportional and continuous signal generation, meet the specific needs of wheelchair users and render it a realistic alternative to currently available assistive technologies.
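The ACS derives proportional, continuous drive commands from bilateral PAM activity. The exact signal chain is not reproduced in the abstract, but a typical myoelectric pipeline rectifies and smooths each sEMG channel into an envelope, then maps co-activation to forward speed and left/right imbalance to turning. A hedged sketch of that generic scheme follows; the window size, gain, and mapping are illustrative, not the paper's.

```python
def emg_envelope(samples, window=5):
    """Full-wave rectify and moving-average one sEMG channel."""
    rect = [abs(s) for s in samples]
    out = []
    for i in range(len(rect)):
        lo = max(0, i - window + 1)  # trailing moving-average window
        out.append(sum(rect[lo:i + 1]) / (i + 1 - lo))
    return out

def differential_drive(left_env, right_env, gain=1.0):
    """Map left/right PAM envelopes to (forward, turn) commands:
    co-activation drives forward, lateralized activation turns."""
    forward = gain * (left_env + right_env) / 2
    turn = gain * (right_env - left_env)
    return forward, turn
```

Under this mapping, activating both PAMs together yields pure forward motion, while a one-sided contraction produces a turn with reduced forward speed, which matches the bilateral/unilateral command vocabulary described above.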
Affiliation(s)
- R Rupp
- Heidelberg University Hospital, Spinal Cord Injury Center, Heidelberg, Germany
- M R Tuga
- Karlsruhe Institute of Technology, Institute for Applied Computer Science/Automation Technology, Karlsruhe, Germany
- A Kogut
- Heidelberg University Hospital, Spinal Cord Injury Center, Heidelberg, Germany
- M Hewitt
- Georg-August-University Göttingen, Department of Clinical Neurophysiology, Göttingen, Germany
- J Meincke
- Georg-August-University Göttingen, Department of Clinical Neurophysiology, Göttingen, Germany
- F Klinker
- Georg-August-University Göttingen, Department of Clinical Neurophysiology, Göttingen, Germany
- W Duttenhoefer
- Georg-August-University Göttingen, Department of Clinical Neurophysiology, Göttingen, Germany
- U Eck
- Heidelberg University Hospital, Spinal Cord Injury Center, Heidelberg, Germany
- R Mikut
- Karlsruhe Institute of Technology, Institute for Applied Computer Science, Karlsruhe, Germany
- M Reischl
- Karlsruhe Institute of Technology, Institute for Applied Computer Science, Karlsruhe, Germany
- D Liebetanz
- Georg-August-University Göttingen, Department of Clinical Neurophysiology, Göttingen, Germany
40
Ostadabbas S, Ghovanloo M, John Butler A. Developing a Tongue Controlled Exoskeleton for a Wrist Tracking Exercise: A Preliminary Study. J Med Device 2015. [DOI: 10.1115/1.4030605]
Affiliation(s)
- Sarah Ostadabbas
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30308
- Maysam Ghovanloo
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30308
- Andrew John Butler
- Department of Physical Therapy, Georgia State University, Atlanta, GA 30308
41
Thorp EB, Abdollahi F, Chen D, Farshchiansadegh A, Lee MH, Pedersen JP, Pierella C, Roth EJ, Seanez Gonzalez I, Mussa-Ivaldi FA. Upper Body-Based Power Wheelchair Control Interface for Individuals With Tetraplegia. IEEE Trans Neural Syst Rehabil Eng 2015; 24:249-60. [PMID: 26054071] [DOI: 10.1109/tnsre.2015.2439240]
Abstract
Many power wheelchair control interfaces are not sufficient for individuals with severely limited upper limb mobility. The majority of controllers that do not rely on coordinated arm and hand movements offer users only a limited vocabulary of commands and often fail to take advantage of the user's residual motion. We developed a body-machine interface (BMI) that leverages the flexibility and customizability of redundant control by using high-dimensional changes in shoulder kinematics to generate proportional control commands for a power wheelchair. In this study, three individuals with cervical spinal cord injuries were able to control a power wheelchair safely and accurately using only small shoulder movements. With the BMI, participants were able to achieve their desired trajectories and, after five driving sessions, reached movement smoothness similar to that achieved with their current joystick. All participants were twice as slow using the BMI but improved with practice. Importantly, users were able to generalize training on computer control to driving a power wheelchair, and employed similar strategies when controlling both devices. Overall, this work suggests that the BMI can be an effective wheelchair control interface for individuals with high-level spinal cord injuries who have limited arm and hand control.
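The abstract does not name the dimensionality-reduction algorithm behind the "high-dimensional changes in shoulder kinematics" mapping; work from this group typically uses principal component analysis (PCA) to project shoulder-sensor signals onto two proportional control axes. A minimal sketch of that idea, with all data and variable names hypothetical:

```python
import numpy as np

# Hypothetical calibration data: rows are samples of 4 shoulder-sensor
# channels recorded while the user explores their range of motion.
rng = np.random.default_rng(0)
calib = rng.standard_normal((500, 4))

# PCA via SVD of the mean-centered calibration data; the top two
# right-singular vectors span the 2D command space.
mean = calib.mean(axis=0)
_, _, vt = np.linalg.svd(calib - mean, full_matrices=False)
proj = vt[:2].T  # 4x2 map: sensor space -> (forward, turn)

def shoulder_to_command(sample: np.ndarray) -> np.ndarray:
    """Map one 4-channel shoulder sample to a 2D proportional command."""
    return (sample - mean) @ proj

cmd = shoulder_to_command(calib[0])
assert cmd.shape == (2,)
```

In practice the projection would be recalibrated per user, which is what makes this style of redundant control customizable to residual mobility.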
42
Laumann A, Holbrook J, Minocha J, Rowles D, Nardone B, West D, Kim J, Bruce J, Roth EJ, Ghovanloo M. Safety and efficacy of medically performed tongue piercing in people with tetraplegia for use with tongue-operated assistive technology. Top Spinal Cord Inj Rehabil 2015; 21:61-76. [PMID: 25762861] [DOI: 10.1310/sci2101-61]
Abstract
BACKGROUND: Individuals with high-level spinal cord injuries need effective ways to perform activities.
OBJECTIVES: To develop and test a medically supervised tongue-piercing protocol and the wearing of a magnet-containing tongue barbell for use with the Tongue Drive System (TDS) in persons with tetraplegia.
METHODS: Volunteers with tetraplegia underwent initial screening sessions using a magnet glued on the tongue to activate and use the TDS. This was followed by tongue piercing, insertion of a standard barbell, a 4-week healing period, and an exchange of the standard barbell for a magnet-containing barbell. This barbell was then used twice weekly for 6 to 8 weeks to perform computer tasks, drive a powered wheelchair, accomplish in-chair weight shifts, and dial a phone. Symptoms of intraoral dysfunction, change in tongue size following piercing, and subjective assessment of receiving and wearing a magnet-containing tongue barbell and its usability with the TDS were evaluated.
RESULTS: Twenty-one volunteers underwent initial trial sessions. Thirteen had their tongues pierced. One individual's barbell dislodged during healing, resulting in closure of the tongue tract. Twelve had the barbell exchanged for a magnet-containing barbell. One subject withdrew for unrelated issues. Eleven completed the TDS testing sessions and were able to complete the assigned tasks. No serious adverse events occurred related to wearing or using a tongue barbell to operate the TDS.
CONCLUSIONS: Using careful selection criteria and a medically supervised piercing protocol, no excess risk was associated with tongue piercing and wearing a tongue barbell in people with tetraplegia. Participants were able to operate the TDS.
Affiliation(s)
- Anne Laumann: Department of Dermatology, Northwestern University Feinberg School of Medicine, Chicago, Illinois
- Jaimee Holbrook: Department of Pediatrics, University of Chicago, Chicago, Illinois
- Julia Minocha: Department of Dermatology, Southern California Permanente Medical Group, San Diego, California
- Diane Rowles: Department of Neurosurgery, Rush University Medical Center, Chicago, Illinois
- Beatrice Nardone: Department of Dermatology, Northwestern University Feinberg School of Medicine, Chicago, Illinois
- Dennis West: Department of Dermatology, Northwestern University Feinberg School of Medicine, Chicago, Illinois
- Jeonghee Kim: GT-Bionics Lab, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia
- Joy Bruce: Hulse Spinal Cord Injury Lab, Shepherd Center, Atlanta, Georgia
- Elliot J Roth: Rehabilitation Institute of Chicago, Illinois; Northwestern University, Chicago, Illinois
- Maysam Ghovanloo: GT-Bionics Lab, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia
43
Kim J, Park H, Bruce J, Rowles D, Holbrook J, Nardone B, West DP, Laumann A, Roth EJ, Ghovanloo M. Assessment of the Tongue-Drive System Using a Computer, a Smartphone, and a Powered-Wheelchair by People With Tetraplegia. IEEE Trans Neural Syst Rehabil Eng 2015; 24:68-78. [PMID: 25730827] [DOI: 10.1109/tnsre.2015.2405072]
Abstract
The Tongue-Drive System (TDS) is a wireless, wearable assistive technology that enables people with severe disabilities to control computers, wheelchairs, and smartphones using voluntary tongue motion. To evaluate its efficacy, several experiments were conducted in which the performance of nine able-bodied (AB) participants using a mouse, a keypad, and the TDS, as well as that of 11 participants with tetraplegia (TP) using the TDS, was observed and compared. Experiments included Fitts' law tapping, wheelchair driving, phone-dialing, and weight-shifting tasks over five to six consecutive sessions. All participants received a tongue piercing, wore a magnetic tongue stud, and completed the trials as evaluable participants. Although AB participants were already familiar with the keypad, their tapping-task throughputs with the keypad were only 1.4 times higher than with the TDS. Completion times for the wheelchair driving task using the TDS were between 157 s and 180 s for AB and TP participants across three different control strategies. Participants with TP completed the phone-dialing and weight-shifting tasks in 81.9 s and 71.5 s, respectively, using tongue motions. Results showed statistically significant improvement, or a trend toward improvement, in performance over the sessions. Most of the learning occurred between the first and second sessions, but the trends suggest that more practice would lead to further improvement with the TDS.
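Throughput in the Fitts' law tapping task above is conventionally computed from the index of difficulty (Shannon formulation) and the movement time. A small illustration; the target geometry and timing below are made-up numbers, not this study's data:

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits/s for one tapping condition."""
    return index_of_difficulty(distance, width) / movement_time

# Illustrative condition: 240 px between targets, 80 px wide, 1.0 s per tap.
tp = throughput(240, 80, 1.0)  # log2(240/80 + 1) = log2(4) = 2.0 bits/s
```

Comparing such throughputs across input devices (keypad vs. TDS) is what yields ratios like the 1.4x figure reported above.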
44
Lobo-Prat J, Kooren PN, Stienen AHA, Herder JL, Koopman BFJM, Veltink PH. Non-invasive control interfaces for intention detection in active movement-assistive devices. J Neuroeng Rehabil 2014; 11:168. [PMID: 25516421] [PMCID: PMC4459663] [DOI: 10.1186/1743-0003-11-168]
Abstract
Active movement-assistive devices aim to increase the quality of life for patients with neuromusculoskeletal disorders. This technology requires interaction between the user and the device through a control interface that detects the user's movement intention. Researchers have explored a wide variety of invasive and non-invasive control interfaces. To summarize this wide spectrum of strategies, this paper presents a comprehensive review focused on non-invasive control interfaces used to operate active movement-assistive devices. A novel systematic classification method is proposed to categorize the control interfaces based on: (I) the source of the physiological signal, (II) the physiological phenomenon responsible for generating the signal, and (III) the sensors used to measure it. The proposed classification method successfully categorizes all existing control interfaces, providing a comprehensive overview of the state of the art. Each sensing modality is briefly described in the body of the paper following the same structure used in the classification method. Furthermore, we discuss several design considerations, challenges, and future directions of non-invasive control interfaces for active movement-assistive devices.
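The three-level classification described above (signal source, physiological phenomenon, sensor) can be pictured as a simple record type; the example entries below are hypothetical illustrations of the scheme, not taken from the review:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ControlInterface:
    """One non-invasive control interface, categorized on the review's
    three axes."""
    signal_source: str  # (I) body part or organ system producing the signal
    phenomenon: str     # (II) physiological origin of the signal
    sensor: str         # (III) transducer used to measure it

# Hypothetical example entries in this scheme:
interfaces = [
    ControlInterface("shoulder", "joint kinematics", "inertial measurement unit"),
    ControlInterface("forearm muscles", "muscle electrical activity", "surface EMG electrodes"),
    ControlInterface("tongue", "position of a magnetic tracer", "magnetometer array"),
]

# The taxonomy then supports queries along any axis, e.g. by sensor type:
emg = [i for i in interfaces if i.sensor.endswith("electrodes")]
```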
Affiliation(s)
- Joan Lobo-Prat: Department of Biomechanical Engineering, University of Twente, Drienerlolaan 5, 7522 NB Enschede, The Netherlands
- Peter N Kooren: Department of Physics and Medical Technology, VU University Medical Center, Van der Boechorststraat 7, 1081 BT Amsterdam, The Netherlands
- Arno H A Stienen: Department of Biomechanical Engineering, University of Twente, Drienerlolaan 5, 7522 NB Enschede, The Netherlands; Department of Physical Therapy and Human Movement Sciences, Northwestern University, 645 N. Michigan Ave., Suite 1100, Chicago, IL 60611, USA
- Just L Herder: Department of Precision and Microsystems Engineering, Delft University of Technology, Mekelweg 2, 2628 CD Delft, The Netherlands; Department of Mechanical Automation and Mechatronics, University of Twente, Drienerlolaan 5, 7500 AE Enschede, The Netherlands
- Bart F J M Koopman: Department of Biomechanical Engineering, University of Twente, Drienerlolaan 5, 7522 NB Enschede, The Netherlands
- Peter H Veltink: Department of Biomedical Signals and Systems, University of Twente, Drienerlolaan 5, 7500 AE Enschede, The Netherlands
45
An arch-shaped intraoral tongue drive system with built-in tongue-computer interfacing SoC. Sensors 2014; 14:21565-87. [PMID: 25405513] [PMCID: PMC4279550] [DOI: 10.3390/s141121565]
Abstract
We present a new arch-shaped intraoral Tongue Drive System (iTDS) designed to occupy the buccal shelf in the user's mouth. The new arch-shaped iTDS, which will be referred to as the iTDS-2, incorporates a system-on-a-chip (SoC) that amplifies and digitizes the raw magnetic sensor data and sends it wirelessly to an external TDS universal interface (TDS-UI) via an inductive coil or a planar inverted-F antenna. A built-in transmitter (Tx) employs a dual-band radio that operates at either 27 MHz or 432 MHz band, according to the wireless link quality. A built-in super-regenerative receiver (SR-Rx) monitors the wireless link quality and switches the band if the link quality is below a predetermined threshold. An accompanying ultra-low power FPGA generates data packets for the Tx and handles digital control functions. The custom-designed TDS-UI receives raw magnetic sensor data from the iTDS-2, recognizes the intended user commands by the sensor signal processing (SSP) algorithm running in a smartphone, and delivers the classified commands to the target devices, such as a personal computer or a powered wheelchair. We evaluated the iTDS-2 prototype using center-out and maze navigation tasks on two human subjects, which proved its functionality. The subjects' performance with the iTDS-2 was improved by 22% over its predecessor, reported in our earlier publication.
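The dual-band fallback described above (the SR-Rx monitors link quality and switches between the 27 MHz and 432 MHz bands when quality drops below a predetermined threshold) can be sketched as a simple state update. The threshold value and the normalized quality metric below are assumptions for illustration; the abstract gives neither:

```python
# Sketch of threshold-based band switching, as described for the iTDS-2:
# stay on the current band while the link is acceptable, otherwise
# toggle to the other band.

BANDS_MHZ = (27, 432)
QUALITY_THRESHOLD = 0.5  # hypothetical normalized link-quality floor

def next_band(current_mhz: int, link_quality: float) -> int:
    """Return the band to transmit on for the next packet."""
    if link_quality >= QUALITY_THRESHOLD:
        return current_mhz
    return BANDS_MHZ[1] if current_mhz == BANDS_MHZ[0] else BANDS_MHZ[0]

assert next_band(27, 0.9) == 27    # good link: keep band
assert next_band(27, 0.2) == 432   # poor link: switch up
assert next_band(432, 0.1) == 27   # poor link: switch back
```

A real implementation would add hysteresis or a dwell time so that quality hovering near the threshold does not cause rapid band flapping.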
46
Body art meets neuroprosthetics. Nat Rev Neurol 2013. [DOI: 10.1038/nrneurol.2013.261]
|