1. Zhao H, Qiu Z, Peng D, Wang F, Wang Z, Qiu S, Shi X, Chu Q. Prediction of Joint Angles Based on Human Lower Limb Surface Electromyography. Sensors (Basel). 2023;23:5404. PMID: 37420573. DOI: 10.3390/s23125404.
Abstract
Wearable exoskeletons can help people with mobility impairments by improving their rehabilitation. Because electromyography (EMG) signals precede movement, they can serve as input signals that let an exoskeleton predict the body's movement intention. In this paper, the OpenSim software is used to determine the muscle sites to be measured, i.e., the rectus femoris, vastus lateralis, semitendinosus, biceps femoris, lateral gastrocnemius, and tibialis anterior. Surface electromyography (sEMG) signals and inertial data are collected from the lower limbs while the subject walks, goes upstairs, and walks uphill. The sEMG noise is reduced by a wavelet-threshold-based complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN) algorithm, and time-domain features are extracted from the denoised sEMG signals. Knee and hip angles during motion are calculated from quaternions through coordinate transformations. A random forest (RF) regression model optimized by cuckoo search (CS), abbreviated CS-RF, is used to predict lower limb joint angles from sEMG signals. Finally, root mean square error (RMSE), mean absolute error (MAE), and the coefficient of determination (R2) are used as evaluation metrics to compare the prediction performance of RF, a support vector machine (SVM), a backpropagation (BP) neural network, and CS-RF. CS-RF outperforms the other algorithms in all three motion scenarios, with best metric values of 1.9167, 1.3893, and 0.9815, respectively.
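The comparison in this abstract rests on three standard regression metrics. As a minimal NumPy sketch (not the authors' code; the CS-RF model itself is out of scope here), RMSE, MAE, and R2 can be computed as:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean(np.abs(y_true - y_pred)))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)
```

A perfect predictor gives RMSE = MAE = 0 and R2 = 1; a predictor no better than the mean of the targets gives R2 = 0.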
Affiliation(s)
- Hongyu Zhao
- Key Laboratory of Intelligent Control and Optimization for Industrial Equipment of Ministry of Education, Dalian University of Technology, Dalian 116024, China
- School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China
- Zhibo Qiu
- Key Laboratory of Intelligent Control and Optimization for Industrial Equipment of Ministry of Education, Dalian University of Technology, Dalian 116024, China
- School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China
- Daoyong Peng
- Neurology Department, Dalian Municipal Central Hospital, Dalian 116024, China
- Fang Wang
- Neurology Department, Dalian Municipal Central Hospital, Dalian 116024, China
- Zhelong Wang
- Key Laboratory of Intelligent Control and Optimization for Industrial Equipment of Ministry of Education, Dalian University of Technology, Dalian 116024, China
- School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China
- Sen Qiu
- Key Laboratory of Intelligent Control and Optimization for Industrial Equipment of Ministry of Education, Dalian University of Technology, Dalian 116024, China
- School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China
- Xin Shi
- Key Laboratory of Intelligent Control and Optimization for Industrial Equipment of Ministry of Education, Dalian University of Technology, Dalian 116024, China
- School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China
- Qinghao Chu
- Key Laboratory of Intelligent Control and Optimization for Industrial Equipment of Ministry of Education, Dalian University of Technology, Dalian 116024, China
- School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China
2. Zhang P, Wu P, Wang W. Research on Lower Limb Step Speed Recognition Method Based on Electromyography. Micromachines (Basel). 2023;14:546. PMID: 36984953. PMCID: PMC10058516. DOI: 10.3390/mi14030546.
Abstract
Wearable exoskeletons play an important role in people's lives, for example by helping stroke and amputation patients carry out rehabilitation training. Enabling the exoskeleton to accurately judge human motion intention is a basic requirement for it to complete the corresponding tasks. Traditional exoskeleton control signals, such as pressure values, joint angles, and acceleration values, reflect only the current motion of the human lower limbs and cannot be used to predict motion. The electromyography (EMG) signal, by contrast, always precedes a movement, so it can serve as an input signal for predicting the target's gait speed and movement. In this study, the generalization ability of a backpropagation (BP) neural network and the temporal modeling capability of a hidden Markov model are fused into a single discriminant model. Experiments show that, on the same training samples, the recognition accuracy of a three-layer BP neural network alone is only 91%, while the fusion discriminant model proposed in this paper reaches 95.1%. The results show that the fusion of a BP neural network and a hidden Markov model is well suited to the task of recognizing target step speed for wearable exoskeletons.
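The kind of fusion this abstract describes, a neural network emitting per-frame class probabilities that a hidden Markov model then smooths over time, can be sketched with the HMM forward algorithm. The transition matrix, prior, and three-class setup below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

# Illustrative 3-class setup (e.g. slow / normal / fast step speed); sticky
# self-transitions encode that step speed rarely changes between frames.
TRANS = np.array([[0.90, 0.05, 0.05],
                  [0.05, 0.90, 0.05],
                  [0.05, 0.05, 0.90]])
PRIOR = np.full(3, 1.0 / 3.0)

def hmm_fuse(frame_probs, trans=TRANS, prior=PRIOR):
    """Filter per-frame classifier posteriors with the HMM forward algorithm.

    frame_probs: (T, K) array of class probabilities from the neural network.
    Returns the most likely class index at each frame, given all frames so far.
    """
    alpha = prior * frame_probs[0]
    alpha /= alpha.sum()
    path = [int(alpha.argmax())]
    for obs in frame_probs[1:]:
        alpha = (alpha @ trans) * obs  # predict via transitions, weight by evidence
        alpha /= alpha.sum()
        path.append(int(alpha.argmax()))
    return path
```

With sticky self-transitions, an isolated frame where the network briefly prefers another class is smoothed away, which is the practical benefit of fusing the two models.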
Affiliation(s)
- Peng Zhang
- Engineering Training Centre, Northwestern Polytechnical University, Xi’an 710000, China
- Pengcheng Wu
- College of Automation, Northwestern Polytechnical University, Xi’an 710000, China
- Wendong Wang
- College of Mechanical and Electrical Engineering, Northwestern Polytechnical University, Xi’an 710000, China
3. Xu J, Pan J, Cui T, Zhang S, Yang Y, Ren TL. Recent Progress of Tactile and Force Sensors for Human-Machine Interaction. Sensors (Basel). 2023;23:1868. PMID: 36850470. PMCID: PMC9961639. DOI: 10.3390/s23041868.
Abstract
The Human-Machine Interface (HMI) plays a key role in the interaction between people and machines, allowing people to control a machine easily and intuitively and to immersively experience the virtual world of the metaverse through virtual reality/augmented reality (VR/AR) technology. Currently, wearable skin-integrated tactile and force sensors are widely used in immersive human-machine interaction owing to their ultra-thin, ultra-soft, conformal characteristics. This paper reviews recent progress in tactile and force sensors used in HMIs, including piezoresistive, capacitive, piezoelectric, triboelectric, and other sensors; discusses how to improve the performance of such sensors for HMI; summarizes HMIs for dexterous robotic manipulation and VR/AR applications; and finally outlines future development trends of HMI.
Affiliation(s)
- Jiandong Xu
- School of Integrated Circuits and Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing 100084, China
- Jiong Pan
- School of Integrated Circuits and Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing 100084, China
- Tianrui Cui
- School of Integrated Circuits and Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing 100084, China
- Sheng Zhang
- Shenzhen International Graduate School, Tsinghua University, Shenzhen 518055, China
- Yi Yang
- School of Integrated Circuits and Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing 100084, China
- Tian-Ling Ren
- School of Integrated Circuits and Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing 100084, China
- Center for Flexible Electronics Technology, Tsinghua University, Beijing 100084, China
4. High-accuracy wearable detection of freezing of gait in Parkinson's disease based on pseudo-multimodal features. Comput Biol Med. 2022;146:105629. DOI: 10.1016/j.compbiomed.2022.105629.
5. A Human-Machine Interface Based on an EOG and a Gyroscope for Humanoid Robot Control and Its Application to Home Services. J Healthc Eng. 2022;2022:1650387. PMID: 35345662. PMCID: PMC8957419. DOI: 10.1155/2022/1650387.
Abstract
The human-machine interface (HMI) has been studied for robot teleoperation with the aim of enabling people with motor disabilities to interact more with the physical environment. The challenge for an HMI used in robot control is to produce control commands rapidly, accurately, and in sufficient number. In this paper, an asynchronous HMI based on an electrooculogram (EOG) and a gyroscope is proposed, using two self-paced, endogenous features: double blinks and head rotation. Through a multilevel graphical user interface (GUI), the user rotates the head to move the GUI cursor and double-blinks to trigger a button in the interface. The proposed HMI supplies sufficient commands while maintaining high accuracy (ACC) and low response time (RT). In a trigger task with sixteen healthy subjects, the target was clicked from among 20 options with an ACC of 99.2% and an RT of 2.34 s. Furthermore, a continuous control strategy that uses motion-start and motion-stop commands to produce a given robot motion is proposed for controlling a humanoid robot through the HMI; it avoids having to combine several commands to achieve one motion or to map each motion directly to a single command. In a home-service experiment, all subjects operated a humanoid robot to change the state of a switch, grasp a key, and put it into a box. The time ratio between HMI control and manual control was 1.22, and the ratio of the number of commands was 1.18. The results demonstrate that the continuous strategy and the proposed HMI improve performance in humanoid robot control.
6. Esposito D, Centracchio J, Andreozzi E, Gargiulo GD, Naik GR, Bifulco P. Biosignal-Based Human-Machine Interfaces for Assistance and Rehabilitation: A Survey. Sensors (Basel). 2021;21:6863. PMID: 34696076. PMCID: PMC8540117. DOI: 10.3390/s21206863.
Abstract
By definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs that take such biosignals as inputs to control various applications. This survey reviews the last two decades of literature on biosignal-based HMIs for assistance and rehabilitation, outlining the state of the art and identifying emerging technologies and potential future research trends. PubMed and other databases were searched using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and 144 journal papers and 37 conference papers were ultimately included. Four macrocategories were used to classify the biosignals employed for HMI control: biopotentials, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified by target application into six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart-environment control. An ever-growing number of publications has been observed over the last years. Most studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition over the last decade, whereas studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance; however, they also increase HMI complexity, so their usefulness should be carefully evaluated for each specific application.
Affiliation(s)
- Daniele Esposito
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Jessica Centracchio
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Emilio Andreozzi
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Gaetano D. Gargiulo
- School of Engineering, Design and Built Environment, Western Sydney University, Penrith, NSW 2747, Australia
- The MARCS Institute, Western Sydney University, Penrith, NSW 2751, Australia
- Ganesh R. Naik
- School of Engineering, Design and Built Environment, Western Sydney University, Penrith, NSW 2747, Australia
- The Adelaide Institute for Sleep Health, Flinders University, Bedford Park, SA 5042, Australia
- Paolo Bifulco
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
7. Pérez-Reynoso FD, Rodríguez-Guerrero L, Salgado-Ramírez JC, Ortega-Palacios R. Human-Machine Interface: Multiclass Classification by Machine Learning on 1D EOG Signals for the Control of an Omnidirectional Robot. Sensors (Basel). 2021;21:5882. PMID: 34502773. PMCID: PMC8434373. DOI: 10.3390/s21175882.
Abstract
People with severe disabilities require assistance to perform their routine activities; a Human-Machine Interface (HMI) allows them to activate devices that respond to their needs. In this work, an HMI based on electrooculography (EOG) is presented. The instrumentation is mounted on portable glasses that acquire both horizontal and vertical EOG signals. Each eye movement is registered as a class and categorized using the one-hot encoding technique, and the precision and sensitivity of different machine learning classification algorithms are tested on their ability to identify new data from the eye registration. The algorithm discriminates blinks so that they do not disturb the acquisition of eyeball-position commands. The classifier is validated by controlling a three-wheeled omnidirectional robot. This work proposes real-time signal classification and customization of the interface, minimizing the user's learning curve. Preliminary results show that it is possible to generate trajectories for controlling an omnidirectional robot, toward a future assistance system that controls position through gaze orientation.
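A minimal sketch of the one-hot encoding step this abstract mentions; the gesture class names below are hypothetical placeholders, not the paper's actual label set:

```python
import numpy as np

# Hypothetical EOG gesture classes; the paper's exact labels may differ.
CLASSES = ["up", "down", "left", "right", "blink"]

def one_hot(labels, classes=CLASSES):
    """Map a sequence of class labels to one-hot row vectors."""
    index = {c: i for i, c in enumerate(classes)}
    out = np.zeros((len(labels), len(classes)), dtype=float)
    for row, label in enumerate(labels):
        out[row, index[label]] = 1.0
    return out

Y = one_hot(["left", "blink"])
```

Each row has exactly one 1.0 in the column of its class, which is the target format most multiclass classifiers train against.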
Affiliation(s)
- Liliam Rodríguez-Guerrero
- Research Center on Technology of Information and Systems (CITIS), Electric and Control Academic Group, Universidad Autónoma del Estado de Hidalgo (UAEH), Pachuca de Soto 42039, Mexico
- Rocío Ortega-Palacios
- Biomedical Engineering, Universidad Politécnica de Pachuca (UPP), Zempoala 43830, Mexico
8. Ha J, Park S, Im CH, Kim L. A Hybrid Brain-Computer Interface for Real-Life Meal-Assist Robot Control. Sensors (Basel). 2021;21:4578. PMID: 34283122. PMCID: PMC8271393. DOI: 10.3390/s21134578.
Abstract
Assistive devices such as meal-assist robots aid individuals with disabilities and support the elderly in performing daily activities. However, existing meal-assist robots are inconvenient to operate because of non-intuitive user interfaces that demand additional time and effort. We therefore developed a hybrid brain-computer interface-based meal-assist robot system built on three features that can be measured using scalp electrodes for electroencephalography. A single meal cycle comprises three procedures. (1) Triple eye-blinks (EBs) from the prefrontal channel initiate the cycle. (2) Steady-state visual evoked potentials (SSVEPs) from occipital channels select the food according to the user's intention. (3) Electromyograms (EMGs) recorded from temporal channels while the user chews mark the end of the cycle and indicate readiness for the next meal. In experiments on five subjects, accuracy (EBs/SSVEPs/EMGs) was 94.67%/83.33%/97.33%, the false positive rate (EBs/EMGs) was 0.11/0.08 times/min, and the information transfer rate (SSVEPs) was 20.41 bits/min. These results demonstrate the feasibility of the assistive system. The proposed system allows users to eat on their own more naturally, and it can increase the self-esteem of disabled and elderly people and enhance their quality of life.
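The information transfer rate reported for the SSVEP stage is conventionally computed with the Wolpaw formula; the sketch below shows that computation (the target count and selection time in the examples are placeholders, not values from this paper):

```python
import math

def itr_bits_per_min(n_targets, accuracy, seconds_per_selection):
    """Wolpaw information transfer rate for an N-class selection task.

    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)),
    scaled to bits/min by the time each selection takes.
    """
    n, p = n_targets, accuracy
    bits = math.log2(n)
    if p > 0:  # guard log2(0) at chance-free extremes
        bits += p * math.log2(p)
    if p < 1:
        bits += (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / seconds_per_selection
```

At 100% accuracy the rate reduces to log2(N) bits per selection, and at chance level for a binary task it drops to zero, which matches the intuition that random selections convey no information.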
Affiliation(s)
- Jihyeon Ha
- Center for Bionics, Korea Institute of Science and Technology, Seoul 02792, Korea
- Department of Biomedical Engineering, Hanyang University, Seoul 04763, Korea
- Sangin Park
- Center for Bionics, Korea Institute of Science and Technology, Seoul 02792, Korea
- Chang-Hwan Im
- Department of Biomedical Engineering, Hanyang University, Seoul 04763, Korea
- Laehyun Kim
- Center for Bionics, Korea Institute of Science and Technology, Seoul 02792, Korea
- Department of HY-KIST Bio-Convergence, Hanyang University, Seoul 04763, Korea
- Correspondence: Tel.: +82-2-958-6726