1
Rohr M, Haidamous J, Schafer N, Schaumann S, Latsch B, Kupnik M, Antink CH. On the Benefit of FMG and EMG Sensor Fusion for Gesture Recognition Using Cross-Subject Validation. IEEE Trans Neural Syst Rehabil Eng 2025;33:935-944. PMID: 40031586. DOI: 10.1109/TNSRE.2025.3543649.
Abstract
Hand gestures are a natural form of human communication, making gesture recognition a sensible approach for intuitive human-computer interaction. Wearable sensors on the forearm can detect the muscle contractions that generate these gestures, but classification approaches relying on a single measured modality lack accuracy and robustness. In this work, we analyze sensor fusion of force myography (FMG) and electromyography (EMG) for gesture recognition. We employ piezoelectric FMG sensors based on ferroelectrets and a commercial EMG system in a user study with 13 participants to measure 66 distinct hand movements with 10 ms labelling precision. Three classification tasks, namely flexion and extension, single finger, and all finger movement classification, are performed using common handcrafted features as input to machine learning classifiers. The evaluation then covers the effectiveness of the sensor fusion using correlation analysis, classification performance based on leave-one-subject-out cross-validation and 5x2cv t-tests, and the effects of involuntary movements on classification. We find that sensor fusion leads to significant improvement (42% higher average recognition accuracy) on all three tasks and that the two sensor modalities contain complementary information. Furthermore, we confirm this finding using reduced FMG and EMG sensor sets. This study reinforces prior research on the effectiveness of sensor fusion through meticulous statistical analysis, thereby paving the way for multi-sensor gesture recognition in assistance systems.
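The cross-subject protocol described in this abstract can be illustrated with a short sketch. The code below is not the authors' implementation; it is a minimal, self-contained illustration of leave-one-subject-out splitting and feature-level fusion (concatenating per-sample FMG and EMG feature vectors), with all names and toy data hypothetical.

```python
def loso_splits(subject_ids):
    """Leave-one-subject-out CV: for each subject, yield the indices of
    the training samples (all other subjects) and the test samples
    (the held-out subject)."""
    for held_out in sorted(set(subject_ids)):
        train = [i for i, s in enumerate(subject_ids) if s != held_out]
        test = [i for i, s in enumerate(subject_ids) if s == held_out]
        yield held_out, train, test


def fuse_features(fmg_features, emg_features):
    """Feature-level fusion: concatenate the handcrafted FMG and EMG
    feature vectors of each sample into one longer vector."""
    return [list(f) + list(e) for f, e in zip(fmg_features, emg_features)]


# Toy data: 6 samples from 3 subjects, 2 features per modality.
subjects = ["s1", "s1", "s2", "s2", "s3", "s3"]
fmg = [[0.1, 0.2]] * 6
emg = [[0.3, 0.4]] * 6
fused = fuse_features(fmg, emg)
splits = list(loso_splits(subjects))
```

Each split holds out every sample of one subject, so a classifier is always tested on a person it never saw during training, which is what makes the reported accuracies cross-subject figures.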
2
Wang S, Yi S, Zhao B, Li Y, Li S, Tao G, Mao X, Sun W. Sowing Depth Monitoring System for High-Speed Precision Planters Based on Multi-Sensor Data Fusion. Sensors (Basel) 2024;24:6331. PMID: 39409371. PMCID: PMC11478517. DOI: 10.3390/s24196331.
Abstract
During high-speed (12~16 km/h) operation, high-speed precision planters are subject to mechanical vibration caused by terrain undulation and to sensor measurement errors, both of which reduce the accuracy of the sowing depth monitoring system. This study therefore investigates multi-sensor data fusion for the sowing depth monitoring systems of high-speed precision planters. First, a sowing depth monitoring model comprising laser, ultrasonic, and angle sensors as the multi-sensor monitoring unit is established. Second, the readings of the three individual sensors are filtered using Kalman filters. Finally, a multi-sensor data fusion algorithm that optimises four key parameters of the extended Kalman filter (EKF) using an improved sparrow search algorithm (ISSA) is proposed, and the filtered data from the three sensors are integrated to address mechanical vibration interference and sensor measurement errors. To ascertain the superiority of the ISSA-EKF, both the ISSA-EKF and the SSA-EKF are simulated, and their outputs are compared with the original sensor readings and the filtered sowing depth values. The simulation test demonstrates that the ISSA-EKF-based sowing depth monitoring algorithm achieves high-precision monitoring, with a mean absolute error (MAE) of 0.083 cm, a root mean square error (RMSE) of 0.103 cm, and a correlation coefficient (R) of 0.979, a significant improvement in accuracy over the original sensor readings, the filtered values, and the SSA-EKF. Field-test results demonstrate that the ISSA-EKF-based system enhances the precision and reliability of monitoring compared with the three single-sensor readings: the average MAE and RMSE are reduced by 0.071 cm and 0.075 cm, respectively, while the average R is improved by 0.036. This study offers a theoretical foundation for the advancement of sowing depth monitoring systems for high-speed precision planters.
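The filter-then-fuse pipeline in this abstract can be sketched in miniature. The snippet below is not the paper's ISSA-EKF; it is a plain one-dimensional Kalman filter applied to a single sensor stream, followed by inverse-variance weighting of filtered estimates, to illustrate the general idea. All parameter values are hypothetical.

```python
class Kalman1D:
    """Scalar Kalman filter with a constant-state model, as one might
    apply to one depth sensor's noisy readings."""

    def __init__(self, x0, p0, q, r):
        self.x = x0  # state estimate (sowing depth, cm)
        self.p = p0  # estimate variance
        self.q = q   # process noise variance
        self.r = r   # measurement noise variance

    def update(self, z):
        self.p += self.q                # predict: uncertainty grows
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)      # correct with measurement z
        self.p *= 1.0 - k
        return self.x


def inverse_variance_fusion(estimates, variances):
    """Fuse several filtered estimates, weighting each by 1/variance."""
    weights = [1.0 / v for v in variances]
    return sum(w * x for w, x in zip(weights, estimates)) / sum(weights)


# Toy example: a true depth of 5.0 cm observed by one noisy sensor.
kf = Kalman1D(x0=0.0, p0=1.0, q=0.001, r=0.25)
readings = [5.3, 4.8, 5.1, 4.9, 5.2, 5.0, 4.95, 5.05]
for z in readings:
    estimate = kf.update(z)
```

In the paper's setting, three such filtered streams (laser, ultrasonic, angle) would be combined by the EKF, whose noise parameters the ISSA tunes; the inverse-variance combiner above stands in for that fusion step only conceptually.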
Affiliation(s)
- Song Wang
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Shujuan Yi
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Bin Zhao
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Provincial Key Laboratory of Intelligent Agricultural Machinery Equipment, Daqing 163319, China
- Yifei Li
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- College of Engineering, Northeast Agricultural University, Harbin 150030, China
- Shuaifei Li
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Guixiang Tao
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Xin Mao
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Wensheng Sun
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
3
Andreas D, Hou Z, Tabak MO, Dwivedi A, Beckerle P. A Multimodal Bracelet to Acquire Muscular Activity and Gyroscopic Data to Study Sensor Fusion for Intent Detection. Sensors (Basel) 2024;24:6214. PMID: 39409254. PMCID: PMC11478661. DOI: 10.3390/s24196214.
Abstract
Researchers have attempted to control robotic hands and prostheses through biosignals but have not yet matched the capability of the human hand. Surface electromyography, which records electrical muscle activity using non-invasive electrodes, has been the primary method in most studies. While surface electromyography-based hand motion decoding shows promise, it has not yet met the requirements for reliable use. Combining different sensing modalities has been shown to improve hand gesture classification accuracy. This work introduces a multimodal bracelet that integrates a 24-channel force myography system with six commercial surface electromyography sensors, each containing a six-axis inertial measurement unit. The device's functionality was tested by acquiring muscular activity from five participants performing five different gestures in random order. A random forest model was then used to classify the performed gestures from the acquired signals. The results confirmed the device's functionality, making it suitable for studying sensor fusion for intent detection in future work. Combining all modalities yielded the highest classification accuracies across all participants, reaching 92.3±2.6% on average and reducing misclassifications by 37% and 22% compared to using surface electromyography and force myography alone, respectively. This demonstrates the potential benefit of sensor fusion for more robust and accurate hand gesture classification and paves the way for advanced control of robotic and prosthetic hands.
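The reported error reductions can be checked with a line of arithmetic. Assuming "reducing misclassifications by 37%" means a relative reduction in error rate (our reading of the numbers, not a calculation taken from the paper), the single-modality error rates implied by the fused accuracy of 92.3% can be backed out as follows:

```python
def implied_single_error(fused_error, relative_reduction):
    """If fused_error = single_error * (1 - relative_reduction),
    recover the single-modality error rate (in percent)."""
    return fused_error / (1.0 - relative_reduction)


fused_error = 100.0 - 92.3                            # 7.7% error with fusion
semg_error = implied_single_error(fused_error, 0.37)  # sEMG alone: ~12.2%
fmg_error = implied_single_error(fused_error, 0.22)   # FMG alone: ~9.9%
```

Under this reading, sEMG alone would misclassify roughly one gesture in eight and FMG alone roughly one in ten, consistent with fusion buying a few percentage points of accuracy.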
Affiliation(s)
- Daniel Andreas
- Chair of Autonomous Systems and Mechatronics, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91054 Erlangen, Germany
- Zhongshi Hou
- Chair of Autonomous Systems and Mechatronics, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91054 Erlangen, Germany
- Mohamad Obada Tabak
- Chair of Autonomous Systems and Mechatronics, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91054 Erlangen, Germany
- Anany Dwivedi
- Artificial Intelligence (AI) Institute, Division of Health, Engineering, Computing and Science, University of Waikato, Hamilton 3216, New Zealand
- Philipp Beckerle
- Chair of Autonomous Systems and Mechatronics, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91054 Erlangen, Germany
- Department of Artificial Intelligence in Biomedical Engineering, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91054 Erlangen, Germany
4
Chen YN, Wu YN, Yang BS. The neuromuscular control for lower limb exoskeleton - a 50-year perspective. J Biomech 2023;158:111738. PMID: 37562276. DOI: 10.1016/j.jbiomech.2023.111738.
Abstract
Historically, impaired lower limb function has imposed a heavy health burden and large economic losses on society. Although experts from various fields have put great effort into overcoming this challenge, there is still no standard treatment that can completely restore lost limb function. Over the past half century, with advancing understanding of human biomechanics and engineering technologies, exoskeletons have achieved a degree of success in assisting and rehabilitating patients with loss of limb function and have therefore been spotlighted in both the medical and engineering fields. In this article, we review the development milestones of lower limb exoskeletons as well as the neuromuscular interactions between device and wearer throughout the past 50 years. Fifty years ago, lower-limb exoskeletons were only beginning to be devised; we review several prototypes and present their designs in terms of structure, sensing, and control systems. Subsequently, we introduce the development milestones of modern lower limb exoskeletons and discuss the pros and cons of these differentiated devices. In addition, we summarize currently important neuromuscular control systems and sensors, and discuss current evidence on how exoskeletons may affect the neuromuscular control of wearers. In conclusion, based on our review, we point out a possible future direction: combining multiple current technologies to build lower limb exoskeletons that can serve multiple aims.
Affiliation(s)
- Yu-Ning Chen
- Department of Mechanical Engineering, National Yang Ming Chiao Tung University, Taiwan; Biomechanics and Medical Application Laboratory, National Yang Ming Chiao Tung University, Taiwan; Division of Neurosurgery, Department of Surgery, National Taiwan University Hospital Hsin-Chu Branch, Taiwan
- Yi-Ning Wu
- Department of Physical Therapy and Kinesiology, University of Massachusetts Lowell, MA, USA; The New England Robotics Validation and Experimentation Center, University of Massachusetts Lowell, MA, USA
- Bing-Shiang Yang
- Department of Mechanical Engineering, National Yang Ming Chiao Tung University, Taiwan; Biomechanics and Medical Application Laboratory, National Yang Ming Chiao Tung University, Taiwan; Mechanical and Mechatronics Systems Research Laboratories, Industrial Technology Research Institute, Taiwan; Taiwanese Society of Biomechanics, Taiwan
5
Godoy RV, Guan B, Dwivedi A, Shahmohammadi M, Owen M, Liarokapis M. Multi-Grasp Classification for the Control of Robot Hands Employing Transformers and Lightmyography Signals. Annu Int Conf IEEE Eng Med Biol Soc 2023;2023:1-6. PMID: 38082669. DOI: 10.1109/EMBC40787.2023.10340274.
Abstract
The increasing use of smart technical devices in everyday life has necessitated muscle-machine interfaces (MuMIs) that are intuitive and facilitate immersive interaction with these devices. The most common method of developing MuMIs uses electromyography (EMG) signals. However, due to several drawbacks of EMG-based interfaces, alternative methods are being explored. In previous work, we presented a new MuMI called Lightmyography (LMG), which achieved outstanding results compared to a classic EMG-based interface in a five-gesture classification task. In this study, we extend that work by experimentally validating the efficiency of the LMG armband in classifying thirty-two different gestures from six participants using a deep learning technique called Temporal Multi-Channel Vision Transformers (TMC-ViT). The efficiency of the proposed model was assessed using classification accuracy, and two different undersampling techniques were compared. The proposed thirty-two-gesture classifiers achieve accuracies as high as 92%. Finally, we employ the LMG interface in the real-time control of a robotic hand using ten different gestures, successfully reproducing several grasp types from grasp taxonomies presented in the literature.
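The abstract does not name the two undersampling techniques that were compared, but the idea can be illustrated with two common options: plain decimation (keeping every n-th sample) and block averaging (replacing each block of n samples with its mean, which also low-pass filters the signal). The sketch below is illustrative only and unrelated to the paper's implementation.

```python
def decimate(signal, factor):
    """Undersample by keeping every factor-th sample."""
    return signal[::factor]


def block_average(signal, factor):
    """Undersample by averaging non-overlapping blocks of length factor;
    a trailing partial block is discarded."""
    n = len(signal) - len(signal) % factor
    return [sum(signal[i:i + factor]) / factor for i in range(0, n, factor)]


window = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
down_a = decimate(window, 2)       # [0.0, 4.0, 8.0]
down_b = block_average(window, 2)  # [1.0, 5.0, 9.0]
```

Both halve the sequence length fed to the classifier; decimation is cheaper, while block averaging suppresses high-frequency noise before the rate reduction.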
6
Vera-Ortega P, Vázquez-Martín R, Fernandez-Lozano JJ, García-Cerezo A, Mandow A. Enabling Remote Responder Bio-Signal Monitoring in a Cooperative Human-Robot Architecture for Search and Rescue. Sensors (Basel) 2022;23:49. PMID: 36616647. PMCID: PMC9823914. DOI: 10.3390/s23010049.
Abstract
The roles of emergency responders are challenging and often physically demanding, so it is essential that their duties are performed safely and effectively. In this article, we address real-time bio-signal monitoring for responders in disaster scenarios. In particular, we propose the integration of a set of health monitoring sensors suitable for detecting stress, anxiety, and physical fatigue into an Internet of Cooperative Agents architecture for search and rescue (SAR) missions (SAR-IoCA), which allows remote control and communication between human and robotic agents and the mission control center. To this end, we performed proof-of-concept experiments with a bio-signal sensor suite worn by firefighters in two high-fidelity SAR exercises. Moreover, we conducted a survey, distributed to end-users through the Fire Brigade consortium of the Provincial Council of Málaga, to analyze firefighters' opinions about biological signal monitoring while on duty. Based on this methodology, we propose a wearable sensor suite design that provides easy-to-wear integrated-sensor garments suitable for emergency worker activity. The article discusses user acceptance, performance results, and lessons learned.