1
Seifaddini P, Sheikhahmadi S, Kolahdouz M, Aghababa H. Smart Printed Triboelectric Wearable Sensor with High Performance for Glove-Based Motion Detection. ACS Appl Mater Interfaces 2024; 16:9506-9516. [PMID: 38346320 DOI: 10.1021/acsami.3c17419]
Abstract
In a world increasingly driven by data, wearable triboelectric nanogenerators (TENGs) offer a convenient way to monitor and collect information about human body motions. To meet the demands of large-scale production of wearable TENGs, selecting materials that realize high conversion efficiency while keeping fabrication simple remains a challenge. To address these issues, we present a simple-structured wearable printed arc-shaped triboelectric sensor (PATS) for finger motion detection that leverages inkjet printing technology. Pressure sensors composed of diverse materials, based on dielectric-dielectric and metal-dielectric structures operating in contact-separation mode, were fabricated and compared. Thanks to the unique characteristics of the silver nanoparticle (Ag-NP)-printed layer and silicone rubber (SR), the SR-Ag PATS shows a high peak-to-peak voltage of 14.15 V and a short-circuit current of 0.78 μA. The proposed sensor, which can accurately identify finger motions at various bending angles, suggests promising application potential in glove-based human-machine interface (HMI) systems.
Affiliation(s)
- Parinaz Seifaddini
- School of Electrical and Computer Engineering, College of Engineering, University of Tehran, 1417614411 Tehran, Iran
- Sina Sheikhahmadi
- School of Electrical and Computer Engineering, College of Engineering, University of Tehran, 1417614411 Tehran, Iran
- Mohammadreza Kolahdouz
- School of Electrical and Computer Engineering, College of Engineering, University of Tehran, 1417614411 Tehran, Iran
- Hossein Aghababa
- Department of Engineering, Loyola University Maryland, Baltimore, Maryland 21210, United States

2
Lu C, Jin L, Liu Y, Wang J, Li W. Teleoperated Grasping Using Data Gloves Based on Fuzzy Logic Controller. Biomimetics (Basel) 2024; 9:116. [PMID: 38392162 PMCID: PMC10886496 DOI: 10.3390/biomimetics9020116]
Abstract
Teleoperated robots have attracted significant interest in recent years, and data gloves are one of the commonly used devices for their operation. However, existing solutions still face two challenges: how data gloves can capture human operational intentions, and how those intentions can be mapped accurately to robot motion. To address these challenges, we propose a novel teleoperation method using data gloves based on a fuzzy logic controller. First, data from the flex sensors on the data gloves are collected and normalized to identify human manipulation intentions. Then, a fuzzy logic controller is designed to convert finger flexion information into motion control commands for robot arms. Finally, experiments are conducted to demonstrate the effectiveness and precision of the proposed method.
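The pipeline the abstract describes (normalize a flex-sensor reading, fuzzify it, apply a rule base, defuzzify into a motion command) can be sketched roughly as follows. The membership functions, rule table, and sensor value ranges here are illustrative assumptions, not the authors' actual controller design.

```python
# Sketch of a Mamdani-style fuzzy controller mapping a normalized
# flex-sensor reading (0 = straight finger, 1 = fully bent) to a
# gripper closing speed. All shapes and rules below are assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def normalize(raw, raw_min, raw_max):
    """Scale a raw sensor value into [0, 1], clamped."""
    return min(1.0, max(0.0, (raw - raw_min) / (raw_max - raw_min)))

def fuzzy_grip_speed(flex):
    # Fuzzify: degrees of membership in three linguistic sets.
    low    = tri(flex, -0.5, 0.0, 0.5)
    medium = tri(flex,  0.0, 0.5, 1.0)
    high   = tri(flex,  0.5, 1.0, 1.5)
    # Rule base: low bend -> slow (0.1), medium -> moderate (0.5),
    # high -> fast (0.9). Defuzzify as a weighted average of singletons.
    weights = [(low, 0.1), (medium, 0.5), (high, 0.9)]
    num = sum(w * v for w, v in weights)
    den = sum(w for w, _ in weights)
    return num / den if den else 0.0

raw = 720                        # hypothetical ADC reading
flex = normalize(raw, 300, 900)  # -> 0.7
speed = fuzzy_grip_speed(flex)
```

In a real controller each finger joint would feed its own rule set, and the output would be sent as a velocity or position command to the robot arm.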
Affiliation(s)
- Chunxiao Lu
- School of Automotive Engineering, Harbin Institute of Technology (Weihai), Weihai 264209, China
- Lei Jin
- School of Automotive Engineering, Harbin Institute of Technology (Weihai), Weihai 264209, China
- Yufei Liu
- China North Vehicle Research Institute, Beijing 100072, China
- Jianfeng Wang
- School of Automotive Engineering, Harbin Institute of Technology (Weihai), Weihai 264209, China
- Weihua Li
- School of Automotive Engineering, Harbin Institute of Technology (Weihai), Weihai 264209, China
- Yangtze River Delta HIT Robot Technology Research Institute, Wuhu 241060, China

3
Vysocký A, Poštulka T, Chlebek J, Kot T, Maslowski J, Grushko S. Hand Gesture Interface for Robot Path Definition in Collaborative Applications: Implementation and Comparative Study. Sensors (Basel) 2023; 23:4219. [PMID: 37177421 PMCID: PMC10180605 DOI: 10.3390/s23094219]
Abstract
The article explores the possibilities of using hand gestures as a control interface for robotic systems in a collaborative workspace. The development of hand gesture control interfaces has become increasingly important in everyday life as well as in professional contexts such as manufacturing. We present a system designed to facilitate collaboration between humans and robots in manufacturing processes that require frequent revisions of the robot path; it allows direct definition of the waypoints, which differentiates our system from existing ones. We introduce a novel and intuitive approach to human-robot cooperation through the use of simple gestures. The proposed interface was developed and implemented as part of a robotic workspace, utilising three RGB-D sensors to monitor the operator's hand movements within the workspace. The system employs distributed data processing through multiple Jetson Nano units, with each unit processing data from a single camera. The MediaPipe solution is used to localise the hand landmarks in the RGB image, enabling gesture recognition. We compare conventional methods of defining robot trajectories with the developed gesture-based system in an experiment with 20 volunteers. The experiment verified the system under realistic conditions in a real workspace closely resembling the intended industrial application. Data collected during the experiment included both objective and subjective parameters. The results indicate that the gesture-based interface enables users to define a given path measurably faster than conventional methods. We critically analyse the features and limitations of the developed system and suggest directions for future research. Overall, the experimental results indicate the usefulness of the developed system, as it can speed up the definition of the robot's path.
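A landmark-based gesture check of the kind such a system builds on can be sketched as follows. The landmark indexing follows MediaPipe's 21-point hand model (wrist at index 0, fingertips at 4, 8, 12, 16, 20), but the open/closed heuristic is an illustrative assumption, not the paper's actual recognizer.

```python
# Sketch: classify "open palm" vs "fist" from 21 (x, y) hand landmarks,
# MediaPipe-style indexing: 0 = wrist, fingertips at 4, 8, 12, 16, 20,
# middle joints at 3, 6, 10, 14, 18. Heuristic (a finger counts as
# extended when its tip lies farther from the wrist than its middle
# joint) is an assumption for illustration.
import math

TIPS = [4, 8, 12, 16, 20]
MIDS = [3, 6, 10, 14, 18]

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def extended_fingers(landmarks):
    """Count fingers whose tip is farther from the wrist than its mid joint."""
    wrist = landmarks[0]
    return sum(
        dist(landmarks[t], wrist) > dist(landmarks[m], wrist)
        for t, m in zip(TIPS, MIDS)
    )

def classify(landmarks):
    n = extended_fingers(landmarks)
    if n >= 4:
        return "open_palm"
    if n == 0:
        return "fist"
    return "other"
```

In the described setup, each Jetson Nano would run the landmark extraction on its own camera stream and a classifier like this (with many more gesture classes) would turn the landmarks into waypoint-definition commands.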
Affiliation(s)
- Aleš Vysocký
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Tomáš Poštulka
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Jakub Chlebek
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Tomáš Kot
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Jan Maslowski
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Stefan Grushko
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic

4
Noble F, Xu M, Alam F. Static Hand Gesture Recognition Using Capacitive Sensing and Machine Learning. Sensors (Basel) 2023; 23:3419. [PMID: 37050481 PMCID: PMC10099234 DOI: 10.3390/s23073419]
Abstract
Automated hand gesture recognition is a key enabler of Human-to-Machine Interfaces (HMIs) and smart living. This paper reports the development and testing of a static hand gesture recognition system using capacitive sensing. Our system consists of a 6×18 array of capacitive sensors that captured five gestures (Palm, Fist, Middle, OK, and Index) from five participants to create a dataset of gesture images. The dataset was used to train Decision Tree, Naïve Bayes, Multi-Layer Perceptron (MLP) neural network, and Convolutional Neural Network (CNN) classifiers. Each classifier was trained five times in a leave-one-participant-out scheme: in each round, the classifier was trained on four participants' gestures and tested on the remaining participant's gestures. The MLP classifier performed best, achieving an average accuracy of 96.87% and an average F1 score of 92.16%. This demonstrates that the proposed system can accurately recognize hand gestures and that capacitive sensing is a viable method for implementing a non-contact, static hand gesture recognition system.
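The leave-one-participant-out evaluation protocol used above can be sketched with a toy nearest-centroid classifier standing in for the paper's MLP; the synthetic "capacitance image" features below are placeholders, not the study's data.

```python
# Leave-one-participant-out evaluation sketch. A nearest-centroid
# classifier stands in for the paper's MLP; the feature vectors are
# trivially separable synthetic stand-ins for capacitance images.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest(x, centroids):
    """Return the label whose centroid is closest to x (squared distance)."""
    return min(centroids,
               key=lambda g: sum((a - b) ** 2 for a, b in zip(x, centroids[g])))

def loo_accuracy(data):
    """data: {participant: [(feature_vector, gesture_label), ...]}."""
    scores = []
    for held_out in data:
        train = [s for p, ss in data.items() if p != held_out for s in ss]
        by_label = {}
        for x, y in train:
            by_label.setdefault(y, []).append(x)
        cents = {y: centroid(xs) for y, xs in by_label.items()}
        correct = sum(nearest(x, cents) == y for x, y in data[held_out])
        scores.append(correct / len(data[held_out]))
    return sum(scores) / len(scores)

# Two gestures, three participants.
data = {
    "p1": [([1.0, 0.0], "fist"), ([0.0, 1.0], "palm")],
    "p2": [([0.9, 0.1], "fist"), ([0.1, 0.9], "palm")],
    "p3": [([1.1, 0.0], "fist"), ([0.0, 1.1], "palm")],
}
```

Holding each participant out in turn, as here, is what gives the reported accuracy a claim to cross-user generalization rather than memorization of one user's hand.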
5
van Wegen M, Herder JL, Adelsberger R, Pastore-Wapp M, van Wegen EEH, Bohlhalter S, Nef T, Krack P, Vanbellingen T. An Overview of Wearable Haptic Technologies and Their Performance in Virtual Object Exploration. Sensors (Basel) 2023; 23:1563. [PMID: 36772603 PMCID: PMC9919508 DOI: 10.3390/s23031563]
Abstract
We often interact with our environment through manual handling of objects and exploration of their properties. Object properties (OP) such as texture, stiffness, size, shape, temperature, weight, and orientation provide the information needed to perform interactions successfully. The human haptic perception system plays a key role in this. As virtual reality (VR) has become a growing field of interest with many applications, adding haptic feedback to virtual experiences is another step towards more realistic virtual interactions. However, integrating haptics in a realistic manner requires complex technological solutions and actual user-testing in virtual environments (VEs) for verification. This review provides a comprehensive overview of recent wearable haptic devices (HDs), categorized by the OP exploration for which they have been verified in a VE. We found 13 studies that specifically addressed user-testing of wearable HDs in healthy subjects. We map and discuss the different technological solutions for different OP explorations, which are useful for the design of future haptic object interactions in VR, and provide recommendations for future work.
Affiliation(s)
- Myla van Wegen
- Department of Precision and Microsystems Engineering, Delft University of Technology, 2628 CD Delft, The Netherlands
- Just L. Herder
- Department of Precision and Microsystems Engineering, Delft University of Technology, 2628 CD Delft, The Netherlands
- Manuela Pastore-Wapp
- Gerontechnology and Rehabilitation Group, ARTORG Center for Biomedical Engineering Research, University of Bern, 3008 Bern, Switzerland
- Neurocenter, Luzerner Kantonsspital, 6000 Luzern, Switzerland
- Erwin E. H. van Wegen
- Department of Rehabilitation Medicine, Amsterdam Movement Sciences, Amsterdam UMC, VUmc, 1117 HV Amsterdam, The Netherlands
- Tobias Nef
- Gerontechnology and Rehabilitation Group, ARTORG Center for Biomedical Engineering Research, University of Bern, 3008 Bern, Switzerland
- Paul Krack
- Department of Neurology, Center for Parkinson’s Disease and Movement Disorders, Inselspital, Bern University Hospital, University of Bern, 3008 Bern, Switzerland
- Tim Vanbellingen
- Gerontechnology and Rehabilitation Group, ARTORG Center for Biomedical Engineering Research, University of Bern, 3008 Bern, Switzerland
- Neurocenter, Luzerner Kantonsspital, 6000 Luzern, Switzerland

6
Zhou G, Lu ML, Yu D. Investigating gripping force during lifting tasks using a pressure sensing glove system. Appl Ergon 2023; 107:103917. [PMID: 36279645 DOI: 10.1016/j.apergo.2022.103917]
Abstract
Lifting tasks remain one of the leading causes of musculoskeletal disorders (MSDs), primarily in the low back region, and lifting analysis tools are therefore designed to assess the risk of low back pain. Shoulder musculoskeletal problems have also emerged as common MSDs associated with manual handling tasks. It is hypothesized that gripping force is related to lifting conditions and may be used as a supplementary risk metric for MSDs in the shoulder and low back regions, because it measures the additional hand exertion needed to couple with the lifted object during lifting. We assessed the capability of tactile gloves to measure gripping force during lifting as a means of assessing different task conditions (lifting weight, lifting height, lifting direction, body rotation, and handle). Thirty participants wore the tactile gloves and performed simulated lifting tasks. Regression models were used to analyze the effects of the task variables on the measured gripping force. Results demonstrated that 58% and 70% of the lifting weight variance were explained by the measured gripping force without and with considering individual differences, respectively. In addition to the lifting risk measures commonly used by practitioners, this study suggests a potential for using gripping force as a supplementary risk metric for MSDs.
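The "variance explained" figures come from regression; a minimal R² computation for a simple least-squares line fit can be sketched as follows. The gripping-force and lifted-weight numbers are made up for illustration, not taken from the study.

```python
# Sketch: fit y = a*x + b by ordinary least squares and report R^2,
# the fraction of the variance in y explained by the fit. Data below
# are hypothetical, not the study's measurements.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx  # slope, intercept

def r_squared(xs, ys):
    a, b = fit_line(xs, ys)
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

grip = [10, 20, 30, 40]        # hypothetical gripping force (N)
load = [5.1, 9.8, 15.2, 19.9]  # hypothetical lifted weight (kg)
```

A reported "58% of variance explained" corresponds to R² = 0.58 from a model of this form (the study's models additionally include the task variables and individual differences as predictors).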
Affiliation(s)
- Ming-Lun Lu
- National Institute for Occupational Safety and Health, Cincinnati, OH, USA
- Denny Yu
- Purdue University, West Lafayette, IN, USA

7
Shin S, Lee HJ, Chang WH, Ko SH, Shin YI, Kim YH. A Smart Glove Digital System Promotes Restoration of Upper Limb Motor Function and Enhances Cortical Hemodynamic Changes in Subacute Stroke Patients with Mild to Moderate Weakness: A Randomized Controlled Trial. J Clin Med 2022; 11:7343. [PMID: 36555960 PMCID: PMC9782087 DOI: 10.3390/jcm11247343]
Abstract
This study was a randomized controlled trial examining the effects of the RAPAEL® Smart Glove digital training system on upper extremity function and cortical hemodynamic changes in subacute stroke patients. Of 48 patients, 20 in the experimental group and 16 controls completed the study. In addition to conventional occupational therapy (OT), the experimental group received game-based digital hand motor training with the RAPAEL® Smart Glove digital system, while the control group received an extra 30 min of OT. The upper-extremity Fugl-Meyer assessment (UFMA) and Jebsen-Taylor hand function test (JTT) were assessed before (T0), immediately after (T1), and four weeks after the intervention (T2). Cortical hemodynamics (oxyhemoglobin [OxyHb] concentration) were measured by functional near-infrared spectroscopy. The experimental group had significantly better improvements in UFMA (T1-T0 mean [SD]: experimental 13.50 [7.49]; control 8.00 [4.44]; p = 0.014) and JTT (experimental 21.10 [20.84]; control 5.63 [5.06]; p = 0.012). The OxyHb concentration change over the ipsilesional primary sensorimotor cortex during affected wrist movement was greater in the experimental group (T1: experimental 0.7943 × 10⁻⁴ μmol/L; control −0.3269 × 10⁻⁴ μmol/L; p = 0.025). This study demonstrated a beneficial effect of game-based virtual reality training with the RAPAEL® Smart Glove digital system combined with conventional OT on upper extremity motor function in subacute stroke patients.
Affiliation(s)
- Seyoung Shin
- Department of Physical and Rehabilitation Medicine, Center for Prevention and Rehabilitation, Heart Vascular Stroke Institute, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul 06351, Republic of Korea
- Hwang-Jae Lee
- Robot Business Team, Samsung Electronics, Suwon 16677, Republic of Korea
- Won Hyuk Chang
- Department of Physical and Rehabilitation Medicine, Center for Prevention and Rehabilitation, Heart Vascular Stroke Institute, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul 06351, Republic of Korea
- Sung Hwa Ko
- Department of Rehabilitation Medicine, Pusan National University School of Medicine, Pusan National University Yangsan Hospital, Yangsan 50612, Republic of Korea
- Yong-Il Shin
- Department of Rehabilitation Medicine, Pusan National University School of Medicine, Pusan National University Yangsan Hospital, Yangsan 50612, Republic of Korea
- Research Institute of Convergence for Biomedical Science and Technology, Pusan National University Yangsan Hospital, Yangsan 50612, Republic of Korea
- Correspondence: (Y.-I.S.); (Y.-H.K.); Tel.: +82-51-360-2872 (Y.-I.S.); +82-2-3410-2824 (ext. 2818) (Y.-H.K.); Fax: +82-2-3410-0388 (Y.-H.K.)
- Yun-Hee Kim
- Department of Physical and Rehabilitation Medicine, Center for Prevention and Rehabilitation, Heart Vascular Stroke Institute, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul 06351, Republic of Korea
- Department of Health Science and Technology, Department of Medical Devices Management and Research, Department of Digital Healthcare, SAIHST, Sungkyunkwan University, Seoul 06355, Republic of Korea

8
Otero-González I, Caeiro-Rodríguez M, Rodriguez-D’Jesus A. Methods for Gastrointestinal Endoscopy Quantification: A Focus on Hands and Fingers Kinematics. Sensors (Basel) 2022; 22:9253. [PMID: 36501954 PMCID: PMC9741269 DOI: 10.3390/s22239253]
Abstract
Gastrointestinal endoscopy is a complex procedure requiring the mastery of several competencies and skills. The procedure is in increasing demand, but there are important management and ethical issues regarding the training of new endoscopists: training currently requires the direct involvement of real patients, and endoscopists themselves face a high risk of musculoskeletal conditions. Colonoscopy quantification can be useful for improving both issues. This paper reviews the literature on efforts to quantify gastrointestinal procedures and focuses on the capture of hand and finger kinematics. Current technologies for capturing hand and finger movement data are analyzed and tested, considering smart gloves and vision-based solutions. The Manus VR Prime II and StretchSense MoCap gloves reveal the main problems with smart gloves: adapting to different hand sizes and wearer comfort. Among vision-based solutions, Vicon Vero cameras show the main problem in gastrointestinal procedure scenarios: occlusion. In both cases, calibration and data interoperability are also key issues that limit possible applications. In conclusion, new advances are needed to quantify hand and finger kinematics appropriately and support further developments.
Affiliation(s)
- Iván Otero-González
- atlanTTic Research Center for Telecommunication Technologies, Universidade de Vigo, Campus-Universitario S/N, 36312 Vigo, Spain
- Manuel Caeiro-Rodríguez
- atlanTTic Research Center for Telecommunication Technologies, Universidade de Vigo, Campus-Universitario S/N, 36312 Vigo, Spain

9
Bayer IS. MEMS-Based Tactile Sensors: Materials, Processes and Applications in Robotics. Micromachines (Basel) 2022; 13:2051. [PMID: 36557349 PMCID: PMC9782357 DOI: 10.3390/mi13122051]
Abstract
Commonly encountered problems in the manipulation of objects with robotic hands are contact force control and the setting of approaching motion. Microelectromechanical systems (MEMS) sensors on robots offer several solutions to these problems, along with new capabilities. In this review, we analyze tactile, force and/or pressure sensors produced by MEMS technologies, including off-the-shelf products such as MEMS barometric sensors. Alone or in conjunction with other sensors, MEMS platforms are considered very promising for robots to detect contact forces, slippage and the distance to objects for effective dexterous manipulation. We briefly review several sensing mechanisms and principles, such as capacitive, resistive, piezoresistive and triboelectric, combined with new flexible materials technologies, including polymer processing and MEMS-embedded textiles for flexible and snake robots. We show that, without taking up extra space and while remaining lightweight, several MEMS sensors can be integrated into robotic hands to simulate human fingers' gripping, hardness and stiffness sensations. MEMS have high potential to enable new generations of microactuators, microsensors and miniature motion systems (e.g., microrobots) that will be indispensable for health, security, safety and environmental protection.
Affiliation(s)
- Ilker S Bayer
- Smart Materials, Istituto Italiano di Tecnologia, Via Morego 30, 16163 Genova, Italy

10
Ma CC, Mo PC, Hsu HY, Su FC. A novel sensor-embedded holding device for monitoring upper extremity functions. Front Bioeng Biotechnol 2022; 10:976242. [PMID: 36406219 PMCID: PMC9670142 DOI: 10.3389/fbioe.2022.976242]
Abstract
Several causes can lead to functional weakness in the hands or upper extremities (UE), such as stroke, trauma, or aging, so evaluation and monitoring of UE rehabilitation have become essential. However, most traditional evaluation tools (TETs) and assessments require clinicians to assist or are limited to specific clinical settings. Several novel assessments might be applied through wearable devices, yet those devices still need clinicians or caretakers to help with further tests. Thus, a novel UE assessment device that is user-friendly and requires minimal assistance is needed. The cylindrical grasp is one of the common UE movements performed in daily life. Therefore, a cylindrical sensor-embedded holding device (SEHD) for training and monitoring was developed and tested for usability in this research. The SEHD has 14 force sensors arranged to fit holding positions and a six-axis inertial measurement unit (IMU) to monitor grip strength, hand dexterity, acceleration, and angular velocity. Six young adults, six healthy elderly participants, and three stroke survivors participated in this study to see whether the SEHD could be used as a reference to TETs. In the correlation analyses, forearm rotation smoothness showed a moderate negative correlation with the Purdue Pegboard Test (PPT) [r(16) = −0.724, p < 0.01], and finger independence also showed a moderate negative correlation with the PPT [r(10) = −0.615, p < 0.05]. There was also a high positive correlation between the maximum pressing task and the Jamar dynamometer in maximum grip strength [r(16) = 0.821, p < 0.01]. These outcomes suggest that the SEHD, with simple movements, could be applied as a reference for users to monitor their UE ability.
Affiliation(s)
- Charlie Chen Ma
- Department of Biomedical Engineering, National Cheng Kung University, Tainan, Taiwan
- Pu-Chun Mo
- Department of Biomedical Engineering, National Cheng Kung University, Tainan, Taiwan
- Hsiu-Yun Hsu
- Department of Physical Medicine Rehabilitation, National Cheng Kung University Hospital, Tainan, Taiwan
- Fong-Chin Su
- Department of Biomedical Engineering, National Cheng Kung University, Tainan, Taiwan
- Medical Device Innovation Center, National Cheng Kung University, Tainan, Taiwan
- *Correspondence: Fong-Chin Su,

11
Reichert C, Klemm L, Mushunuri RV, Kalyani A, Schreiber S, Kuehn E, Azañón E. Discriminating Free Hand Movements Using Support Vector Machine and Recurrent Neural Network Algorithms. Sensors (Basel) 2022; 22:6101. [PMID: 36015862 PMCID: PMC9412700 DOI: 10.3390/s22166101]
Abstract
Decoding natural hand movements is of interest for human-computer interaction and may constitute a helpful tool in the diagnosis of motor diseases and in rehabilitation monitoring. However, the accurate measurement of complex hand movements and the decoding of dynamic movement data remain challenging. Here, we introduce two algorithms designed to discriminate small differences in defined sequences of hand movements: one based on support vector machine (SVM) classification combined with dynamic time warping, and the other based on a long short-term memory (LSTM) neural network. We recorded hand movement data from 17 younger and 17 older adults using an exoskeletal data glove while they performed six different movement tasks. Accuracy rates in decoding the different movement types were similarly high for SVM and LSTM in across-subject classification, but, for within-subject classification, SVM outperformed LSTM. The SVM-based approach therefore appears particularly promising for the development of movement decoding tools, in particular if the goal is to generalize across age groups, for example for detecting specific motor disorders or tracking their progress over time.
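The dynamic time warping step that the SVM approach relies on aligns movement sequences of different speeds before comparing them; a minimal DTW distance over 1-D sequences can be sketched as follows. Real glove data would be multi-dimensional joint-angle time series; the sequences here are illustrative only.

```python
# Minimal dynamic time warping (DTW) distance between two 1-D
# sequences. DTW finds the cheapest monotone alignment between the
# sequences, so the same gesture performed faster or slower still
# scores a small distance.

def dtw(s, t):
    n, m = len(s), len(t)
    INF = float("inf")
    # D[i][j] = cost of aligning s[:i] with t[:j].
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # s advances alone
                                 D[i][j - 1],      # t advances alone
                                 D[i - 1][j - 1])  # both advance
    return D[n][m]
```

In an SVM pipeline like the one described, DTW distances to a set of template sequences (or a DTW-based kernel) replace raw samples as the classifier's input, which is what makes variable-speed movements comparable.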
Affiliation(s)
- Christoph Reichert
- Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Brenneckestr. 6, 39118 Magdeburg, Germany
- Center for Behavioral Brain Sciences (CBBS), Universitaetsplatz 2, 39106 Magdeburg, Germany
- Forschungscampus STIMULATE, Otto-Hahn-Str. 2, 39106 Magdeburg, Germany
- Lisa Klemm
- Department of Neurology, University Medical Center, Leipziger Str. 44, 39120 Magdeburg, Germany
- Avinash Kalyani
- Institute for Cognitive Neurology and Dementia Research (IKND), Otto-von-Guericke University, Leipziger Str. 44, 39120 Magdeburg, Germany
- German Center for Neurodegenerative Diseases (DZNE), Leipziger Str. 44, 39120 Magdeburg, Germany
- Stefanie Schreiber
- Center for Behavioral Brain Sciences (CBBS), Universitaetsplatz 2, 39106 Magdeburg, Germany
- Department of Neurology, University Medical Center, Leipziger Str. 44, 39120 Magdeburg, Germany
- Esther Kuehn
- Center for Behavioral Brain Sciences (CBBS), Universitaetsplatz 2, 39106 Magdeburg, Germany
- Institute for Cognitive Neurology and Dementia Research (IKND), Otto-von-Guericke University, Leipziger Str. 44, 39120 Magdeburg, Germany
- German Center for Neurodegenerative Diseases (DZNE), Leipziger Str. 44, 39120 Magdeburg, Germany
- Hertie Institute for Clinical Brain Research (HIH), Otfried Mueller-Str. 27, 72076 Tuebingen, Germany
- Elena Azañón
- Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Brenneckestr. 6, 39118 Magdeburg, Germany
- Center for Behavioral Brain Sciences (CBBS), Universitaetsplatz 2, 39106 Magdeburg, Germany
- Department of Neurology, University Medical Center, Leipziger Str. 44, 39120 Magdeburg, Germany

12
Zhang C, Feng S, He R, Fang Y, Zhang S. Gastroenterology in the Metaverse: The dawn of a new era? Front Med (Lausanne) 2022; 9:904566. [PMID: 36035392 PMCID: PMC9403067 DOI: 10.3389/fmed.2022.904566]
Abstract
2021 has been called the first year of the Metaverse, and internet giants around the world are eager to devote themselves to it. In this review, we introduce the concept, current development, and applications of the Metaverse, as well as the use of its current basic technologies, such as virtual reality and telemedicine, in the medical field. We also probe into a new model for gastroenterology in the future era of the Metaverse.
Affiliation(s)
- Chi Zhang
- The First Clinical Medical College, Zhejiang Chinese Medical University, Hangzhou, China
- Shuyan Feng
- The First Clinical Medical College, Zhejiang Chinese Medical University, Hangzhou, China
- Ruonan He
- The First Clinical Medical College, Zhejiang Chinese Medical University, Hangzhou, China
- Yi Fang
- The First Clinical Medical College, Zhejiang Chinese Medical University, Hangzhou, China
- Shuo Zhang
- The Second Affiliated Hospital of Zhejiang Chinese Medical University, Hangzhou, China
- *Correspondence: Shuo Zhang

13
Development of Low-Fidelity Virtual Replicas of Products for Usability Testing. Appl Sci (Basel) 2022; 12:6937. [DOI: 10.3390/app12146937]
Abstract
Designers perform early-stage formative usability tests with low-fidelity prototypes to improve the design of new products. This low-tech prototype style reduces the manufacturing resources required but limits the functions that can be assessed. Recent advances in technology enable designers to create low-fidelity 3D models for users to engage with in a virtual environment. Three-dimensional models communicate design concepts but are not often used in formative usability testing. The proposed method discusses how to create a virtual replica of a product by assessing key human interaction steps, and addresses the limitations of translating those steps into a virtual environment. In addition, the paper provides a framework for evaluating the usability of a product in a virtual setting, with a specific emphasis on low-resource online testing in the user population. A study was performed to pilot subjects’ experience with the proposed approach and determine how the virtual online simulation impacted performance. The study outcomes demonstrated that subjects were able to successfully interact with the virtual replica and found the simulation realistic. Designers can follow this method to perform formative usability tests earlier and incorporate subject feedback into future design iterations, improving safety and product efficacy.
14
Müller LR, Petersen J, Yamlahi A, Wise P, Adler TJ, Seitel A, Kowalewski KF, Müller B, Kenngott H, Nickel F, Maier-Hein L. Robust hand tracking for surgical telestration. Int J Comput Assist Radiol Surg 2022; 17:1477-1486. [PMID: 35624404 PMCID: PMC9307534 DOI: 10.1007/s11548-022-02637-9]
Abstract
PURPOSE As human failure has been shown to be a primary cause of post-operative death, surgical training is of the utmost socioeconomic importance. In this context, the concept of surgical telestration has been introduced to enable experienced surgeons to efficiently and effectively mentor trainees in an intuitive way. While previous approaches to telestration have concentrated on overlaying drawings on surgical videos, we explore the augmented reality (AR) visualization of surgical hands to imitate direct interaction with the situs. METHODS We present a real-time hand tracking pipeline specifically designed for the application of surgical telestration. It comprises three modules, dedicated to (1) the coarse localization of the expert's hand, (2) the segmentation of the hand for AR visualization in the field of view of the trainee, and (3) the regression of keypoints making up the hand's skeleton. This semantic representation enables structured reporting of the motions performed as part of the teaching. RESULTS According to a comprehensive validation based on a large data set comprising more than 14,000 annotated images with varying application-relevant conditions, our algorithm enables real-time hand tracking and is sufficiently accurate for the task of surgical telestration. In a retrospective validation study, a mean detection accuracy of 98%, a mean keypoint regression accuracy of 10.0 px and a mean Dice Similarity Coefficient of 0.95 were achieved. In a prospective validation study, it showed uncompromised performance when the sensor, operator or gesture varied. CONCLUSION Due to its high accuracy and fast inference time, our neural network-based approach to hand tracking is well suited for an AR approach to surgical telestration. Future work should be directed to evaluating the clinical value of the approach.
Collapse
Affiliation(s)
- Lucas-Raphael Müller: Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany; Faculty of Mathematics and Computer Science, Heidelberg University, Heidelberg, Germany
- Jens Petersen: Division of Medical Image Computing (MIC), German Cancer Research Center (DKFZ), Heidelberg, Germany
- Amine Yamlahi: Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany
- Philipp Wise: Department for General, Visceral and Transplantation Surgery, Mannheim University Hospital, Heidelberg, Germany
- Tim J Adler: Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany; Faculty of Mathematics and Computer Science, Heidelberg University, Heidelberg, Germany
- Alexander Seitel: Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany
- Karl-Friedrich Kowalewski: Department of Urology and Urosurgery, Medical Faculty Mannheim, Heidelberg University Hospital, Heidelberg, Germany
- Beat Müller: Department for General, Visceral and Transplantation Surgery, Mannheim University Hospital, Heidelberg, Germany
- Hannes Kenngott: Department for General, Visceral and Transplantation Surgery, Mannheim University Hospital, Heidelberg, Germany
- Felix Nickel: Department for General, Visceral and Transplantation Surgery, Mannheim University Hospital, Heidelberg, Germany
- Lena Maier-Hein: Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany; Faculty of Mathematics and Computer Science, Heidelberg University, Heidelberg, Germany; Medical Faculty, Heidelberg University, Heidelberg, Germany
15
Abstract
Human–robot collaboration (HRC) enables humans and robots to coexist in the same working environment by performing production operations together. HRC systems are used in advanced manufacturing to improve the productivity and efficiency of a manufacturing process. The open question is how an HRC system can ensure that humans work with robots in a safe environment. The present study proposes a solution through the development of a low-cost sensory glove, built using a number of hardware and software tools. The sensory glove analysed and computed the motion and orientation of a worker’s hand so that the robot could be operated through commands and actions under safe operating conditions. The glove was built as a mechatronic device and was controlled by an algorithm designed to process the sensor data and create a three-dimensional render of the glove as it moved. This rendering enabled the robot to recognize the worker’s hand when collaboration began. Tests were conducted to determine the accuracy, dynamic range and practicality of the system. The results showed that the sensory glove is an innovative, low-cost means of providing a safe working environment in which humans and robots can collaborate on operations together.
16
Sensing System for Plegic or Paretic Hands Self-Training Motivation. SENSORS 2022; 22:2414. [PMID: 35336583] [PMCID: PMC8955878] [DOI: 10.3390/s22062414]
Abstract
Patients after stroke with paretic or plegic hands require frequent exercises to promote neuroplasticity and to improve hand joint mobilization. Available devices for hand exercising are intended for persons with some level of hand control or provide continuous passive motion with limited patient involvement. Patients can benefit from self-exercising, where they use the other hand to exercise the plegic or paretic one. However, post-stroke neuropsychological complications, apathy, and cognitive impairments such as forgetfulness make regular self-exercising difficult. This paper describes Przypominajka v2, a system intended to support self-exercising, remind patients about it, and motivate them. We propose a glove-based device with on-device machine-learning-based exercise scoring, a tablet-based interface, and a web-based application for therapists. The feasibility of on-device inference and the accuracy of correct exercise classification were evaluated on four healthy participants. Use of the whole system was described in a case study with a patient with a paretic hand. The anomaly classification has an accuracy of 91.3% and an F1 score of 91.6% but achieves poorer results for new users (78% and 81%). The case study showed that the patient had a positive reaction to exercising with Przypominajka, but there were issues relating to the sensor glove: ease of putting it on and clarity of instructions. The paper presents a new way in which sensor systems can support the rehabilitation of after-stroke patients, with on-device machine-learning-based classification that can accurately score exercises and contribute to patient motivation.
17
Proulx CE, Louis Jean MT, Higgins J, Gagnon DH, Dancause N. Somesthetic, Visual, and Auditory Feedback and Their Interactions Applied to Upper Limb Neurorehabilitation Technology: A Narrative Review to Facilitate Contextualization of Knowledge. FRONTIERS IN REHABILITATION SCIENCES 2022; 3:789479. [PMID: 36188924] [PMCID: PMC9397809] [DOI: 10.3389/fresc.2022.789479]
Abstract
Reduced hand dexterity is a common component of sensorimotor impairments for individuals after stroke. To improve hand function, innovative rehabilitation interventions are constantly developed and tested. In this context, technology-based interventions for hand rehabilitation have been emerging rapidly. This paper offers an overview of basic knowledge on post-lesion plasticity and sensorimotor integration processes in the context of augmented feedback and new rehabilitation technologies, in particular virtual reality and soft robotic gloves. We also discuss some factors to consider related to the incorporation of augmented feedback in the development of technology-based interventions in rehabilitation. This includes factors related to feedback delivery parameter design, task complexity and heterogeneity of sensory deficits in individuals affected by a stroke. In spite of the current limitations in our understanding of the mechanisms involved when using new rehabilitation technologies, the multimodal augmented feedback approach appears promising and may provide meaningful ways to optimize recovery after stroke. Moving forward, we argue that comparative studies allowing stratification of the augmented feedback delivery parameters based upon different biomarkers, lesion characteristics or impairments should be advocated (e.g., injured hemisphere, lesion location, lesion volume, sensorimotor impairments). Ultimately, we envision that treatment design should combine augmented feedback of multiple modalities, carefully adapted to the specific condition of the individuals affected by a stroke, and that it evolves along with recovery. This would better align with the new trend in stroke rehabilitation, which challenges the popular idea of the existence of an ultimate good-for-all intervention.
Collapse
Affiliation(s)
- Camille E. Proulx (corresponding author): School of Rehabilitation, Faculty of Medicine, Université de Montréal, Montreal, QC, Canada; Center for Interdisciplinary Research in Rehabilitation of Greater Montreal – Site Institut universitaire sur la réadaptation en déficience physique de Montréal, CIUSSS Centre-Sud-de-l'Île-de-Montréal, Montreal, QC, Canada
- Johanne Higgins: School of Rehabilitation, Faculty of Medicine, Université de Montréal, Montreal, QC, Canada; Center for Interdisciplinary Research in Rehabilitation of Greater Montreal – Site Institut universitaire sur la réadaptation en déficience physique de Montréal, CIUSSS Centre-Sud-de-l'Île-de-Montréal, Montreal, QC, Canada
- Dany H. Gagnon: School of Rehabilitation, Faculty of Medicine, Université de Montréal, Montreal, QC, Canada; Center for Interdisciplinary Research in Rehabilitation of Greater Montreal – Site Institut universitaire sur la réadaptation en déficience physique de Montréal, CIUSSS Centre-Sud-de-l'Île-de-Montréal, Montreal, QC, Canada
- Numa Dancause: Department of Neurosciences, Faculty of Medicine, Université de Montréal, Montreal, QC, Canada; Centre interdisciplinaire de recherche sur le cerveau et l'apprentissage (CIRCA), Université de Montréal, Montreal, QC, Canada
18
A Fabricated Force Glove That Measures Hand Forces during Activities of Daily Living. SENSORS 2022; 22:1330. [PMID: 35214233] [PMCID: PMC8877267] [DOI: 10.3390/s22041330]
Abstract
Understanding hand and wrist forces during activities of daily living (ADLs) is pertinent when modeling prosthetics/orthotics, preventing workplace-related injuries, and understanding the movement patterns that make athletes, dancers, and musicians elite. The small size of the wrist and fingers and their numerous joints create obstacles to accurately measuring these forces. In this study, 14 FlexiForce sensors were sewn into a glove to capture the forces applied by the fingers. Participants in this study wore the glove and performed grasp and key-turn activities. The maximal forces produced in the study were 9 N at the distal middle finger phalanx for the grasp activity and 24 N at the distal thumb phalanx for the key-turn activity. Results from this study will help determine the minimal forces of the hand during ADLs so that appropriate actuators may be placed at the appropriate joints in exoskeletons, orthotics, and prosthetics.
19
DelPreto J, Hughes J, D'Aria M, de Fazio M, Rus D. A Wearable Smart Glove and Its Application of Pose and Gesture Detection to Sign Language Classification. IEEE Robot Autom Lett 2022. [DOI: 10.1109/lra.2022.3191232]
Affiliation(s)
- Daniela Rus: MIT Distributed Robotics Lab, Cambridge, MA, USA
20
Technologies for Multimodal Interaction in Extended Reality—A Scoping Review. MULTIMODAL TECHNOLOGIES AND INTERACTION 2021. [DOI: 10.3390/mti5120081]
Abstract
When designing extended reality (XR) applications, it is important to consider multimodal interaction techniques, which employ several human senses simultaneously. Multimodal interaction can transform how people communicate remotely, practice for tasks, entertain themselves, process information visualizations, and make decisions based on the provided information. This scoping review summarized recent advances in multimodal interaction technologies for head-mounted display-based (HMD) XR systems. Our purpose was to provide a succinct, yet clear, insightful, and structured overview of emerging, underused multimodal technologies beyond standard video and audio for XR interaction, and to find research gaps. The review aimed to help XR practitioners to apply multimodal interaction techniques and interaction researchers to direct future efforts towards relevant issues on multimodal XR. We conclude with our perspective on promising research avenues for multimodal interaction technologies.
21
Review of Wearable Devices and Data Collection Considerations for Connected Health. SENSORS 2021; 21:5589. [PMID: 34451032] [PMCID: PMC8402237] [DOI: 10.3390/s21165589]
Abstract
Wearable sensor technology has gradually extended its usability into a wide range of well-known applications. Wearable sensors can typically assess and quantify the wearer’s physiology and are commonly employed for human activity detection and quantified self-assessment. Wearable sensors are increasingly utilised to monitor patient health, rapidly assist with disease diagnosis, and help predict and often improve patient outcomes. Clinicians use various self-report questionnaires and well-known tests to report patient symptoms and assess their functional ability. These assessments are time consuming and costly and depend on subjective patient recall; moreover, the measurements may not accurately reflect the patient’s functional ability at home. Wearable sensors can be used to detect and quantify specific movements in different applications. The volume of data collected by wearable sensors during long-term assessment of ambulatory movement can become immense. This paper discusses current techniques used to track and record various human body movements, as well as techniques used to measure activity and sleep from long-term data collected by wearable technology devices.