1. Polsinelli M, Di Matteo A, Lozzi D, Mattei E, Mignosi F, Nazzicone L, Stornelli V, Placidi G. Portable Head-Mounted System for Mobile Forearm Tracking. Sensors (Basel) 2024; 24:2227. [PMID: 38610437; PMCID: PMC11014154; DOI: 10.3390/s24072227]
Abstract
Computer vision (CV)-based systems using cameras and recognition algorithms offer touchless, cost-effective, precise, and versatile hand tracking. These systems allow unrestricted, fluid, and natural movements without the constraints of wearable devices, gaining popularity in human-system interaction, virtual reality, and medical procedures. However, traditional CV-based systems, relying on stationary cameras, are not compatible with mobile applications and demand substantial computing power. To address these limitations, we propose a portable hand-tracking system utilizing the Leap Motion Controller 2 (LMC) mounted on the head and controlled by a single-board computer (SBC) powered by a compact power bank. The proposed system enhances portability, enabling users to interact freely with their surroundings. We present the system's design and conduct experimental tests to evaluate its robustness under variable lighting conditions, power consumption, CPU usage, temperature, and frame rate. This portable hand-tracking solution, which has minimal weight and runs independently of external power, proves suitable for mobile applications in daily life.
Affiliation(s)
- Alessandro Di Matteo
- A2VI-Lab, DISIM, University of L’Aquila, 67100 L’Aquila, Italy
- Daniele Lozzi
- A2VI-Lab, DISIM, University of L’Aquila, 67100 L’Aquila, Italy
- Enrico Mattei
- A2VI-Lab, DISIM, University of L’Aquila, 67100 L’Aquila, Italy
- Filippo Mignosi
- A2VI-Lab, DISIM, University of L’Aquila, 67100 L’Aquila, Italy
- Lorenzo Nazzicone
- A2VI-Lab, DIIIE, University of L’Aquila, 67100 L’Aquila, Italy
- Vincenzo Stornelli
- A2VI-Lab, DIIIE, University of L’Aquila, 67100 L’Aquila, Italy
- Giuseppe Placidi
- A2VI-Lab, c/o Department of MESVA, University of L’Aquila, 67100 L’Aquila, Italy
2. Cerdá-Boluda J, Mora MC, Lloret N, Scarani S, Sastre J. Design of Virtual Hands for Natural Interaction in the Metaverse. Sensors (Basel) 2024; 24:741. [PMID: 38339458; PMCID: PMC10857016; DOI: 10.3390/s24030741]
Abstract
The emergence of the Metaverse is raising important questions in the field of human-machine interaction that must be addressed for a successful implementation of the new paradigm. Therefore, the exploration and integration of both technology and human interaction within this new framework are needed. This paper describes an innovative and technically viable proposal for virtual shopping in the fashion field. Virtual hands scanned directly from the real world have been integrated, after a retopology process, into a virtual environment created for the Metaverse and augmented with digital nails. Human interaction with the Metaverse has been carried out by acquiring the real posture of the user's hands with an infrared-based sensor and mapping it onto its virtualized version, achieving natural identification. The technique has been successfully tested in an immersive shopping experience with the Meta Quest 2 headset as a pilot project, where a transaction mechanism based on blockchain technology (non-fungible tokens, NFTs) allowed for the development of a feasible solution for mass audiences. Consumer reactions were extremely positive, with 250 in-person participants and 120 remote accesses to the Metaverse. The project raises interesting technical questions whose resolution may be useful for future implementations.
Affiliation(s)
- Joaquín Cerdá-Boluda
- Instituto de Instrumentación para Imagen Molecular (I3M), Universitat Politècnica de València, 46020 Valencia, Spain
- Marta C. Mora
- Departament d’Enginyeria Mecànica i Construcció (EMC), Universitat Jaume I, 12071 Castelló de la Plana, Spain
- Nuria Lloret
- Institute of Design and Manufacturing, Universitat Politècnica de València, 46022 Valencia, Spain
- Stefano Scarani
- Department of Sculpture, Universitat Politècnica de València, 46022 Valencia, Spain
- Jorge Sastre
- Institute of Telecommunications and Multimedia Applications, Universitat Politècnica de València, 46022 Valencia, Spain
3. Lei Y, Deng Y, Dong L, Li X, Li X, Su Z. A Novel Sensor Fusion Approach for Precise Hand Tracking in Virtual Reality-Based Human-Computer Interaction. Biomimetics (Basel) 2023; 8:326. [PMID: 37504214; PMCID: PMC10807483; DOI: 10.3390/biomimetics8030326]
Abstract
The rapidly evolving field of Virtual Reality (VR)-based Human-Computer Interaction (HCI) presents a significant demand for robust and accurate hand tracking solutions. Current technologies, predominantly based on single-sensing modalities, fall short in providing comprehensive information capture due to susceptibility to occlusions and environmental factors. In this paper, we introduce a novel sensor fusion approach combined with a Long Short-Term Memory (LSTM)-based algorithm for enhanced hand tracking in VR-based HCI. Our system employs six Leap Motion controllers, two RealSense depth cameras, and two Myo armbands to yield a multi-modal data capture. This rich data set is then processed using LSTM, ensuring the accurate real-time tracking of complex hand movements. The proposed system provides a powerful tool for intuitive and immersive interactions in VR environments.
Affiliation(s)
- Yu Lei
- College of Humanities and Arts, Hunan International Economics University, Changsha 410012, China
- Yi Deng
- College of Physical Education, Hunan International Economics University, Changsha 410012, China
- Lin Dong
- Institute of Sports Artificial Intelligence, Capital University of Physical Education and Sports, Beijing 100091, China
- Xiaohui Li
- Department of Wushu and China, Songshan Shaolin Wushu College, Zhengzhou 452470, China
- Department of History and Pakistan, University of the Punjab, Lahore 54000, Pakistan
- Xiangnan Li
- Yantai Science and Technology Innovation Promotion Center, Yantai 264005, China
- Zhi Su
- Department of Information, School of Design and Art, Changsha University of Science and Technology, Changsha 410076, China
4. Vysocký A, Poštulka T, Chlebek J, Kot T, Maslowski J, Grushko S. Hand Gesture Interface for Robot Path Definition in Collaborative Applications: Implementation and Comparative Study. Sensors (Basel) 2023; 23:4219. [PMID: 37177421; PMCID: PMC10180605; DOI: 10.3390/s23094219]
Abstract
The article explores the possibilities of using hand gestures as a control interface for robotic systems in a collaborative workspace. The development of hand gesture control interfaces has become increasingly important in everyday life as well as in professional contexts such as manufacturing. We present a system designed to facilitate collaboration between humans and robots in manufacturing processes that require frequent revisions of the robot path; it allows direct definition of the waypoints, which differentiates our system from existing ones. We introduce a novel and intuitive approach to human-robot cooperation through the use of simple gestures. The proposed interface was developed and implemented as part of a robotic workspace, utilising three RGB-D sensors to monitor the operator's hand movements within the workspace. The system employs distributed data processing through multiple Jetson Nano units, each processing data from a single camera. The MediaPipe solution is utilised to localise the hand landmarks in the RGB image, enabling gesture recognition. We compare conventional methods of defining robot trajectories with our gesture-based system in an experiment with 20 volunteers. The experiment involved verification of the system under realistic conditions in a real workspace closely resembling the intended industrial application. Data collected during the experiment included both objective and subjective parameters. The results indicate that the gesture-based interface enables users to define a given path objectively faster than conventional methods. We critically analyse the features and limitations of the developed system and suggest directions for future research. Overall, the experimental results indicate the usefulness of the developed system, as it can speed up the definition of the robot's path.
Affiliation(s)
- Aleš Vysocký
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Tomáš Poštulka
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Jakub Chlebek
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Tomáš Kot
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Jan Maslowski
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
- Stefan Grushko
- Department of Robotics, Faculty of Mechanical Engineering, VSB-Technical University of Ostrava, 17. Listopadu 2172/15, 708 00 Ostrava, Czech Republic
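Gesture recognition of the kind described in entry 4 typically reduces to geometric tests on the 21 hand landmarks reported by MediaPipe. As an illustrative sketch only (not the authors' implementation: the landmark indices follow MediaPipe's published convention, while `is_pinch` and its threshold are hypothetical):

```python
import math

# MediaPipe hand-landmark indices (per the published landmark model):
THUMB_TIP, INDEX_TIP = 4, 8

def distance(a, b):
    """Euclidean distance between two (x, y, z) landmark tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinch(landmarks, threshold=0.05):
    """Classify a pinch when thumb tip and index tip nearly touch.

    `landmarks` is a list of 21 (x, y, z) tuples in normalized image
    coordinates, as produced by MediaPipe Hands; `threshold` is a
    hypothetical tuning constant, not a value from the paper.
    """
    return distance(landmarks[THUMB_TIP], landmarks[INDEX_TIP]) < threshold
```

A recognizer like this would run per frame on each Jetson Nano's landmark stream; the actual gesture set and decision logic of the paper are not specified in the abstract.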
5. Placidi G, Di Matteo A, Lozzi D, Polsinelli M, Theodoridou E. Patient-Therapist Cooperative Hand Telerehabilitation through a Novel Framework Involving the Virtual Glove System. Sensors (Basel) 2023; 23:3463. [PMID: 37050523; PMCID: PMC10098681; DOI: 10.3390/s23073463]
Abstract
Telerehabilitation is important for post-stroke or post-surgery rehabilitation because the tasks it uses are reproducible. When combined with assistive technologies such as robots, virtual reality, tracking systems, or a combination of them, it can also allow the recording of a patient's progression and rehabilitation monitoring, along with an objective evaluation. In this paper, we present the structure, from actors and functionalities to software and hardware views, of a novel framework that allows cooperation between patients and therapists. The framework uses a computer-vision-based system named virtual glove for real-time hand tracking (40 fps), resulting in a lightweight and precise system. The novelty of this work lies in the fact that it gives the therapist quantitative, not only qualitative, information about the hand's mobility, for every hand joint separately, while at the same time providing control of the rehabilitation outcome by quantitatively monitoring the progress of hand mobility. Finally, it also offers a strategy for patient-therapist interaction and therapist-therapist data sharing.
Affiliation(s)
- Giuseppe Placidi
- AVI-Lab, Department of Life, Health & Environmental Sciences, University of L’Aquila, 67100 L’Aquila, Italy
- Alessandro Di Matteo
- AVI-Lab, Department of Information Engineering, Computer Science and Mathematics, University of L’Aquila, 67100 L’Aquila, Italy
- Daniele Lozzi
- AVI-Lab, Department of Information Engineering, Computer Science and Mathematics, University of L’Aquila, 67100 L’Aquila, Italy
- Matteo Polsinelli
- Department of Computer Science, University of Salerno, 84084 Fisciano, Italy
- Eleni Theodoridou
- AVI-Lab, Department of Life, Health & Environmental Sciences, University of L’Aquila, 67100 L’Aquila, Italy
6. Varela-Aldás J, Buele J, López I, Palacios-Navarro G. Influence of Hand Tracking in Immersive Virtual Reality for Memory Assessment. Int J Environ Res Public Health 2023; 20:4609. [PMID: 36901618; PMCID: PMC10002257; DOI: 10.3390/ijerph20054609]
Abstract
Few works analyze the parameters inherent to immersive virtual reality (IVR) in applications for memory evaluation. Specifically, hand tracking adds to the immersion of the system, placing the user in the first person with full awareness of the position of their hands. Thus, this work addresses the influence of hand tracking on memory assessment with IVR systems. For this, an application based on activities of daily living was developed, in which the user must remember the location of elements. The data collected by the application are the accuracy of the answers and the response time. The participants were 20 healthy subjects aged 18 to 60 who passed the MoCA test; the application was evaluated with classic controllers and with the hand tracking of the Oculus Quest 2. After the experiment, the participants completed presence (PQ), usability (UMUX), and satisfaction (USEQ) questionnaires. The results indicate no statistically significant difference between the two conditions; the controller experiments had 7.08% higher accuracy and a 0.27 s faster response time. Contrary to expectations, presence was 1.3% lower for hand tracking, while usability (0.18%) and satisfaction (1.43%) had similar results. The findings provide no evidence of better conditions for memory evaluation in this IVR case with hand tracking.
Affiliation(s)
- José Varela-Aldás
- Centro de Investigaciones de Ciencias Humanas y de la Educación—CICHE, Universidad Indoamérica, Ambato 180103, Ecuador
- SISAu Research Group, Facultad de Ingeniería, Industria y Producción FAINPRO, Universidad Indoamérica, Ambato 180103, Ecuador
- Jorge Buele
- SISAu Research Group, Facultad de Ingeniería, Industria y Producción FAINPRO, Universidad Indoamérica, Ambato 180103, Ecuador
- Department of Electronic Engineering and Communications, University of Zaragoza, 44003 Teruel, Spain
- Irene López
- SISAu Research Group, Facultad de Ingeniería, Industria y Producción FAINPRO, Universidad Indoamérica, Ambato 180103, Ecuador
7. Karrenbach M, Preechayasomboon P, Sauer P, Boe D, Rombokas E. Deep learning and session-specific rapid recalibration for dynamic hand gesture recognition from EMG. Front Bioeng Biotechnol 2022; 10:1034672. [PMID: 36588953; PMCID: PMC9797837; DOI: 10.3389/fbioe.2022.1034672]
Abstract
We anticipate wide adoption of wrist and forearm electromyographic (EMG) interface devices worn daily by the same user. This presents unique challenges that are not yet well addressed in the EMG literature, such as adapting for session-specific differences while learning a longer-term model of the specific user. In this manuscript we present two contributions toward this goal. First, we present the MiSDIREKt (Multi-Session Dynamic Interaction Recordings of EMG and Kinematics) dataset, acquired using a novel hardware design. A single participant performed four kinds of hand interaction tasks in virtual reality for 43 distinct sessions over 12 days, totaling 814 min. Second, we analyze this data using a non-linear encoder-decoder for dimensionality reduction in gesture classification. We find that an architecture which recalibrates with a small amount of single-session data performs at an accuracy of 79.5% on that session, as opposed to architectures which learn solely from the single session (49.6%) or learn only from the training data (55.2%).
Affiliation(s)
- Maxim Karrenbach
- Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, United States
- Peter Sauer
- Department of Statistics and Data Science, Carnegie Mellon University, Pittsburgh, PA, United States
- David Boe
- Department of Mechanical Engineering, University of Washington, Seattle, WA, United States
- Eric Rombokas
- Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, United States
- Department of Mechanical Engineering, University of Washington, Seattle, WA, United States
8.
Abstract
Adapting hand movements to changes in our body or the environment is essential for skilled motor behavior, as is the ability to flexibly combine experience gathered in separate contexts. However, it has been shown that when adapting hand movements to two different visuomotor perturbations in succession, interference effects can occur. Here, we investigate whether these interference effects compromise our ability to adapt to the superposition of the two perturbations. Participants used a joystick to track a visual target that followed a smooth but unpredictable trajectory. Four separate groups of participants (total n = 83) completed one block of 50 trials under each of three mappings: one in which the cursor was rotated by 90° (ROTATION), one in which the cursor mimicked the behavior of a mass-spring system (SPRING), and one in which the SPRING and ROTATION mappings were superimposed (SPROT). The order of the blocks differed across groups. Although interference effects were found when switching between SPRING and ROTATION, participants who performed these blocks first performed better in SPROT than participants who had no prior experience with SPRING and ROTATION (i.e., composition). Moreover, participants who started with SPROT exhibited better performance under SPRING and ROTATION than participants who had no prior experience with each of these mappings (i.e., decomposition). Additional analyses confirmed that these effects resulted from components of learning that were specific to the rotational and spring perturbations. These results show that interference effects do not preclude the ability to compose/decompose various forms of visuomotor adaptation. NEW & NOTEWORTHY: The ability to compose/decompose task representations is critical for both cognitive and behavioral flexibility. Here, we show that this ability extends to two forms of visuomotor adaptation in which humans have to perform visually guided hand movements.
Despite the presence of interference effects when switching between visuomotor maps, we show that participants are able to flexibly compose or decompose knowledge acquired in previous sessions. These results further demonstrate the flexibility of sensorimotor adaptation in humans.
Affiliation(s)
- Pierre-Michel Bernier
- Département de Kinanthropologie, Université de Sherbrooke, Sherbrooke, Quebec, Canada
- James Mathew
- Institut Neurosci Timone, Aix Marseille Univ, CNRS, INT, Marseille, France
- Institute of Neuroscience, Institute of Communication & Information Technologies, Electronics & Applied Mathematics, Université Catholique de Louvain, Louvain-la-Neuve, Belgium
- Frederic R Danion
- Institut Neurosci Timone, Aix Marseille Univ, CNRS, INT, Marseille, France
- Center for Research on Cognition and Learning (CERCA) UMR 7295, University of Poitiers, CNRS, Poitiers, France
9. Yeamkuan S, Chamnongthai K. 3D Point-of-Intention Determination Using a Multimodal Fusion of Hand Pointing and Eye Gaze for a 3D Display. Sensors (Basel) 2021; 21:1155. [PMID: 33562169; DOI: 10.3390/s21041155]
Abstract
This paper proposes a three-dimensional (3D) point-of-intention (POI) determination method using multimodal fusion between hand pointing and eye gaze for a 3D virtual display. In the method, the finger joint forms of the pointing hand, sensed by a Leap Motion sensor, are first detected as pointing intention candidates. Subsequently, differences with neighboring frames, which should occur during the hand-pointing period, are checked by AND logic against the hand-pointing intention candidates. A crossing point between the eye gaze and hand pointing lines is finally decided by the closest-distance concept. In order to evaluate the performance of the proposed method, experiments were performed with ten participants, who looked at and pointed at nine test points for approximately five seconds each. The experimental results show the proposed method measures 3D POIs at 75 cm, 85 cm, and 95 cm with average distance errors of 4.67%, 5.38%, and 5.71%, respectively.
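The "closest-distance concept" mentioned in this abstract corresponds to the standard common-perpendicular construction between two 3D lines: since the gaze and pointing rays rarely intersect exactly, the POI can be taken as the midpoint of the shortest segment between them. A minimal sketch under that interpretation (not the authors' code; the function name is illustrative):

```python
# 3D vector helpers on plain tuples.
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, k): return tuple(x * k for x in a)
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def closest_point_between_lines(p1, d1, p2, d2):
    """Midpoint of the shortest segment between lines p1+t*d1 and p2+s*d2.

    p1/d1 could be the eye position and gaze direction, p2/d2 the hand
    position and pointing direction.
    """
    n = cross(d1, d2)
    n2 = dot(n, n)
    if n2 == 0:
        raise ValueError("lines are parallel")
    w = sub(p2, p1)
    t = dot(cross(w, d2), n) / n2   # parameter along line 1
    s = dot(cross(w, d1), n) / n2   # parameter along line 2
    c1 = add(p1, scale(d1, t))      # closest point on line 1
    c2 = add(p2, scale(d2, s))      # closest point on line 2
    return scale(add(c1, c2), 0.5)
```

For two lines that actually intersect, the midpoint coincides with the intersection; for skew lines it sits halfway along the common perpendicular.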
10. Kim S, Park S, Lee O. Development of a Diagnosis and Evaluation System for Hemiplegic Patients Post-Stroke Based on Motion Recognition Tracking and Analysis of Wrist Joint Kinematics. Sensors (Basel) 2020; 20:E4548. [PMID: 32823784; PMCID: PMC7472295; DOI: 10.3390/s20164548]
Abstract
An inexperienced therapist may lack the ability to analyze a patient's movement. In addition, the patient does not receive objective feedback, because the therapist's visual judgment is subjective. The aim is to provide a guide for in-depth rehabilitation therapy in virtual space by continuously tracking the user's wrist joint during Leap Motion Controller (LMC) activities, and to present basic data confirming steady therapy results in real time. The conventional Box and Block Test (BBT) is commonly used in upper extremity rehabilitation therapy. It was modeled in proportion to its actual size, and Autodesk Inventor was used to perform the 3D modeling work. The created 3D object was then implemented in C# through Unity 5.6.2p4 based on the LMC. After obtaining wrist joint motion values, the motion was analyzed in a 3D graph. Healthy subjects (23 males and 25 females, n = 48) were enrolled in this study. There was no statistically significant counting difference between the conventional BBT and the system BBT. This indicates the possibility of effective diagnosis and evaluation of hemiplegic patients post-stroke. We can track wrist joints, check real-time continuous feedback in the implemented virtual space, and provide basic data for an LMC-based quantitative rehabilitation therapy guide.
Affiliation(s)
- Subok Kim
- Department of Computer Science & Engineering, Graduate School, Soonchunhyang University, 22 Soonchunhyang-ro, Asan 31538, Korea
- Seoho Park
- Department of Medical IT Engineering, College of Medical Sciences, Soonchunhyang University, 22 Soonchunhyang-ro, Asan 31538, Korea
- Onseok Lee
- Department of Computer Science & Engineering, Graduate School, Soonchunhyang University, 22 Soonchunhyang-ro, Asan 31538, Korea
- Department of Medical IT Engineering, College of Medical Sciences, Soonchunhyang University, 22 Soonchunhyang-ro, Asan 31538, Korea
11. Vysocký A, Grushko S, Oščádal P, Kot T, Babjak J, Jánoš R, Sukop M, Bobovský Z. Analysis of Precision and Stability of Hand Tracking with Leap Motion Sensor. Sensors (Basel) 2020; 20:E4088. [PMID: 32707927; DOI: 10.3390/s20154088]
Abstract
In this analysis, we present results from measurements performed to determine the stability of a hand tracking system and the accuracy of the detected palm and finger positions. Measurements were performed to evaluate the sensor for an application in an industrial robot-assisted assembly scenario. Human–robot interaction is a relevant topic in collaborative robotics. Intuitive and straightforward control tools for robot navigation and program flow control are essential for effective utilisation in production scenarios without unnecessary slowdowns caused by the operator. For hand tracking and gesture-based control, it is necessary to know the sensor's accuracy. For gesture recognition with a moving target, the sensor must provide stable tracking results. This paper evaluates the sensor's real-world performance by measuring the localisation deviations of the hand being tracked as it moves in the workspace.
12. Albani G, Ferraris C, Nerino R, Chimienti A, Pettiti G, Parisi F, Ferrari G, Cau N, Cimolin V, Azzaro C, Priano L, Mauro A. An Integrated Multi-Sensor Approach for the Remote Monitoring of Parkinson's Disease. Sensors (Basel) 2019; 19:E4764. [PMID: 31684020; PMCID: PMC6864792; DOI: 10.3390/s19214764]
Abstract
The increasing prevalence of neurological diseases due to population aging demands new strategies in disease management. In Parkinson's disease (PD), these strategies should aim at improving diagnosis accuracy and the frequency of clinical follow-up by means of decentralized, cost-effective solutions. In this context, a system suitable for the remote monitoring of PD subjects is presented. It consists of the integration of two approaches investigated in our previous works, each one appropriate for the movement analysis of specific parts of the body: low-cost optical devices for the upper limbs and wearable sensors for the lower ones. The system performs automated assessments of six motor tasks of the Unified Parkinson's Disease Rating Scale, and it is equipped with a gesture-based human-machine interface designed to facilitate user interaction and system management. The usability of the system has been evaluated by means of standard questionnaires, and the accuracy of the automated assessment has been verified experimentally. The results demonstrate that the proposed solution represents a substantial improvement in PD assessment with respect to the former two approaches treated separately, and a new example of an accurate, feasible, and cost-effective means for the decentralized management of PD.
Affiliation(s)
- Giovanni Albani
- Istituto Auxologico Italiano, IRCCS, Department of Neurology and NeuroRehabilitation, S. Giuseppe Hospital, 28824 Piancavallo, Oggebbio (Verbania), Italy
- Claudia Ferraris
- Institute of Electronics, Computer and Telecommunication Engineering, National Research Council, Corso Duca degli Abruzzi 24, 10129 Torino, Italy
- Department of Neurosciences, University of Turin, Via Cherasco 15, 10100 Torino, Italy
- Roberto Nerino
- Institute of Electronics, Computer and Telecommunication Engineering, National Research Council, Corso Duca degli Abruzzi 24, 10129 Torino, Italy
- Antonio Chimienti
- Institute of Electronics, Computer and Telecommunication Engineering, National Research Council, Corso Duca degli Abruzzi 24, 10129 Torino, Italy
- Giuseppe Pettiti
- Institute of Electronics, Computer and Telecommunication Engineering, National Research Council, Corso Duca degli Abruzzi 24, 10129 Torino, Italy
- Federico Parisi
- CNIT Research Unit of Parma and Department of Information Engineering, University of Parma, 43124 Parma, Italy
- Gianluigi Ferrari
- CNIT Research Unit of Parma and Department of Information Engineering, University of Parma, 43124 Parma, Italy
- Nicola Cau
- Istituto Auxologico Italiano, IRCCS, Department of Neurology and NeuroRehabilitation, S. Giuseppe Hospital, 28824 Piancavallo, Oggebbio (Verbania), Italy
- Veronica Cimolin
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, 20133 Milano, Italy
- Corrado Azzaro
- Istituto Auxologico Italiano, IRCCS, Department of Neurology and NeuroRehabilitation, S. Giuseppe Hospital, 28824 Piancavallo, Oggebbio (Verbania), Italy
- Lorenzo Priano
- Istituto Auxologico Italiano, IRCCS, Department of Neurology and NeuroRehabilitation, S. Giuseppe Hospital, 28824 Piancavallo, Oggebbio (Verbania), Italy
- Department of Neurosciences, University of Turin, Via Cherasco 15, 10100 Torino, Italy
- Alessandro Mauro
- Istituto Auxologico Italiano, IRCCS, Department of Neurology and NeuroRehabilitation, S. Giuseppe Hospital, 28824 Piancavallo, Oggebbio (Verbania), Italy
- Department of Neurosciences, University of Turin, Via Cherasco 15, 10100 Torino, Italy
13. Salchow-Hömmen C, Callies L, Laidig D, Valtin M, Schauer T, Seel T. A Tangible Solution for Hand Motion Tracking in Clinical Applications. Sensors (Basel) 2019; 19:E208. [PMID: 30626130; DOI: 10.3390/s19010208]
Abstract
Objective real-time assessment of hand motion is crucial in many clinical applications including technically-assisted physical rehabilitation of the upper extremity. We propose an inertial-sensor-based hand motion tracking system and a set of dual-quaternion-based methods for estimation of finger segment orientations and fingertip positions. The proposed system addresses the specific requirements of clinical applications in two ways: (1) In contrast to glove-based approaches, the proposed solution maintains the sense of touch. (2) In contrast to previous work, the proposed methods avoid the use of complex calibration procedures, which means that they are suitable for patients with severe motor impairment of the hand. To overcome the limited significance of validation in lab environments with homogeneous magnetic fields, we validate the proposed system using functional hand motions in the presence of severe magnetic disturbances as they appear in realistic clinical settings. We show that standard sensor fusion methods that rely on magnetometer readings may perform well in perfect laboratory environments but can lead to more than 15 cm root-mean-square error for the fingertip distances in realistic environments, while our advanced method yields root-mean-square errors below 2 cm for all performed motions.
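Entry 13 estimates fingertip positions by chaining per-segment orientations along each finger. The paper uses dual quaternions, which encode rotation and translation jointly; as a simplified illustration of the same chaining idea, the sketch below uses ordinary unit quaternions for orientation and accumulates segment offsets explicitly (names and the segment-along-x convention are assumptions, not the paper's method):

```python
def qmul(q, r):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)

def qrotate(q, v):
    """Rotate 3D vector v by unit quaternion q: q * (0, v) * q^-1."""
    qc = (q[0], -q[1], -q[2], -q[3])  # conjugate = inverse for unit q
    w, x, y, z = qmul(qmul(q, (0.0,) + tuple(v)), qc)
    return (x, y, z)

def fingertip_position(base, segment_quats, segment_lengths):
    """Chain per-segment orientations from the base to the fingertip.

    Each segment is assumed to extend along its local x axis by the
    given length; orientations accumulate along the kinematic chain.
    """
    pos, q = tuple(base), (1.0, 0.0, 0.0, 0.0)
    for seg_q, length in zip(segment_quats, segment_lengths):
        q = qmul(q, seg_q)                      # accumulate orientation
        offset = qrotate(q, (length, 0.0, 0.0)) # segment vector in world frame
        pos = tuple(p + o for p, o in zip(pos, offset))
    return pos

# Two unit-length segments; the second bends 90° about z,
# so the fingertip lands at roughly (1, 1, 0):
q90 = (2 ** -0.5, 0.0, 0.0, 2 ** -0.5)
tip = fingertip_position((0.0, 0.0, 0.0), [(1.0, 0.0, 0.0, 0.0), q90], [1.0, 1.0])
```

Dual quaternions fold the translation step into the algebra itself, which is what gives the paper's method its compact calibration-free formulation; the explicit offset accumulation here is the pedestrian equivalent.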
14
Ferraris C, Nerino R, Chimienti A, Pettiti G, Cau N, Cimolin V, Azzaro C, Albani G, Priano L, Mauro A. A Self-Managed System for Automated Assessment of UPDRS Upper Limb Tasks in Parkinson's Disease. Sensors (Basel) 2018; 18:E3523. [PMID: 30340420] [PMCID: PMC6210162] [DOI: 10.3390/s18103523] [Citation(s) in RCA: 26] [Impact Index Per Article: 4.3] [Received: 09/07/2018] [Revised: 10/05/2018] [Accepted: 10/15/2018] [Indexed: 02/05/2023]
Abstract
A home-based, reliable, objective and automated assessment of the motor performance of patients affected by Parkinson's Disease (PD) is important in disease management, both to monitor therapy efficacy and to reduce costs and discomfort. In this context, we have developed a self-managed system for automated assessment of the PD upper limb motor tasks specified by the Unified Parkinson's Disease Rating Scale (UPDRS). The system is built around a Human-Computer Interface (HCI) based on an optical RGB-Depth device and replicable software. The accuracy and reliability of the HCI's hand tracking compare favorably with those of consumer hand-tracking devices, as verified against an optoelectronic reference system. The interface allows gestural interaction with visual feedback, making system management suitable for motor-impaired users. The system software characterizes hand movements by kinematic parameters of their trajectories. The correlation between selected parameters and clinical UPDRS scores of patient performance is used to assess new task instances through a machine learning approach based on supervised classifiers, trained in an experimental campaign on cohorts of PD patients. Experimental results show that the system's automated assessments replicate clinical ones, demonstrating its effectiveness for home monitoring of PD.
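The paper's actual classifiers and kinematic features are specific to its UPDRS protocol; purely as a hypothetical sketch of the idea — mapping kinematic feature vectors (e.g., speed, amplitude, rhythm variability per repetition) to clinician-assigned scores with a trained classifier — a minimal nearest-centroid version might look like:

```python
import numpy as np

def fit_centroids(X, y):
    """One prototype (mean feature vector) per clinical score class."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def predict_scores(classes, centroids, X):
    """Assign each new task instance the score of the nearest prototype."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]
```

Any real deployment would use the supervised classifiers and clinically validated features described in the paper; the point here is only the train-on-scored-cohorts, predict-on-new-instances structure.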
Affiliation(s)
- Claudia Ferraris
- Institute of Electronics, Computer and Telecommunication Engineering, National Research Council, 10129 Torino, Italy.
- Department of Neurosciences, University of Turin, 10124 Torino, Italy.
- Roberto Nerino
- Institute of Electronics, Computer and Telecommunication Engineering, National Research Council, 10129 Torino, Italy.
- Antonio Chimienti
- Institute of Electronics, Computer and Telecommunication Engineering, National Research Council, 10129 Torino, Italy.
- Giuseppe Pettiti
- Institute of Electronics, Computer and Telecommunication Engineering, National Research Council, 10129 Torino, Italy.
- Nicola Cau
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, 20133 Milano, Italy.
- Department of Neurology and NeuroRehabilitation, Istituto Auxologico Italiano, IRCCS, S. Giuseppe Hospital, 28824 Piancavallo, Italy.
- Veronica Cimolin
- Department of Electronics, Information and Bioengineering, Politecnico di Milano, 20133 Milano, Italy.
- Corrado Azzaro
- Department of Neurology and NeuroRehabilitation, Istituto Auxologico Italiano, IRCCS, S. Giuseppe Hospital, 28824 Piancavallo, Italy.
- Giovanni Albani
- Department of Neurology and NeuroRehabilitation, Istituto Auxologico Italiano, IRCCS, S. Giuseppe Hospital, 28824 Piancavallo, Italy.
- Lorenzo Priano
- Department of Neurology and NeuroRehabilitation, Istituto Auxologico Italiano, IRCCS, S. Giuseppe Hospital, 28824 Piancavallo, Italy.
- Department of Neurosciences, University of Turin, 10124 Torino, Italy.
- Alessandro Mauro
- Department of Neurology and NeuroRehabilitation, Istituto Auxologico Italiano, IRCCS, S. Giuseppe Hospital, 28824 Piancavallo, Italy.
- Department of Neurosciences, University of Turin, 10124 Torino, Italy.
15
Abstract
Hand rehabilitation is fundamental after stroke or surgery. Traditional rehabilitation requires a therapist and implies high costs, stress for the patient, and subjective evaluation of the therapy's effectiveness. Alternative approaches, based on mechanical and tracking-based gloves, can be highly effective when used in virtual reality (VR) environments. Mechanical devices are often expensive, cumbersome, patient-specific and hand-specific, while tracking-based devices are not affected by these limitations but, especially if based on a single tracking sensor, can suffer from occlusions. In this paper, the implementation of a multi-sensor approach, the Virtual Glove (VG), based on the simultaneous use of two orthogonal LEAP motion controllers, is described. The VG is calibrated, and static positioning measurements are compared with those collected with an accurate spatial positioning system. The positioning error is lower than 6 mm in a cylindrical region of interest of radius 10 cm and height 21 cm. Real-time hand tracking measurements are also performed, analysed and reported. These measurements show that the VG operated in real time (60 fps), reduced occlusions, and managed the two LEAP sensors correctly, without any temporal or spatial discontinuity when switching from one sensor to the other. A video demonstrating the performance of the VG is also presented in the Supplementary Materials. Results are promising, but further work is needed to allow calculation of the forces exerted by each finger when constrained by mechanical tools (e.g., peg-boards) and to reduce occlusions when grasping these tools. Although the VG is proposed for rehabilitation purposes, it could also be used for tele-operation of tools and robots, and for other VR applications.
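The Virtual Glove's calibration and sensor-management details are in the paper itself; as a rough sketch of the two-sensor idea only — per-frame selection of the better-tracking sensor, with the second sensor's data mapped into the first's reference frame by a calibrated rigid transform — one could write (hypothetical data layout, not the LEAP API):

```python
import numpy as np

def fuse_frames(frame_a, frame_b, T_b_to_a):
    """Pick the sensor with higher tracking confidence for this frame.

    frame_a / frame_b: dicts with a scalar "confidence" and an (N, 3) list
    of joint positions (a hypothetical layout). T_b_to_a: 4x4 homogeneous
    transform from sensor B's frame to sensor A's, found at calibration,
    so switching sensors introduces no spatial discontinuity.
    """
    if frame_a["confidence"] >= frame_b["confidence"]:
        return np.asarray(frame_a["joints"], dtype=float)
    pts = np.asarray(frame_b["joints"], dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    return (homog @ T_b_to_a.T)[:, :3]
```

Expressing both sensors' output in one common frame is what lets the described system hand over between the orthogonal controllers without temporal or spatial jumps.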
Affiliation(s)
- Giuseppe Placidi
- A²VI_Lab, Department of Life, Health & Environmental Sciences, University of L'Aquila, Via Vetoio 1, 67100 Coppito, L'Aquila, Italy.
- Luigi Cinque
- Department of Computer Science, Sapienza University, Via Salaria 113, Rome 00198, Italy.
- Matteo Polsinelli
- A²VI_Lab, Department of Life, Health & Environmental Sciences, University of L'Aquila, Via Vetoio 1, 67100 Coppito, L'Aquila, Italy.
- Matteo Spezialetti
- A²VI_Lab, Department of Life, Health & Environmental Sciences, University of L'Aquila, Via Vetoio 1, 67100 Coppito, L'Aquila, Italy.
16
Schaffelhofer S, Agudelo-Toro A, Scherberger H. Decoding a wide range of hand configurations from macaque motor, premotor, and parietal cortices. J Neurosci 2015; 35:1068-81. [PMID: 25609623] [DOI: 10.1523/JNEUROSCI.3594-14.2015] [Citation(s) in RCA: 77] [Impact Index Per Article: 8.6] [Indexed: 11/21/2022]
Abstract
Despite recent advances in decoding cortical activity for motor control, the development of hand prosthetics remains a major challenge. To reduce the complexity of such applications, higher cortical areas that also represent motor plans rather than just the individual movements might be advantageous. We investigated the decoding of many grip types using spiking activity from the anterior intraparietal (AIP), ventral premotor (F5), and primary motor (M1) cortices. Two rhesus monkeys were trained to grasp 50 objects in a delayed task while hand kinematics and spiking activity from six implanted electrode arrays (total of 192 electrodes) were recorded. Offline, we determined 20 grip types from the kinematic data and decoded these hand configurations and the grasped objects with a simple Bayesian classifier. When decoding from AIP, F5, and M1 combined, the mean accuracy was 50% (using planning activity) and 62% (during motor execution) for predicting the 50 objects (chance level, 2%) and substantially larger when predicting the 20 grip types (planning, 74%; execution, 86%; chance level, 5%). When decoding from individual arrays, objects and grip types could be predicted well during movement planning from AIP (medial array) and F5 (lateral array), whereas M1 predictions were poor. In contrast, predictions during movement execution were best from M1, whereas F5 performed only slightly worse. These results demonstrate for the first time that a large number of grip types can be decoded from higher cortical areas during movement preparation and execution, which could be relevant for future neuroprosthetic devices that decode motor plans.
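The abstract describes only a "simple Bayesian classifier" over spike counts; one common choice for such count data — an assumption here, not necessarily the authors' exact model — is a Poisson naive Bayes decoder, which learns a per-class mean firing rate for every electrode and picks the class with the highest log-likelihood:

```python
import numpy as np

def fit_poisson_nb(X, y):
    """Estimate one Poisson rate per (class, electrode) pair.

    X: (trials, electrodes) spike counts; y: grip-type labels.
    A small constant keeps zero-rate electrodes from breaking the log.
    """
    classes = np.unique(y)
    rates = np.array([X[y == c].mean(axis=0) + 1e-3 for c in classes])
    return classes, rates

def predict_grip(classes, rates, x):
    """Maximum-likelihood class for one trial's spike-count vector x.

    Poisson log-likelihood, dropping the class-independent log(x!) term
    and assuming equal class priors.
    """
    loglik = (x * np.log(rates) - rates).sum(axis=1)
    return classes[np.argmax(loglik)]
```

Decoding from planning versus execution activity, as in the paper, amounts to fitting such a model on spike counts from the corresponding task epoch.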