1
Azadi B, Haslgrübler M, Anzengruber-Tanase B, Sopidis G, Ferscha A. Robust Feature Representation Using Multi-Task Learning for Human Activity Recognition. SENSORS (BASEL, SWITZERLAND) 2024; 24:681. [PMID: 38276371] [PMCID: PMC10819053] [DOI: 10.3390/s24020681] [Received: 12/20/2023] [Revised: 01/11/2024] [Accepted: 01/19/2024]
Abstract
Learning underlying patterns from sensory data is crucial in the Human Activity Recognition (HAR) task to avoid poor generalization when coping with unseen data. A key solution to this issue is representation learning, which becomes essential when input signals contain activities with similar patterns or when the patterns that different subjects generate for the same activity vary. To address these issues, we aim to increase generalization by learning the underlying factors of each sensor signal. We develop a novel multi-channel asymmetric auto-encoder that recreates input signals precisely and extracts indicative unsupervised features. Further, we investigate the role of various activation functions in signal reconstruction to ensure the model preserves the patterns of each activity in the output. Our main contribution is a multi-task learning model that enhances representation learning through layers shared between signal reconstruction and the HAR task, improving the model's robustness to users not included in the training phase. The proposed model learns features shared between the different tasks, which are indeed the underlying factors of each input signal. We validate our multi-task learning model on several publicly available HAR datasets, UCI-HAR, MHealth, PAMAP2, and USC-HAD, and on an in-house alpine skiing dataset collected in the wild, where it achieved 99%, 99%, 95%, 88%, and 92% accuracy, respectively. The proposed method shows consistent performance and good generalization on all the datasets compared to the state of the art.
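The shared-layer idea in the abstract above can be sketched as hard parameter sharing: one encoder feeds both a reconstruction head and a classification head, and the training objective blends the two task losses. The NumPy sketch below is illustrative only, not the paper's multi-channel asymmetric auto-encoder; the dimensions, the names (`W_enc`, `forward`, `multitask_loss`), and the loss weight `alpha` are all our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 128-sample window of one accelerometer channel, 6 activity classes.
T, H, C = 128, 32, 6

# Shared encoder weights (the layers reused by both tasks).
W_enc = rng.normal(0, 0.1, (T, H))
# Task-specific heads: signal reconstruction and activity classification.
W_dec = rng.normal(0, 0.1, (H, T))
W_cls = rng.normal(0, 0.1, (H, C))

def forward(x):
    """Hard parameter sharing: one shared representation feeds two task heads."""
    h = np.tanh(x @ W_enc)          # shared representation
    x_hat = h @ W_dec               # reconstruction head
    logits = h @ W_cls              # classification head
    return x_hat, logits

def multitask_loss(x, y, alpha=0.5):
    """Blend reconstruction error and classification error into one objective."""
    x_hat, logits = forward(x)
    mse = np.mean((x - x_hat) ** 2)                 # reconstruction loss
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)               # softmax
    ce = -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))  # cross-entropy
    return alpha * mse + (1 - alpha) * ce

x = rng.normal(size=(4, T))          # batch of 4 windows
y = rng.integers(0, C, size=4)       # activity labels
print(multitask_loss(x, y))
```

Minimizing such a blended loss with any gradient method forces the shared representation `h` to serve both tasks at once, which is the mechanism the abstract credits for better generalization to users unseen during training.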
Affiliation(s)
- Behrooz Azadi
- Pro2Future GmbH, Altenberger Strasse 69, 4040 Linz, Austria
- Michael Haslgrübler
- Pro2Future GmbH, Altenberger Strasse 69, 4040 Linz, Austria
- Georgios Sopidis
- Pro2Future GmbH, Altenberger Strasse 69, 4040 Linz, Austria
- Alois Ferscha
- Institute of Pervasive Computing, Johannes Kepler University, Altenberger Straße 69, 4040 Linz, Austria
2
Moreno-Pérez JA, Ruiz-García I, Navarro-Marchal I, López-Ruiz N, Gómez-López PJ, Palma AJ, Carvajal MA. System Based on an Inertial Measurement Unit for Accurate Flight Time Determination in Vertical Jumps. SENSORS (BASEL, SWITZERLAND) 2023; 23:6022. [PMID: 37447871] [DOI: 10.3390/s23136022] [Received: 06/02/2023] [Revised: 06/25/2023] [Accepted: 06/27/2023]
Abstract
The world of elite sports has always been characterized by intense competition, where victories are often determined by minimal differences. This means that every little detail in the preparation of top-level athletes is crucial to their performance at the highest level. One of the most significant aspects to monitor is jumping capacity, as it enables the measurement of performance and progression and helps prevent injuries. Herein, we present the development of a system capable of measuring the flight time and height reached by the user, reporting the results to a smartphone through an ad hoc Android application that handles all the data processing. The system consists of an affordable, portable circuit based on an accelerometer. It communicates with the smartphone over UART through a Bluetooth module, and its battery provides approximately 9 h of autonomy, making it suitable for outdoor operation. To evaluate the system's precision, we conducted performance tests (counter-movement jumps) with seven subjects. The results confirmed the system's potential for monitoring high-level sports training sessions: the average deviation was only 2.1% (~0.01 s) in flight time and 4.6% (~0.01 m) in jump height.
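The flight-time-to-height conversion such systems rely on follows from projectile motion: for a symmetric jump, h = g·t_f²/8. The paper's own processing runs inside its Android application; the snippet below is only a minimal sketch of that standard relation, with the function name and g = 9.81 m/s² as our assumptions.

```python
# Flight time from takeoff/landing detection gives jump height via
# projectile motion: h = g * t_f**2 / 8 (symmetric flight phase).
G = 9.81  # gravitational acceleration, m/s^2

def jump_height(flight_time_s: float) -> float:
    """Height (m) reached in a vertical jump, from its flight time (s)."""
    return G * flight_time_s ** 2 / 8.0

print(round(jump_height(0.5), 3))  # a 0.5 s flight corresponds to ~0.307 m
```

The quadratic dependence explains why small timing errors matter: a 0.01 s error at t_f = 0.5 s shifts the estimate by roughly 0.012 m, the same order as the deviations reported above.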
Affiliation(s)
- Juan A Moreno-Pérez
- ECSens, Sport and Health University Research Institute (iMUDS), Department of Electronics and Computer Technology, ETSIIT, University of Granada, 18014 Granada, Spain
- Isidoro Ruiz-García
- ECSens, Sport and Health University Research Institute (iMUDS), Department of Electronics and Computer Technology, ETSIIT, University of Granada, 18014 Granada, Spain
- Ismael Navarro-Marchal
- SkiingLab, Sport and Health University Research Institute (iMUDS), Department of Physical and Sport Education, University of Granada, 18007 Granada, Spain
- Human Lab, Sport and Health University Research Institute (iMUDS), Department of Physical Education and Sport, Faculty of Sport Sciences, University of Granada, 18007 Granada, Spain
- Nuria López-Ruiz
- ECSens, Sport and Health University Research Institute (iMUDS), Department of Electronics and Computer Technology, ETSIIT, University of Granada, 18014 Granada, Spain
- Pablo J Gómez-López
- SkiingLab, Sport and Health University Research Institute (iMUDS), Department of Physical and Sport Education, University of Granada, 18007 Granada, Spain
- Alberto J Palma
- ECSens, Sport and Health University Research Institute (iMUDS), Department of Electronics and Computer Technology, ETSIIT, University of Granada, 18014 Granada, Spain
- Miguel A Carvajal
- ECSens, Sport and Health University Research Institute (iMUDS), Department of Electronics and Computer Technology, ETSIIT, University of Granada, 18014 Granada, Spain
3
Hoelzemann A, Romero JL, Bock M, Van Laerhoven K, Lv Q. Hang-Time HAR: A Benchmark Dataset for Basketball Activity Recognition Using Wrist-Worn Inertial Sensors. SENSORS (BASEL, SWITZERLAND) 2023; 23:5879. [PMID: 37447730] [DOI: 10.3390/s23135879] [Received: 05/16/2023] [Revised: 06/12/2023] [Accepted: 06/19/2023]
Abstract
We present a benchmark dataset for evaluating physical human activity recognition methods from wrist-worn sensors, for the specific setting of basketball training, drills, and games. Basketball activities lend themselves well to measurement by wrist-worn inertial sensors, and systems able to detect such sport-relevant activities could be used in game analysis, guided training, and personal physical activity tracking. The dataset was recorded from two teams in separate countries (USA and Germany), with a total of 24 players who wore an inertial sensor on the wrist during both a repetitive basketball training session and a game. Particular features of this dataset include inherent variance through cultural differences in game rules and styles, since the data was recorded in two countries, as well as different sport skill levels, since the participants were heterogeneous in their prior basketball experience. We illustrate the dataset's features in several time-series analyses and report a baseline classification performance study with two state-of-the-art deep learning architectures.
Affiliation(s)
- Julia Lee Romero
- Computer Science, University of Colorado Boulder, Boulder, CO 80302, USA
- Marius Bock
- Ubiquitous Computing, University of Siegen, 57076 Siegen, Germany
- Qin Lv
- Computer Science, University of Colorado Boulder, Boulder, CO 80302, USA
4
Sopidis G, Haslgrübler M, Ferscha A. Counting Activities Using Weakly Labeled Raw Acceleration Data: A Variable-Length Sequence Approach with Deep Learning to Maintain Event Duration Flexibility. SENSORS (BASEL, SWITZERLAND) 2023; 23:5057. [PMID: 37299784] [DOI: 10.3390/s23115057] [Received: 03/15/2023] [Revised: 05/19/2023] [Accepted: 05/22/2023]
Abstract
This paper presents a novel approach for counting hand-performed activities using deep learning and inertial measurement units (IMUs). The particular challenge in this task is finding the correct window size for capturing activities of different durations. Traditionally, fixed window sizes have been used, which occasionally misrepresent activities. To address this limitation, we propose segmenting the time series into variable-length sequences, using ragged tensors to store and process the data. Additionally, our approach uses weakly labeled data to simplify the annotation process and reduce the time needed to prepare annotated data for machine learning algorithms; the model thus receives only partial information about the performed activity. We therefore propose an LSTM-based architecture that accounts for both the ragged tensors and the weak labels. To the best of our knowledge, no prior study has attempted counting from variable-length IMU acceleration data with relatively low computational requirements, using the number of completed repetitions of hand-performed activities as the label. Hence, we present the data segmentation method we employed and the model architecture we implemented to show the effectiveness of our approach. Our results, evaluated on the public Skoda dataset for human activity recognition (HAR), demonstrate a repetition error of ±1 even in the most challenging cases. The findings of this study can benefit various fields, including healthcare, sports and fitness, human-computer interaction, robotics, and the manufacturing industry.
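The variable-length windowing the abstract argues for can be illustrated without ragged tensors or an LSTM: cut the stream wherever a smoothed activity measure drops to rest, and let each segment keep its own length. The sketch below is a simplified stand-in for the paper's pipeline, not its method; the threshold values, the synthetic signal, and the function name are all our assumptions.

```python
import numpy as np

def segment_variable_length(mag, rest_thresh=0.2, min_len=5, win=10):
    """Split a 1-D acceleration-magnitude stream into variable-length
    activity segments, cutting wherever the smoothed signal is near rest."""
    # Moving-average envelope so zero crossings inside a burst don't split it.
    energy = np.convolve(np.abs(mag), np.ones(win) / win, mode="same")
    active = energy > rest_thresh
    segments, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i                        # an activity burst begins
        elif not a and start is not None:
            if i - start >= min_len:
                segments.append(mag[start:i])
            start = None                     # back to rest
    if start is not None and len(mag) - start >= min_len:
        segments.append(mag[start:])
    return segments  # a "ragged" list: every segment keeps its own length

# Two activity bursts separated by rest should yield two segments.
t = np.linspace(0, 4, 400)
sig = np.where((t % 2) < 1.2, np.sin(2 * np.pi * 3 * t), 0.0)
segs = segment_variable_length(sig)
print(len(segs), [len(s) for s in segs])
```

Each element of the returned list plays the role of one row of a ragged tensor; a length-flexible model such as the paper's LSTM could then consume the segments without padding them to a fixed window.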
Affiliation(s)
- Alois Ferscha
- Institute of Pervasive Computing, Johannes Kepler University, Altenberger Straße 69, 4040 Linz, Austria