1
Hoffman B, Cusimano M, Baglione V, Canestrari D, Chevallier D, DeSantis DL, Jeantet L, Ladds MA, Maekawa T, Mata-Silva V, Moreno-González V, Pagano AM, Trapote E, Vainio O, Vehkaoja A, Yoda K, Zacarian K, Friedlaender A. A benchmark for computational analysis of animal behavior, using animal-borne tags. Movement Ecology 2024; 12:78. PMID: 39695785. DOI: 10.1186/s40462-024-00511-8.
Abstract
BACKGROUND Animal-borne sensors ('bio-loggers') can record a suite of kinematic and environmental data, which are used to elucidate animal ecophysiology and improve conservation efforts. Machine learning techniques are used to interpret the large amounts of data recorded by bio-loggers, but there is no common framework for comparing different machine learning techniques in this domain. This makes it difficult to, for example, identify patterns in what works well for machine learning-based analysis of bio-logger data, or to evaluate the effectiveness of novel methods developed by the machine learning community. METHODS To address this, we present the Bio-logger Ethogram Benchmark (BEBE), a collection of datasets with behavioral annotations, together with a modeling task and evaluation metrics. BEBE is to date the largest and most taxonomically diverse publicly available benchmark of this type, comprising 1654 h of data collected from 149 individuals across nine taxa. Using BEBE, we compare the performance of deep and classical machine learning methods for identifying animal behaviors from bio-logger data. As an example use of BEBE, we test an approach based on self-supervised learning, adapting a deep neural network pre-trained on 700,000 h of data from human wrist-worn accelerometers. RESULTS We find that deep neural networks outperform the classical machine learning methods we tested across all nine datasets in BEBE. We additionally find that the self-supervised approach outperforms the alternatives we tested, especially in settings where little training data is available. CONCLUSIONS In light of these results, we make concrete suggestions for designing studies that rely on machine learning to infer behavior from bio-logger data. We expect that BEBE will support similar recommendations in the future, as additional hypotheses about machine learning techniques are tested. Datasets, models, and evaluation code are publicly available at https://github.com/earthspecies/BEBE to enable community use of BEBE.
Affiliation(s)
- Lorène Jeantet
- African Institute for Mathematical Sciences, University of Stellenbosch, Stellenbosch, South Africa
- Ken Yoda
- Nagoya University, Nagoya, Japan
2
Mozumder MAI, Theodore Armand TP, Sumon RI, Imtiyaj Uddin SM, Kim HC. Automated Pipeline for Robust Cat Activity Detection Based on Deep Learning and Wearable Sensor Data. Sensors (Basel) 2024; 24:7436. PMID: 39685969. DOI: 10.3390/s24237436.
Abstract
Ensuring the health, safety, and well-being of household pets such as cats has become increasingly challenging in recent years. Estimating a cat's behavior requires objective observation of both the frequency and variability of specific behavioral traits, which can be difficult to obtain in a cat's ordinary life, and very little research has analyzed cat activity and cat disease on the basis of real-time data. Although previous studies have made progress, several key questions remain: What types of data are best suited for accurately detecting activity patterns? Where should sensors be strategically placed to ensure precise data collection, and how can the system be automated for seamless operation? This study addresses these questions, including which sensors the cat should be equipped with and how the activity detection system can be automated. Magnetic, motion, vision, audio, and location sensors are among those used in machine learning experiments in this field. In this study, we collect data using three types of practical wearable sensors, namely an accelerometer, a gyroscope, and a magnetometer, and combine their acceleration, motion, and magnetic data to recognize routine cat activity. Data collection, data processing, data fusion, and artificial intelligence approaches are all part of the system established in this study. We focus on One-Dimensional Convolutional Neural Networks (1D-CNNs) to model cat activity for detection and classification; 1D-CNNs have recently emerged as a state-of-the-art approach for signal-processing-based systems such as sensor-based pet and human health monitoring and anomaly detection in manufacturing. Our study culminates in an automated system for robust cat activity analysis built on a 1D-CNN. Evaluated on training and validation sets, the approach achieved a satisfactory accuracy of 98.9% in detecting activities relevant to cat well-being.
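To make the core 1D-CNN operation concrete, the sketch below slides a single convolutional filter over one accelerometer channel and applies a ReLU. This is only a minimal illustration of the building block; the toy signal, kernel values, and window size are assumptions, not the paper's architecture.

```python
def conv1d_valid(signal, kernel):
    """CNN-style 1D 'convolution' (cross-correlation) with 'valid' padding."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    """Rectified linear activation applied elementwise."""
    return [max(0.0, x) for x in xs]

# A step edge in a toy acceleration trace; an edge-detector kernel responds at the step.
signal = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
kernel = [-1.0, 0.0, 1.0]   # responds to rising edges
feature_map = relu(conv1d_valid(signal, kernel))
```

Stacking many such filters, with pooling and a final classifier layer, yields the kind of 1D-CNN the abstract describes.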
Affiliation(s)
- Rashadul Islam Sumon
- Institute of Digital Anti-Aging Healthcare, Inje University, Gimhae 50834, Republic of Korea
- Hee-Cheol Kim
- Institute of Digital Anti-Aging Healthcare, Inje University, Gimhae 50834, Republic of Korea
- Department of Computer Engineering, Inje University, Gimhae 50834, Republic of Korea
3
Redmond C, Smit M, Draganova I, Corner-Thomas R, Thomas D, Andrews C. The Use of Triaxial Accelerometers and Machine Learning Algorithms for Behavioural Identification in Domestic Dogs (Canis familiaris): A Validation Study. Sensors (Basel) 2024; 24:5955. PMID: 39338701. PMCID: PMC11435861. DOI: 10.3390/s24185955.
Abstract
Assessing the behaviour and physical attributes of domesticated dogs is critical for predicting the suitability of animals for companionship or for specific roles such as hunting, military or service work. Common methods of behavioural assessment can be time-consuming, labour-intensive, and subject to bias, making large-scale and rapid implementation challenging. Objective, practical and time-effective behaviour measures may be facilitated by remote and automated devices such as accelerometers. This study therefore aimed to validate the ActiGraph® accelerometer as a tool for behavioural classification. Using a machine learning method, nine dog behaviours were identified with an overall accuracy of 74% (range across behaviours: 54-93%). In addition, overall dynamic body acceleration was correlated with the amount of time spent exhibiting active behaviours (barking, locomotion, scratching, sniffing, and standing; R2 = 0.91, p < 0.001). Machine learning was an effective way to build a model that classified behaviours such as barking, defecating, drinking, eating, locomotion, resting-asleep, resting-alert, sniffing, and standing with high overall accuracy whilst maintaining a large behavioural repertoire.
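Overall dynamic body acceleration (ODBA), the activity proxy this abstract correlates with active behaviours, is conventionally the sum over the three axes of the absolute dynamic (gravity-removed) acceleration. A hedged sketch follows; the 3-sample smoothing window is illustrative (published work typically smooths over roughly 1-3 s of samples).

```python
def running_mean(xs, w):
    """Centred moving average with edge clamping; estimates static (gravity) acceleration."""
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - w // 2), min(len(xs), i + w // 2 + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

def odba(ax, ay, az, w=3):
    """ODBA per sample: sum of absolute dynamic accelerations over the three axes."""
    dyn = lambda xs: [abs(x - m) for x, m in zip(xs, running_mean(xs, w))]
    return [sum(t) for t in zip(dyn(ax), dyn(ay), dyn(az))]
```

A stationary animal (constant acceleration on every axis) yields ODBA of zero; movement on any axis raises it.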
Affiliation(s)
- Cushla Redmond, Michelle Smit, Ina Draganova, Rene Corner-Thomas, David Thomas, Christopher Andrews
- School of Agriculture and Environment, Massey University, Palmerston North 4410, New Zealand
4
Ahn SH, Kim S, Jeong DH. Unsupervised Domain Adaptation for Mitigating Sensor Variability and Interspecies Heterogeneity in Animal Activity Recognition. Animals (Basel) 2023; 13:3276. PMID: 37894000. PMCID: PMC10603736. DOI: 10.3390/ani13203276.
Abstract
Animal activity recognition (AAR) using wearable sensor data has gained significant attention due to its applications in monitoring and understanding animal behavior. However, two major challenges hinder the development of robust AAR models: domain variability and the difficulty of obtaining labeled datasets. To address these challenges, this study intensively investigates the impact of unsupervised domain adaptation (UDA) on AAR. We compared three distinct types of UDA techniques: divergence-minimizing, adversarial, and reconstruction-based approaches. UDA lets AAR classifiers learn domain-invariant features, so that classifiers trained on a labeled source domain perform well on an unlabeled target domain. We evaluated the effectiveness of the UDA techniques using dog movement sensor data and additional data from horses. Applying UDA across sensor positions (neck and back), dog sizes (medium and large), and sex (female and male) within the dog data, as well as across species (dogs and horses), yielded significant improvements in classification performance and reduced the domain discrepancy. The results highlight the potential of UDA to mitigate domain shift and enhance AAR across settings and animal species, providing valuable insights for practical applications in real-world scenarios where labeled data are scarce.
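The divergence-minimizing family of UDA methods penalises a distance between source- and target-domain feature statistics. Below is a minimal sketch using the simplest such distance, the squared difference of feature means (a linear-kernel MMD); this is a generic illustration, not the specific losses the study compares, and real methods use richer kernels and train end-to-end.

```python
def feature_means(batch):
    """Per-dimension mean of a batch of feature vectors (list of equal-length lists)."""
    n = len(batch)
    return [sum(row[d] for row in batch) / n for d in range(len(batch[0]))]

def linear_mmd(source, target):
    """Squared Euclidean distance between source and target feature means."""
    ms, mt = feature_means(source), feature_means(target)
    return sum((a - b) ** 2 for a, b in zip(ms, mt))

src = [[0.0, 1.0], [2.0, 1.0]]   # source-domain features, mean (1.0, 1.0)
tgt = [[4.0, 1.0], [4.0, 3.0]]   # target-domain features, mean (4.0, 2.0)
gap = linear_mmd(src, tgt)
```

During adaptation this gap would be added to the classification loss, pushing the feature extractor toward domain-invariant representations.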
Affiliation(s)
- Seong-Ho Ahn
- Department of Artificial Intelligence, The Catholic University of Korea, Bucheon 14662, Republic of Korea
- Seeun Kim
- School of Computer Science and Information Engineering, The Catholic University of Korea, Bucheon 14662, Republic of Korea
- Dong-Hwa Jeong
- Department of Artificial Intelligence, The Catholic University of Korea, Bucheon 14662, Republic of Korea
5
Smit M, Ikurior SJ, Corner-Thomas RA, Andrews CJ, Draganova I, Thomas DG. The Use of Triaxial Accelerometers and Machine Learning Algorithms for Behavioural Identification in Domestic Cats (Felis catus): A Validation Study. Sensors (Basel) 2023; 23:7165. PMID: 37631701. PMCID: PMC10458840. DOI: 10.3390/s23167165.
Abstract
Animal behaviour can be an indicator of health and welfare. Monitoring behaviour through visual observation is labour-intensive, and there is a risk of missing infrequent behaviours. Twelve healthy domestic shorthair cats were fitted with triaxial accelerometers mounted on a collar and a harness. Over seven days, accelerometer data and video footage were collected simultaneously. Identifier variables (n = 32) were calculated from the accelerometer data and summarised into 1 s epochs. Twenty-four behaviours were annotated from the video recordings and aligned with the summarised accelerometer data. Models were created using random forest (RF) and supervised self-organising map (SOM) machine learning techniques for each mounting location. Multiple modelling rounds were run to select and merge behaviours based on performance values. All models were then tested on a validation accelerometer dataset from the same twelve cats, and the frequency of behaviours was calculated and compared using Dirichlet regression. Although the SOM models had higher Kappa (>95%) and overall accuracy (>95%) than the RF models (64-76% and 70-86%, respectively), the RF models predicted behaviours more consistently between mounting locations. These results indicate that triaxial accelerometers can identify cat-specific behaviours.
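The epoch-summarisation step described above (raw triaxial samples reduced to identifier variables per 1 s epoch) can be sketched as follows. The study uses 32 identifier variables; the four per-axis statistics here are an illustrative subset, and the sampling rate is an assumed parameter.

```python
import statistics

def epoch_features(samples, hz):
    """samples: list of (x, y, z) tuples at `hz` samples/s; one feature dict per 1 s epoch."""
    feats = []
    for start in range(0, len(samples) - hz + 1, hz):
        epoch = samples[start:start + hz]
        f = {}
        for axis, name in enumerate("xyz"):
            vals = [s[axis] for s in epoch]
            f[f"{name}_mean"] = statistics.fmean(vals)   # central tendency
            f[f"{name}_sd"] = statistics.pstdev(vals)    # variability within the epoch
            f[f"{name}_min"] = min(vals)
            f[f"{name}_max"] = max(vals)
        feats.append(f)
    return feats
```

The resulting one-row-per-epoch feature table is what classifiers such as RF or SOM would be trained on.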
Affiliation(s)
- Michelle Smit
- School of Agriculture and Environment, Massey University, Palmerston North 4410, New Zealand
6
Marcato M, Tedesco S, O'Mahony C, O'Flynn B, Galvin P. Machine learning based canine posture estimation using inertial data. PLoS One 2023; 18:e0286311. PMID: 37342986. DOI: 10.1371/journal.pone.0286311.
Abstract
The aim of this study was to design a new canine posture estimation system specifically for working dogs. The system comprised commercially available Inertial Measurement Units (IMUs) and a supervised learning algorithm developed to recognise different behaviours. Three IMUs, each containing a 3-axis accelerometer, gyroscope, and magnetometer, were attached to the dogs' chest, back, and neck. To build and test the model, data were collected during a video-recorded behaviour test in which trainee assistance dogs performed static postures (standing, sitting, lying down) and dynamic activities (walking, body shake). Advanced feature extraction techniques were employed for the first time in this field, including statistical, temporal, and spectral methods. The most important features for posture prediction were chosen using Select K Best with the ANOVA F-value, and the individual contributions of each IMU, sensor, and feature type were analysed using Select K Best scores and Random Forest feature importance. Results showed that the back and chest IMUs were more important than the neck IMU, and the accelerometers were more important than the gyroscopes; adding IMUs to the chest and back of dog harnesses is therefore recommended to improve performance. Statistical and temporal feature domains were also more important than spectral ones. Three novel cascade arrangements of Random Forest and Isolation Forest were fitted to the dataset. The best classifier achieved an f1-macro of 0.83 and an f1-weighted of 0.90 for the prediction of the five postures, outperforming previous studies. These results were attributed to the data collection methodology (number of subjects and observations, multiple IMUs, use of common working dog breeds) and the novel machine learning techniques employed (advanced feature extraction, feature selection, and modelling arrangements). The dataset and code are publicly available on Mendeley Data and GitHub, respectively.
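The f1-macro and f1-weighted scores reported above differ only in how per-class F1 values are averaged: macro weights every class equally, weighted averages by class support. A minimal sketch with illustrative counts (not the study's data):

```python
def f1(tp, fp, fn):
    """Per-class F1 from true positives, false positives, false negatives."""
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

def f1_macro_weighted(per_class):
    """per_class: dict class -> (tp, fp, fn); support taken as tp + fn."""
    f1s = {c: f1(*v) for c, v in per_class.items()}
    supports = {c: v[0] + v[2] for c, v in per_class.items()}
    total = sum(supports.values())
    macro = sum(f1s.values()) / len(f1s)                          # classes weighted equally
    weighted = sum(f1s[c] * supports[c] for c in f1s) / total     # weighted by support
    return macro, weighted

counts = {"sitting": (8, 2, 2), "walking": (2, 2, 2)}
macro, weighted = f1_macro_weighted(counts)
```

With imbalanced classes the two averages diverge, which is why papers in this area often report both.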
Affiliation(s)
- Marinara Marcato, Conor O'Mahony, Brendan O'Flynn, Paul Galvin
- Tyndall National Institute, University College Cork, Cork, Ireland
7
Muramatsu S, Kohata Y, Hira E, Momoi Y, Yamamoto M, Takamatsu S, Itoh T. Margined Horn-Shaped Air Chamber for Body-Conduction Microphone. Sensors (Basel) 2023; 23:4565. PMID: 37177769. PMCID: PMC10181571. DOI: 10.3390/s23094565.
Abstract
The sound amplification ratios of sealed air chambers with different shapes were quantitatively compared in order to design a body-conduction microphone for measuring animal scratching sounds. Recently, quantitative monitoring of scratching intensity in dogs has become necessary. We have already developed a collar with a body-conduction microphone to measure body-conducted scratching sounds; however, the air chamber, one component of the body-conduction microphone, had not been appropriately designed. This study compared the amplification ratios of air chambers with different shapes through numerical analysis and experiments. The horn-shaped air chamber achieved the highest amplification performance, at least for sound frequencies below 3 kHz: the simulated amplification ratio of a horn-shaped air chamber with a 1 mm height and a 15 mm diameter was 52.5 dB. Deformation of the bottom of the air chamber affected the amplification ratio, and adjusting the margin of the margined horn shape maintained the amplification ratio at any pressing force. The simulated and experimental amplification ratios of the margined horn-shaped air chamber were 53.4 dB and 19.4 dB, respectively.
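The amplification ratios above are quoted in decibels. For sound-pressure ratios the conversion is 20·log10 of the linear factor; a small sketch (the example factor is illustrative, derived only from the quoted 52.5 dB figure):

```python
import math

def pressure_ratio_to_db(ratio):
    """Linear sound-pressure amplification factor -> dB."""
    return 20.0 * math.log10(ratio)

def db_to_pressure_ratio(db):
    """dB -> linear sound-pressure amplification factor."""
    return 10.0 ** (db / 20.0)

# e.g. a 52.5 dB chamber amplifies pressure roughly 420-fold:
factor = db_to_pressure_ratio(52.5)
```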
Affiliation(s)
- Shun Muramatsu
- Department of Precision Engineering, Graduate School of Engineering, The University of Tokyo, Tokyo 113-8656, Japan
- Yuki Kohata
- Department of Precision Engineering, Faculty of Engineering, The University of Tokyo, Tokyo 113-8656, Japan
- Emi Hira
- Department of Veterinary Medical Sciences, Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo 113-8657, Japan
- Yasuyuki Momoi
- Department of Veterinary Medical Sciences, Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo 113-8657, Japan
- Michitaka Yamamoto
- Department of Precision Engineering, Graduate School of Engineering, The University of Tokyo, Tokyo 113-8656, Japan
- Department of Precision Engineering, Faculty of Engineering, The University of Tokyo, Tokyo 113-8656, Japan
- Seiichi Takamatsu
- Department of Precision Engineering, Graduate School of Engineering, The University of Tokyo, Tokyo 113-8656, Japan
- Department of Precision Engineering, Faculty of Engineering, The University of Tokyo, Tokyo 113-8656, Japan
- Toshihiro Itoh
- Department of Precision Engineering, Graduate School of Engineering, The University of Tokyo, Tokyo 113-8656, Japan
- Department of Precision Engineering, Faculty of Engineering, The University of Tokyo, Tokyo 113-8656, Japan
8
Video Validation of Tri-Axial Accelerometer for Monitoring Zoo-Housed Tamandua tetradactyla Activity Patterns in Response to Changes in Husbandry Conditions. Animals (Basel) 2022; 12:2516. PMID: 36230257. PMCID: PMC9559380. DOI: 10.3390/ani12192516.
Abstract
Accelerometers are increasingly used in the evaluation of animal behaviour. A tri-axial accelerometer attached to a vest was used on Tamandua tetradactyla individuals (n = 10) at Biodiversity Park. First, the influence of wearing the vest on the animals' behaviour was evaluated (ABA-type design: A1 and A2, without a vest; B, with a vest; each stage lasting 24 h), and no changes were detected. Second, behaviour was monitored using video and the accelerometer simultaneously (experimental room, 20 min). The observed behaviours were correlated with the accelerometer data, and summary measures (X, Y and Z axes) were obtained. Additionally, the overall dynamic body acceleration was calculated, and a threshold was determined to discriminate activity from inactivity events (variance = 0.0055). Then, based on a 24 h complementary test (video sampling every 5 min), the sensitivity (85.91%) and precision (100%) of the accelerometer were assessed. Animals were then exposed to a second ABA-type experimental design (A1 and A2: complex enclosure; B: decreased complexity; each stage lasting 24 h). The accelerometer revealed an increase in total activity (26.15 ± 1.50%, 29.29 ± 2.25%, and 35.36 ± 3.15%, respectively), and similar activity levels were detected using video analysis. The results demonstrate that the accelerometer is reliable for determining activity. Considering that zoo-housed lesser anteaters exhibit a cathemeral activity pattern, this study supports easy monitoring of their activity and their responses to different management procedures, supporting welfare programs as well as ex situ conservation.
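The sensitivity and precision figures used to validate the accelerometer against video follow directly from event counts. A minimal sketch; the counts below are illustrative decompositions chosen to land near the reported rates, not the study's data.

```python
def sensitivity(tp, fn):
    """Fraction of true (video-confirmed) activity events the accelerometer caught."""
    return tp / (tp + fn)

def precision(tp, fp):
    """Fraction of accelerometer-flagged events that were real."""
    return tp / (tp + fp)

# e.g. 61 of 71 video-confirmed activity events detected, with no false alarms:
sens = sensitivity(61, 10)
prec = precision(61, 0)
```

High precision with lower sensitivity, as here, means the device rarely false-alarms but misses some genuine activity events.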
9
Somppi S, Törnqvist H, Koskela A, Vehkaoja A, Tiira K, Väätäjä H, Surakka V, Vainio O, Kujala MV. Dog-Owner Relationship, Owner Interpretations and Dog Personality Are Connected with the Emotional Reactivity of Dogs. Animals (Basel) 2022; 12:1338. PMID: 35681804. PMCID: PMC9179432. DOI: 10.3390/ani12111338.
Abstract
We evaluated the effect of the dog-owner relationship on dogs' emotional reactivity, quantified with heart rate variability (HRV), behavioral changes, physical activity and owner interpretations. Twenty-nine adult dogs encountered five different emotional situations (i.e., stroking, a feeding toy, separation from the owner, reunion with the owner, and the sudden appearance of a novel object). The results showed that both negative and positive situations provoked signs of heightened arousal in dogs. During negative situations, owners' ratings of heightened emotional arousal correlated with lower HRV, higher physical activity and more behaviors that typically index arousal and fear. The three factors of the Monash Dog-Owner Relationship Scale (MDORS) were reflected in the dogs' heart rate variability and behaviors: the Emotional Closeness factor was related to increased HRV (p = 0.009), suggesting this aspect is associated with the secure base effect, and the Shared Activities factor showed a trend toward lower HRV (p = 0.067) along with more owner-directed behaviors reflecting attachment-related arousal. In contrast, the Perceived Costs factor was related to higher HRV (p = 0.009) along with less fear and fewer owner-directed behaviors, which may reflect a more independent dog personality. In conclusion, dogs' emotional reactivity and the dog-owner relationship modulate each other, depending on the aspect of the relationship and the dog's individual responsivity.
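HRV summarises beat-to-beat variation in the interval between heartbeats. As a generic illustration (the abstract does not state which HRV index was used), the sketch below computes RMSSD, a standard time-domain index, from RR intervals in milliseconds; the toy RR series is an assumption.

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive differences of RR intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [600.0, 620.0, 590.0, 610.0]   # toy RR series for a dog at ~100 bpm
hrv = rmssd(rr)
```

Higher RMSSD indicates greater parasympathetic (calm) influence; arousal typically lowers it, which is the direction of the HRV effects reported above.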
Affiliation(s)
- Sanni Somppi
- Department of Equine and Small Animal Medicine, Faculty of Veterinary Medicine, University of Helsinki, P.O. Box 57, FI-00014 Helsinki, Finland
- Department of Psychology, Faculty of Education and Psychology, University of Jyväskylä, P.O. Box 35, FI-40014 Jyväskylä, Finland
- Heini Törnqvist
- Department of Equine and Small Animal Medicine, Faculty of Veterinary Medicine, University of Helsinki, P.O. Box 57, FI-00014 Helsinki, Finland
- Department of Psychology, Faculty of Education and Psychology, University of Jyväskylä, P.O. Box 35, FI-40014 Jyväskylä, Finland
- Aija Koskela
- Department of Equine and Small Animal Medicine, Faculty of Veterinary Medicine, University of Helsinki, P.O. Box 57, FI-00014 Helsinki, Finland
- Antti Vehkaoja
- Faculty of Medicine and Health Technology, Tampere University, P.O. Box 692, FI-33101 Tampere, Finland
- Katriina Tiira
- Department of Equine and Small Animal Medicine, Faculty of Veterinary Medicine, University of Helsinki, P.O. Box 57, FI-00014 Helsinki, Finland
- Heli Väätäjä
- Research Group for Emotions, Sociality, and Computing, Faculty of Information Technology and Communication Sciences, Tampere University, P.O. Box 100, FI-33014 Tampere, Finland
- Master School, Lapland University of Applied Sciences, Jokiväylä 11 B, FI-96300 Rovaniemi, Finland
- Veikko Surakka
- Master School, Lapland University of Applied Sciences, Jokiväylä 11 B, FI-96300 Rovaniemi, Finland
- Outi Vainio
- Department of Equine and Small Animal Medicine, Faculty of Veterinary Medicine, University of Helsinki, P.O. Box 57, FI-00014 Helsinki, Finland
- Miiamaaria V. Kujala
- Department of Equine and Small Animal Medicine, Faculty of Veterinary Medicine, University of Helsinki, P.O. Box 57, FI-00014 Helsinki, Finland
- Department of Psychology, Faculty of Education and Psychology, University of Jyväskylä, P.O. Box 35, FI-40014 Jyväskylä, Finland
- Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, P.O. Box 12200, FI-00076 Aalto, Finland
10
Dog Behavior Recognition Based on Multimodal Data from a Camera and Wearable Device. Applied Sciences (Basel) 2022. DOI: 10.3390/app12063199.
Abstract
Although various studies on monitoring dog behavior have been conducted, methods that can minimize or compensate for data noise are required. This paper proposes multimodal dog behavior recognition that fuses video and sensor data from a camera and a wearable device. The video data capture the dogs' movement area and are used to detect the dogs; the sensor data capture the dogs' movement and yield features relevant to behavior recognition. Seven types of behavior were recognized, and the outputs from the two data types were combined through a deep learning-based fusion model. Experiments determined that, among Faster R-CNN, YOLOv3, and YOLOv4, the object detection rate and behavior recognition accuracy were highest with YOLOv4. In addition, the sensor data performed best when all statistical features were selected. Finally, the multimodal fusion models outperformed single-modality models, with a CNN-LSTM-based model performing best. The method presented in this study can be applied to dog treatment or health monitoring, and it is expected to provide a simple way to estimate activity levels.
11
Vehkaoja A, Somppi S, Törnqvist H, Valldeoriola Cardó A, Kumpulainen P, Väätäjä H, Majaranta P, Surakka V, Kujala MV, Vainio O. Description of movement sensor dataset for dog behavior classification. Data Brief 2022; 40:107822. PMID: 35079615. PMCID: PMC8777071. DOI: 10.1016/j.dib.2022.107822.
Abstract
Movement sensor data covering seven static and dynamic dog behaviors (sitting, standing, lying down, trotting, walking, playing, and (treat) searching, i.e., sniffing) were collected from 45 medium- to large-sized dogs using six-degree-of-freedom movement sensors attached to the collar and the harness. The collection procedure was repeated with 17 of the dogs. The duration of each of the seven behaviors was approximately three minutes, and the order of the tasks was varied between dogs and between the two repetitions. The behaviors were annotated post hoc, at one-second resolution, from video recordings made with two camcorders during the tests, and the annotations were accurately synchronized with the raw movement sensor data. The annotated data were originally used to train machine learning algorithms for classifying the seven behaviors. The developed signal processing and classification algorithms are provided together with the raw measurement data and reference annotations. The description and results of the original investigation that the dataset relates to are found in: P. Kumpulainen, A. Valldeoriola Cardó, S. Somppi, H. Törnqvist, H. Väätäjä, P. Majaranta, Y. Gizatdinova, C. Hoog Antink, V. Surakka, M. V. Kujala, O. Vainio, A. Vehkaoja, Dog behavior classification with movement sensors placed on the harness and the collar, Applied Animal Behaviour Science 241 (2021) 105393.
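The alignment step described above, mapping one-second video annotations onto raw sensor samples by timestamp, can be sketched as follows. The field names and the toy sampling rate are illustrative assumptions, not the dataset's actual schema (the sensors record far faster than 2 Hz).

```python
def label_samples(samples, annotations):
    """samples: list of (t_seconds, value); annotations: dict int_second -> behavior.

    Each sample inherits the label of the whole second it falls in;
    seconds without an annotation yield None."""
    return [(t, v, annotations.get(int(t))) for t, v in samples]

samples = [(0.0, 0.1), (0.5, 0.2), (1.0, 0.3), (1.5, 0.4)]
annotations = {0: "sitting", 1: "walking"}
labeled = label_samples(samples, annotations)
```

This per-sample labeling is what turns the raw recordings into supervised training data for the behavior classifiers.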
Affiliation(s)
- Antti Vehkaoja
- Faculty of Medicine and Health Technology, Tampere University, P.O. Box 692, Tampere FI-33101, Finland
- Sanni Somppi
- Department of Equine and Small Animal Medicine, University of Helsinki, P.O. Box 57, Helsinki FI-00014, Finland
- Heini Törnqvist
- Department of Equine and Small Animal Medicine, University of Helsinki, P.O. Box 57, Helsinki FI-00014, Finland
- Department of Psychology, Faculty of Education and Psychology, University of Jyväskylä, P.O. Box 35, Jyväskylä FI-40014, Finland
- Anna Valldeoriola Cardó
- Department of Equine and Small Animal Medicine, University of Helsinki, P.O. Box 57, Helsinki FI-00014, Finland
- Pekka Kumpulainen
- Faculty of Medicine and Health Technology, Tampere University, P.O. Box 692, Tampere FI-33101, Finland
- Heli Väätäjä
- Research Group for Emotions, Sociality, and Computing, Faculty of Information Technology and Communication Sciences, Tampere University, P.O. Box 100, Tampere FI-33014, Finland
- Master School, Lapland University of Applied Sciences, Jokiväylä 11 B, Rovaniemi 96300, Finland
- Päivi Majaranta
- Research Group for Emotions, Sociality, and Computing, Faculty of Information Technology and Communication Sciences, Tampere University, P.O. Box 100, Tampere FI-33014, Finland
- Veikko Surakka
- Research Group for Emotions, Sociality, and Computing, Faculty of Information Technology and Communication Sciences, Tampere University, P.O. Box 100, Tampere FI-33014, Finland
- Miiamaaria V. Kujala
- Department of Equine and Small Animal Medicine, University of Helsinki, P.O. Box 57, Helsinki FI-00014, Finland
- Department of Psychology, Faculty of Education and Psychology, University of Jyväskylä, P.O. Box 35, Jyväskylä FI-40014, Finland
- Outi Vainio
- Department of Equine and Small Animal Medicine, University of Helsinki, P.O. Box 57, Helsinki FI-00014, Finland
12
Deep Learning Empowered Wearable-Based Behavior Recognition for Search and Rescue Dogs. Sensors (Basel) 2022; 22:993. PMID: 35161741. PMCID: PMC8840386. DOI: 10.3390/s22030993.
Abstract
Search and Rescue (SaR) dogs are important assets for first responders, as they can locate victims even when vision and/or sound are limited, thanks to their keen olfactory and auditory senses. In this work, we propose a deep-learning-assisted system comprising a wearable device, a base station, a mobile application, and a cloud-based infrastructure that can, first, monitor the activity, audio signals, and location of a SaR dog in real time and, second, recognize when the dog has found a victim and alert the rescue team. To this end, we employed deep Convolutional Neural Networks (CNNs) for both activity recognition and sound classification, trained on data from inertial sensors (a 3-axial accelerometer and gyroscope) and from the wearable's microphone, respectively. The trained models were deployed on the wearable device, and the overall system was validated in two discrete search and rescue scenarios, successfully spotting the victim (F1-score above 99%) and informing the rescue team in real time in both scenarios.
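Activity-recognition CNNs of the kind described above consume fixed-length windows of multi-channel inertial data rather than the raw stream. A minimal sketch of that sliding-window segmentation step (the 50 Hz sampling rate, 2 s window, and 50% overlap are illustrative assumptions, not values taken from the paper):

```python
import numpy as np

def sliding_windows(signal: np.ndarray, win: int, step: int) -> np.ndarray:
    """Segment a (T, C) multi-channel stream into overlapping
    (N, win, C) windows, dropping any incomplete tail."""
    n = 1 + (len(signal) - win) // step
    return np.stack([signal[i * step : i * step + win] for i in range(n)])

# 10 s of 6-channel IMU data (3-axis accelerometer + 3-axis gyroscope)
# at a hypothetical 50 Hz sampling rate.
stream = np.random.randn(500, 6)
windows = sliding_windows(stream, win=100, step=50)  # 2 s windows, 50% overlap
print(windows.shape)  # (9, 100, 6)
```

Each window is then classified independently, which is what makes on-device, real-time inference feasible.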
13
Highlights of published papers in Applied Animal Behaviour Science in 2021. Appl Anim Behav Sci 2022. [DOI: 10.1016/j.applanim.2021.105533] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022]
14
Mao A, Huang E, Gan H, Parkes RSV, Xu W, Liu K. Cross-Modality Interaction Network for Equine Activity Recognition Using Imbalanced Multi-Modal Data. SENSORS 2021; 21:s21175818. [PMID: 34502709 PMCID: PMC8434387 DOI: 10.3390/s21175818] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/26/2021] [Revised: 08/25/2021] [Accepted: 08/27/2021] [Indexed: 11/16/2022]
Abstract
With recent advances in deep learning, wearable sensors have increasingly been used for automated animal activity recognition. However, two major challenges limit recognition performance: multi-modal feature fusion and modeling of imbalanced data. In this study, to improve classification performance for equine activities while addressing these two challenges, we developed a cross-modality interaction network (CMI-Net) comprising a dual convolutional neural network architecture and a cross-modality interaction module (CMIM). The CMIM adaptively recalibrates the temporal- and axis-wise features in each modality by leveraging multi-modal information, achieving deep inter-modality interaction. A class-balanced (CB) focal loss was adopted to supervise the training of CMI-Net and alleviate the class imbalance problem. Motion data were acquired from six neck-attached inertial measurement units worn by six horses. CMI-Net was trained and verified with leave-one-out cross-validation. The results demonstrated that CMI-Net outperformed existing algorithms, with high precision (79.74%), recall (79.57%), F1-score (79.02%), and accuracy (93.37%). Adopting the CB focal loss improved the performance of CMI-Net, with increases of 2.76%, 4.16%, and 3.92% in precision, recall, and F1-score, respectively. In conclusion, CMI-Net and the CB focal loss effectively enhanced equine activity classification from imbalanced multi-modal sensor data.
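The class-balanced focal loss named in this abstract combines effective-number reweighting (Cui et al., 2019) with the focal modulation term. A minimal NumPy sketch of that combination; the paper's own implementation may differ, and the β and γ values below are the commonly used defaults, not values from the study:

```python
import numpy as np

def cb_focal_loss(probs, labels, counts, beta=0.999, gamma=2.0):
    """Class-balanced focal loss.

    probs  : (N, K) predicted class probabilities
    labels : (N,)   integer class labels
    counts : (K,)   training-sample count per class
    """
    # Effective-number class weights: (1 - beta) / (1 - beta**n_c);
    # rare classes (small n_c) receive larger weights.
    weights = (1.0 - beta) / (1.0 - np.power(beta, counts))
    weights = weights / weights.sum() * len(counts)  # normalize to mean 1
    p_t = probs[np.arange(len(labels)), labels]      # prob of the true class
    # Focal term (1 - p_t)**gamma down-weights easy, confident examples.
    return float(np.mean(weights[labels] * (1.0 - p_t) ** gamma * -np.log(p_t)))
```

The class weights rescale the loss so minority activities contribute as much gradient as majority ones, which is the mechanism behind the reported precision/recall gains on imbalanced data.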
Affiliation(s)
- Axiu Mao
  - Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China
- Endai Huang
  - Department of Computer Science, City University of Hong Kong, Hong Kong, China
- Haiming Gan
  - Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China
  - College of Electronic Engineering, South China Agricultural University, Guangzhou 510642, China
- Rebecca S. V. Parkes
  - Department of Veterinary Clinical Sciences, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China
  - Centre for Companion Animal Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China
- Weitao Xu
  - Department of Computer Science, City University of Hong Kong, Hong Kong, China
- Kai Liu
  - Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China
  - Animal Health Research Centre, Chengdu Research Institute, City University of Hong Kong, Chengdu 610000, China
  - Correspondence: