1
Bodempudi VUC, Li G, Mason JH, Wilson JL, Liu T, Rasheed KM. Identifying mating events of group-housed broiler breeders via bio-inspired deep learning models. Poult Sci 2025;104:105126. PMID: 40300323. DOI: 10.1016/j.psj.2025.105126. Received 27 Jan 2025; revised 28 Mar 2025; accepted 2 Apr 2025.
Abstract
Mating behaviors are crucial for bird welfare, reproduction, and productivity in breeding flocks. During mating, a rooster mounts a hen, which may cause the hen to be overlapped or to disappear from the top view of a vision system. The objective of this research was to develop deep learning models (DLM) to identify mating behavior based on bird count changes and the bio-characteristics of mating. Twenty broiler breeder hens and 2-3 roosters (56 weeks of age) of the Ross 708 breed were monitored in four experimental pens. The DLM framework included a bird detection model, data-filtering algorithms based on mating duration, and logic frameworks for mating identification based on bird count changes. Pretrained models for object detection (You Only Look Once versions 7 and 8; YOLOv7 and YOLOv8), tracking (YOLOv7 or YOLOv8 with Deep Simple Online Real-time Tracking (DeepSORT), StrongSORT, and ByteTrack), and segmentation (Segment Anything Model 2 (SAM2), YOLOv8-segmentation, and Track Anything) were comparatively evaluated for bird detection, and the YOLOv8l object detection model was selected for its balanced performance in processing speed (8 seconds per frame) and accuracy (75% mean average precision (mAP)). With custom training, the best performance in detecting broiler breeders via YOLOv8l exceeded 0.939 in precision, recall, mAP50, mAP95, and F1 score for training, and 0.95 in positive and negative predictive values for testing. After comparing 24 scenarios of mating duration and 32 scenarios of time interval, a mating duration of 3-9 seconds and time intervals of T-3 to T+12 seconds, based on manual observation, were incorporated into the framework to filter out unnecessary data and retain keyframes for further processing, reducing processing time by a factor of 10. The optimized framework effectively detected the birds and identified mating behavior with 0.92 accuracy, outperforming other YOLO-detection-plus-logic frameworks. Mating event identification via the developed DLM framework fluctuated across times of day and bird ages due to bird overlapping, gathering densities, and occlusions. By automating this process, breeders can efficiently monitor and analyze mating behaviors, facilitating timely interventions and adjustments in housing and management practices to optimize broiler breeder fertility, genetics, and overall productivity.
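The count-change logic described above (a mating event appears as a temporary drop in the detected bird count lasting 3-9 s) can be sketched as follows. The function name, the per-second count input, and the use of the maximum observed count as the flock-size baseline are illustrative assumptions, not the paper's implementation:

```python
def find_mating_candidates(counts, fps=1, min_dur=3, max_dur=9):
    """Flag spans where the detected bird count drops below the baseline
    flock size for a plausible mating duration (3-9 s in the study).

    counts: bird count per frame (here one frame per second).
    Returns a list of (start_frame, end_frame) candidate spans."""
    baseline = max(counts)  # assumed flock size when no birds overlap
    events, start = [], None
    for i, c in enumerate(counts):
        if c < baseline and start is None:
            start = i                      # count drop begins: possible mount
        elif c >= baseline and start is not None:
            if min_dur <= (i - start) / fps <= max_dur:
                events.append((start, i))
            start = None
    return events
```

A 5-second drop from 22 to 21 detected birds is flagged, while a 2-second drop (too brief for mating) is filtered out.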
Affiliation(s)
- Venkat U C Bodempudi: Department of Poultry Science, University of Georgia, Athens, GA 30602, USA; Institute for Artificial Intelligence, University of Georgia, Athens, GA 30602, USA
- Guoming Li: Department of Poultry Science, University of Georgia, Athens, GA 30602, USA; Institute for Artificial Intelligence, University of Georgia, Athens, GA 30602, USA
- J Hunter Mason: Department of Poultry Science, University of Georgia, Athens, GA 30602, USA
- Jeanna L Wilson: Department of Poultry Science, University of Georgia, Athens, GA 30602, USA
- Tianming Liu: School of Computing, University of Georgia, Athens, GA 30602, USA
- Khaled M Rasheed: Institute for Artificial Intelligence, University of Georgia, Athens, GA 30602, USA
2
Lv S, Mao Y, Liu Y, Huang Y, Guo D, Cheng L, Tang Z, Peng S, Xiao D. JTF-SqueezeNet: A SqueezeNet network based on joint time-frequency data representation for egg-laying detection in individually caged ducks. Poult Sci 2025;104:104782. PMID: 39808916; PMCID: PMC11782797. DOI: 10.1016/j.psj.2025.104782. Received 4 Nov 2024; revised 31 Dec 2024; accepted 3 Jan 2025.
Abstract
Accurate individual egg-laying detection is crucial for culling low-yielding breeder ducks and improving production efficiency. However, existing methods are often expensive and require strict environmental conditions. This study proposes a data processing method based on wearable sensors and joint time-frequency representation (TFR), aimed at accurately identifying egg-laying in ducks. First, the sensors continuously monitor the ducks' activity and collect the corresponding X-axis acceleration data. Next, a sliding window combined with the Short-Time Fourier Transform (STFT) converts the continuous data into spectrograms over consecutive windows. SqueezeNet is then used to detect spectrograms containing key features of the egg-laying process, marking these as egg-laying-state windows. Finally, Kalman filtering is applied to the sequence of detected egg-laying states, allowing precise determination of the egg-laying period. The best detection performance was achieved by applying 10-fold cross-validation to a dataset of 59,135 spectrograms, using a window size of 50 min and a step size of 3 min. This configuration yielded an accuracy of 95.73% for detecting egg-laying status, with an inference time of only 2.1511 ms per window. The accuracy for identifying the egg-laying period reached 92.19%, with a precision of 93.57% and a recall of 91.95%. Additionally, we explored the scalability of the joint time-frequency representation to reduce the computational complexity of the model.
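The sliding-window STFT step can be illustrated with a minimal numpy-only spectrogram function. The FFT length, overlap, and sampling rate below are placeholders; the paper's actual windowing (50-min windows with 3-min steps over acceleration data) is not reproduced here:

```python
import numpy as np

def stft_spectrogram(x, fs, nperseg=64, noverlap=32):
    """Magnitude spectrogram of a 1-D signal via a Hann-windowed
    short-time Fourier transform over a sliding window (numpy only)."""
    step = nperseg - noverlap
    win = np.hanning(nperseg)
    n_frames = 1 + (len(x) - nperseg) // step
    frames = np.stack([x[i*step:i*step + nperseg] * win for i in range(n_frames)])
    spec = np.abs(np.fft.rfft(frames, axis=1)).T   # (freq_bins, time_frames)
    freqs = np.fft.rfftfreq(nperseg, d=1 / fs)
    return freqs, spec
```

Feeding an 8 Hz sine sampled at 64 Hz through non-overlapping 64-sample windows places the spectral peak at the 8 Hz bin, as expected.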
Affiliation(s)
- Siting Lv, Yuanyang Mao, Youfu Liu, Yigui Huang, Dakang Guo, Lei Cheng, Zhuoheng Tang, Shaohai Peng, Deqin Xiao: College of Mathematics Informatics, South China Agricultural University, Guangzhou 510642, China; Key Laboratory of Smart Agricultural Technology in Tropical South China, Ministry of Agriculture and Rural Affairs, Guangzhou 510642, China; Guangdong Engineering Research Center of Agricultural Big Data, Guangzhou 510642, China
3
Khan I, Peralta D, Fontaine J, Soster de Carvalho P, Martos Martinez-Caja A, Antonissen G, Tuyttens F, De Poorter E. Monitoring Welfare of Individual Broiler Chickens Using Ultra-Wideband and Inertial Measurement Unit Wearables. Sensors (Basel) 2025;25:811. PMID: 39943450; PMCID: PMC11820151. DOI: 10.3390/s25030811. Received 28 Nov 2024; revised 15 Jan 2025; accepted 24 Jan 2025.
Abstract
Monitoring animal welfare on farms and in research settings is attracting increasing interest, both for ethical reasons and for improving productivity through the early detection of stress or disease. In contrast to video-based monitoring, which requires good light conditions and has difficulty tracking specific animals, recent advances in the miniaturization of wearable devices allow the collection of acceleration and location data to track individual animal behavior. However, for broilers, several challenges must be addressed when using wearables, such as coping with (i) the large numbers of chickens on commercial farms, (ii) the impact of their rapid growth, and (iii) the small weight the devices must have to be carried by the chickens without any impact on their health or behavior. To this end, this paper describes a pilot study in which chickens were fitted with devices containing an Inertial Measurement Unit (IMU) and an Ultra-Wideband (UWB) sensor. To establish guidelines for practitioners who want to monitor broiler welfare and activity at different scales, we first compare methods for attaching the wearables to the broiler chickens, taking into account their effectiveness (in terms of retention time) and their impact on broiler welfare. We then establish the technical requirements for carrying out such a study and the challenges that may arise, covering aspects such as noise estimation, the synergy between UWB and IMU, and the measurement of individual activity levels. We show that IMU data can be used to detect activity-level differences between individual animals and environmental conditions. UWB data can be used to monitor the positions and movement patterns of up to 200 animals simultaneously with a positioning error of less than 20 cm. We also show that accuracy depends on installation aspects and that errors are larger at the borders of the monitored area. Attachment with sutures had the longest mean retention time (19.5 days), whereas eyelash glue had the shortest (3 days). To conclude, we identify current challenges and future research lines in the field.
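UWB positioning of the kind reported above (sub-20 cm errors, larger at the borders) is typically obtained by multilateration over measured ranges to fixed anchors. A minimal least-squares sketch; the anchor layout and noise-free ranges are illustrative assumptions:

```python
import numpy as np

def uwb_position(anchors, ranges):
    """Least-squares 2-D position from UWB ranges to fixed anchors,
    linearized against the first anchor."""
    anchors = np.asarray(anchors, float)
    r = np.asarray(ranges, float)
    a0, r0 = anchors[0], r[0]
    # Subtracting the first anchor's circle equation from the others
    # yields a linear system A p = b in the unknown position p.
    A = 2 * (anchors[1:] - a0)
    b = (r0**2 - r[1:]**2) + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p
```

With more than three anchors the same least-squares fit averages out range noise, which is one reason errors grow near the borders, where anchor geometry degrades.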
Affiliation(s)
- Imad Khan: Department of Pathobiology, Pharmacology and Zoological Medicine, Faculty of Veterinary Medicine, Ghent University, 9820 Merelbeke, Belgium
- Daniel Peralta: Department of Computer Science and Artificial Intelligence, University of Granada, 18071 Granada, Spain; DaSCI Andalusian Institute in Data Science and Computational Intelligence, 18071 Granada, Spain
- Jaron Fontaine: IDLab, Department of Information Technology, Ghent University—imec, 9052 Ghent, Belgium
- Patricia Soster de Carvalho: Department of Pathobiology, Pharmacology and Zoological Medicine, Faculty of Veterinary Medicine, Ghent University, 9820 Merelbeke, Belgium; Poulpharm, 8870 Izegem, Belgium
- Ana Martos Martinez-Caja: Department of Pathobiology, Pharmacology and Zoological Medicine, Faculty of Veterinary Medicine, Ghent University, 9820 Merelbeke, Belgium
- Gunther Antonissen: Department of Pathobiology, Pharmacology and Zoological Medicine, Faculty of Veterinary Medicine, Ghent University, 9820 Merelbeke, Belgium
- Frank Tuyttens: Department of Veterinary and Biosciences, Faculty of Veterinary Medicine, Ghent University, 9820 Merelbeke, Belgium; Flanders Research Institute for Agriculture, Fisheries, and Food (ILVO), 9090 Melle, Belgium
- Eli De Poorter: IDLab, Department of Information Technology, Ghent University—imec, 9052 Ghent, Belgium
4
Rossi FB, Rossi N, Orso G, Barberis L, Marin RH, Kembro JM. Monitoring poultry social dynamics using colored tags: Avian visual perception, behavioral effects, and artificial intelligence precision. Poult Sci 2025;104:104464. PMID: 39577175; PMCID: PMC11617678. DOI: 10.1016/j.psj.2024.104464. Received 6 Sep 2024; revised 29 Oct 2024; accepted 29 Oct 2024.
Abstract
The use of artificial intelligence (AI) in animal behavior and welfare research is on the rise. AI can detect behaviors and localize animals in video recordings, making it a valuable tool for studying social dynamics. However, maintaining the identity of individuals over time, especially in homogeneous poultry flocks, remains challenging for algorithms. We propose using differentially colored "backpack" tags (black, gray, white, orange, red, purple, and green) detectable with computer vision (e.g., YOLO) from top-view video recordings of pens. These tags can also accommodate sensors, such as accelerometers. In separate experiments, we aimed to: (i) evaluate avian visual perception of the different colored tags; (ii) assess the potential impact of tag colors on social behavior; and (iii) test the ability of the YOLO model to accurately distinguish between different colored tags on Japanese quail in social group settings. First, the reflectance spectra of the tags and feathers were measured, and an avian visual model was applied to calculate the quantum catches for each spectrum. Green and purple tags showed significant chromatic contrast against the feathers, and most tags produced greater luminance-receptor stimulation than the feathers. Birds wearing white, gray, purple, and green tags pecked significantly more at their own tags than those wearing black (control) tags. Additionally, fewer aggressive interactions were observed in groups with orange tags than in groups with other colors, except red. Next, heterogeneous groups of 5 birds with different color tags were video-recorded for 1 h. The precision and accuracy of YOLO in detecting each color tag were 95.9% and 97.3%, respectively, with most errors stemming from misclassification between black and gray tags. Lastly, using the YOLO output, we estimated each bird's average social distance, locomotion speed, and percentage of time spent moving; no behavioral differences associated with tag color were detected. In conclusion, carefully selected colored backpack tags can be identified using AI models and can also hold other sensors, making them powerful tools for behavioral and welfare studies.
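Deriving locomotion speed and percentage of time moving from tracker output, as in the final step above, can be sketched as follows. The frame rate, the movement threshold, and the dictionary-of-centroids input format are illustrative assumptions:

```python
import numpy as np

def motion_metrics(tracks, fps=30, move_thresh=5.0):
    """Per-bird mean locomotion speed and percent of time moving from
    tracked box centroids. tracks: bird_id -> (n_frames, 2) array of (x, y) px.
    fps and the px/s movement threshold are illustrative, not the study's."""
    out = {}
    for bird, xy in tracks.items():
        step = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # px moved per frame
        out[bird] = {
            "mean_speed_px_s": float(step.mean() * fps),
            "pct_moving": float((step * fps > move_thresh).mean() * 100),
        }
    return out
```

A bird advancing 2 px per frame at 30 fps yields a mean speed of 60 px/s and 100% time moving; a stationary bird yields 0 for both.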
Affiliation(s)
- Florencia B Rossi: Instituto de Investigaciones Biológicas y Tecnológicas (IIByT, CONICET-UNC), Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Avenida Vélez Sarsfield 1611, Córdoba, Córdoba, Argentina; Facultad de Ciencias Exactas, Físicas y Naturales, Universidad Nacional de Córdoba (UNC), Instituto de Ciencia y Tecnología de los Alimentos (ICTA), Córdoba, Córdoba, Argentina
- Nicola Rossi: Universidad Nacional de Córdoba, Facultad de Ciencias Exactas Físicas y Naturales, Laboratorio de Biología del Comportamiento, Córdoba, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Instituto de Diversidad y Ecología Animal (IDEA), Córdoba, Argentina
- Gabriel Orso: Instituto de Investigaciones Biológicas y Tecnológicas (IIByT, CONICET-UNC), Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Avenida Vélez Sarsfield 1611, Córdoba, Córdoba, Argentina; Facultad de Ciencias Exactas, Físicas y Naturales, Universidad Nacional de Córdoba (UNC), Instituto de Ciencia y Tecnología de los Alimentos (ICTA), Córdoba, Córdoba, Argentina
- Lucas Barberis: Facultad de Matemática, Astronomía Física y Computación, Universidad Nacional de Córdoba, Córdoba, Argentina; Instituto de Física Enrique Gaviola (IFEG, CONICET-UNC), Córdoba, Córdoba, Argentina
- Raul H Marin: Instituto de Investigaciones Biológicas y Tecnológicas (IIByT, CONICET-UNC), Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Avenida Vélez Sarsfield 1611, Córdoba, Córdoba, Argentina; Facultad de Ciencias Exactas, Físicas y Naturales, Universidad Nacional de Córdoba (UNC), Instituto de Ciencia y Tecnología de los Alimentos (ICTA), Córdoba, Córdoba, Argentina
- Jackelyn M Kembro: Instituto de Investigaciones Biológicas y Tecnológicas (IIByT, CONICET-UNC), Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Avenida Vélez Sarsfield 1611, Córdoba, Córdoba, Argentina; Facultad de Ciencias Exactas, Físicas y Naturales, Universidad Nacional de Córdoba (UNC), Instituto de Ciencia y Tecnología de los Alimentos (ICTA), Córdoba, Córdoba, Argentina
5
Yang X, Hu Q, Nie L, Wang C. Energy-aware feature and classifier for behaviour recognition of laying hens in an aviary system. Animal 2025;19:101377. PMID: 39675173. DOI: 10.1016/j.animal.2024.101377. Received 5 Mar 2024; revised 7 Nov 2024; accepted 14 Nov 2024.
Abstract
Long-term monitoring of animal behaviours requires energy-aware features and classifiers to support onboard classification. However, few studies have addressed behaviour recognition in laying hens, especially in aviary systems. The objective of this study was to configure key parameters for developing onboard behaviour-monitoring techniques for aviary laying hens, including a proper sliding-window length, an energy-aware feature, and a lightweight classifier. A total of 19 Jingfen No.6 laying hens were reared in an aviary system from day 30 to day 70. Six lightweight accelerometers were attached to the backs of birds for behaviour monitoring at a sampling frequency of 20 Hz. Laying hen behaviours were categorised into four groups: static behaviour (resting and standing), ingestive behaviour (feeding and drinking), walking, and jumping. Two window lengths (0.5 and 1 s) were tested, and the SD of each axial acceleration was used as the sole classification feature. The results indicated that performing a denoising procedure before feature extraction improved classification accuracy by 10-20%. The 1-s window yielded better accuracy than the 0.5-s window, especially for ingestive and walking behaviours. Classification models based on X-axis accelerations outperformed those based on the Y- and Z-axes, with recognition accuracies for static, ingestive, walking, and jumping behaviours of 97.4, 89.6, 95.7, and 98.5%, respectively. The study may provide insights into developing onboard behaviour recognition algorithms for laying hens.
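The single energy-aware feature, the per-window SD of one axial acceleration, is cheap to compute on-device. A minimal sketch, assuming the study's 20 Hz sampling and 1-s windows; the threshold rule and its cut-off values are purely hypothetical, standing in for whatever lightweight classifier is trained:

```python
import numpy as np

def sd_features(acc, fs=20, win_s=1.0):
    """Per-window SD of one axial acceleration signal: the single
    energy-aware feature (20 Hz sampling, 1-s windows, as in the study)."""
    n = int(fs * win_s)
    return np.array([acc[i*n:(i+1)*n].std() for i in range(len(acc) // n)])

def classify_window(sd, cuts=(0.05, 0.3, 1.0)):
    """Map one window's SD to a behaviour class; the cut-offs here are
    hypothetical placeholders, not values from the paper."""
    labels = ("static", "ingestive", "walking", "jumping")
    return labels[int(np.sum(sd > np.asarray(cuts)))]
```

A flat window yields SD 0 (static), while a strongly oscillating window yields a large SD and maps to the most energetic class.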
Affiliation(s)
- X Yang: College of Water Resources and Civil Engineering, China Agricultural University, Beijing 100083, China; Key Laboratory of Agricultural Engineering in Structure and Environment, Ministry of Agriculture and Rural Affairs, Beijing 100083, China
- Q Hu: College of Water Resources and Civil Engineering, China Agricultural University, Beijing 100083, China; Key Laboratory of Agricultural Engineering in Structure and Environment, Ministry of Agriculture and Rural Affairs, Beijing 100083, China
- L Nie: College of Animal Science, Ningxia University, Yinchuan 750021, China
- C Wang: College of Water Resources and Civil Engineering, China Agricultural University, Beijing 100083, China; Key Laboratory of Agricultural Engineering in Structure and Environment, Ministry of Agriculture and Rural Affairs, Beijing 100083, China
6
Merenda VR, Bodempudi VUC, Pairis-Garcia MD, Li G. Development and validation of machine-learning models for monitoring individual behaviors in group-housed broiler chickens. Poult Sci 2024;103:104374. PMID: 39426219; PMCID: PMC11536003. DOI: 10.1016/j.psj.2024.104374. Received 20 Mar 2024; revised 16 Sep 2024; accepted 26 Sep 2024.
Abstract
Animals' individual behavior is commonly monitored through live or video observation by a person, which can be labor-intensive and inconsistent. An alternative is the use of machine learning-based computer vision systems. The objectives of this study were to 1) develop and optimize machine learning model frameworks for detecting, tracking, and classifying individual behaviors of group-housed broiler chickens in continuous video recordings; and 2) use an independent dataset to evaluate the performance of the developed machine learning model framework for differentiating individual poultry behaviors. Forty-two video recordings from 4 different pens (total video duration = 1,620 min) were used to develop and train multiple models for detecting and tracking individual birds and classifying 4 behaviors: feeding, drinking, active, and inactive. The optimal model framework was used to continuously analyze an external set of 11 videos (duration = 326 min), and the second-by-second behavior of each individual broiler was extracted for comparison with human observation. After comparison of model performance, YOLOv5l (out of 5 detection models) was selected for detecting individual broilers in a pen; osnet_x0_25_msmt17 (out of 4 tracking algorithms) was selected to track each detected bird across continuous frames; and the Gradient Boosting Classifier (out of 12 machine learning classifiers) was selected to classify the 4 behaviors. Most of the models were able to keep previously assigned individual identifications of the chickens for limited amounts of time, but lost the identities over an extended examination period (≥4 min). The final framework accurately predicted feeding (accuracy = 0.895) and drinking time (accuracy = 0.9) but performed subpar for active (accuracy = 0.545) and inactive time (accuracy = 0.505). The machine learning models accurately detected feeding and drinking behavior but still need improvement in maintaining individual identities of the chickens and in identifying active and inactive behaviors.
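The last stage of such a framework, a Gradient Boosting Classifier over per-bird features, can be sketched on synthetic data. The two features (distance to feeder and speed) and all distribution parameters are invented for illustration and are not the study's feature set:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

def make_split(n):
    """Synthetic per-bird, per-second features [distance-to-feeder (px), speed (px/s)].
    The feature choice and distributions are invented for illustration."""
    X = np.vstack([
        rng.normal([10, 2], [3, 1], (n, 2)),       # feeding: near feeder, slow
        rng.normal([80, 2], [3, 1], (n, 2)),       # drinking: near drinker, slow
        rng.normal([45, 30], [10, 5], (n, 2)),     # active: fast locomotion
        rng.normal([45, 0.2], [10, 0.1], (n, 2)),  # inactive: barely moving
    ])
    y = np.repeat(["feeding", "drinking", "active", "inactive"], n)
    return X, y

X_train, y_train = make_split(100)
X_test, y_test = make_split(30)
clf = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
```

On this cleanly separated toy data the classifier is near-perfect; the study's much lower accuracy for active/inactive reflects how weakly those classes separate in real features.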
Affiliation(s)
- Victoria R Merenda: Department of Population Health and Pathobiology, North Carolina State University, Raleigh, NC 27606
- Monique D Pairis-Garcia: Department of Population Health and Pathobiology, North Carolina State University, Raleigh, NC 27606
- Guoming Li: Department of Poultry Science, The University of Georgia, Athens, GA 30602
7
Zhao S, Bai Z, Huo L, Han G, Duan E, Gong D, Gao L. Automatic Perception of Typical Abnormal Situations in Cage-Reared Ducks Using Computer Vision. Animals (Basel) 2024;14:2192. PMID: 39123718; PMCID: PMC11311051. DOI: 10.3390/ani14152192. Received 21 Jun 2024; revised 22 Jul 2024; accepted 25 Jul 2024.
Abstract
Overturning and death are common abnormalities in cage-reared ducks. To achieve timely and accurate detection, this study focused on 10-day-old cage-reared ducks, which are prone to these conditions, and established prior data on such situations. Using the original YOLOv8 as the base network, multiple GAM attention mechanisms were embedded into the feature-fusion part (neck) to enhance the network's focus on the abnormal regions in images of cage-reared ducks. Additionally, the Wise-IoU loss function replaced the CIoU loss function, employing a dynamic non-monotonic focusing mechanism to balance the data samples and mitigate excessive penalties from geometric parameters in the model. The image brightness was adjusted by factors of 0.85 and 1.25, and mainstream object-detection algorithms were adopted to test and compare the generalization and performance of the proposed method. Based on six key points around the head, beak, chest, tail, left foot, and right foot of cage-reared ducks, the body structure of the abnormal ducks was refined, and accurate estimation of the overturned and dead postures was achieved using HRNet-48. The results demonstrated that the proposed method accurately recognized these states, achieving a mean Average Precision (mAP) of 0.924, 1.65% higher than that of the original YOLOv8. The method effectively addressed recognition interference caused by lighting differences and exhibited excellent generalization ability and comprehensive detection performance. Furthermore, the proposed abnormal-duck pose-estimation model achieved an Object Keypoint Similarity (OKS) value of 0.921, with a single-frame processing time of 0.528 s, accurately detecting multiple key points on the abnormal ducks' bodies and generating correct posture expressions.
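The OKS metric reported above can be computed as in the COCO keypoint evaluation: a Gaussian falloff of keypoint error, scaled by object area and a per-keypoint constant. The constants used for the six duck key points below are placeholders, not the paper's values:

```python
import numpy as np

def oks(pred, gt, area, k, visible=None):
    """Object Keypoint Similarity (COCO-style): mean Gaussian score of
    keypoint errors, scaled by object area and per-keypoint constants k."""
    pred, gt, k = (np.asarray(a, float) for a in (pred, gt, k))
    if visible is None:
        visible = np.ones(len(gt), dtype=bool)
    d2 = ((pred - gt) ** 2).sum(axis=1)      # squared keypoint error (px^2)
    score = np.exp(-d2 / (2 * area * k ** 2))
    return float(score[visible].mean())
```

A perfect prediction scores 1.0, and the score decays toward 0 as keypoints drift relative to the object's scale.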
Affiliation(s)
- Shida Zhao: Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China; Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
- Zongchun Bai: Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China; Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
- Lianfei Huo: Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China; Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
- Guofeng Han: Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China; Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
- Enze Duan: Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China; Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
- Dongjun Gong: College of Engineering, Huazhong Agricultural University, Wuhan 430070, China
- Liaoyuan Gao: State Key Laboratory of Intelligent Agricultural Power Equipment, Luoyang 471039, China
8
Pearce J, Chang YM, Xia D, Abeyesinghe S. Classification of Behaviour in Conventional and Slow-Growing Strains of Broiler Chickens Using Tri-Axial Accelerometers. Animals (Basel) 2024;14:1957. PMID: 38998070; PMCID: PMC11240663. DOI: 10.3390/ani14131957. Received 25 May 2024; revised 21 Jun 2024; accepted 25 Jun 2024.
Abstract
Behavioural states such as walking, sitting, and standing are important indicators of welfare, including lameness, in broiler chickens. However, manual behavioural observations of individuals are often limited by time constraints and small sample sizes. Tri-axial accelerometers have the potential to collect information on animal behaviour automatically. We applied a random forest algorithm to process accelerometer data from broiler chickens. Data from three broiler strains at a range of ages (25 to 49 days old) were used to train and test the algorithm, and, unlike other studies, the algorithm was further tested on an unseen broiler strain. When tested on unseen birds from the three training strains, the random forest model classified behaviours with very good accuracy (92%) and specificity (94%) and good sensitivity (88%) and precision (88%). With the new, unseen strain, the model classified behaviours with very good accuracy (94%), sensitivity (91%), specificity (96%), and precision (91%). We therefore successfully used a random forest model to automatically detect three broiler behaviours across four different strains and a range of ages using accelerometers. These findings demonstrate that accelerometers can be used to automatically record behaviours to supplement biomechanical and behavioural research and to support the reduction principle of the 3Rs.
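A random forest over simple per-window accelerometer statistics, in the spirit of the pipeline above, can be sketched on simulated data. The behaviour set, the choice of window statistics, and all signal parameters are illustrative, not the paper's:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def window_features(win):
    """Mean and SD per axis over one window of tri-axial samples (n, 3)."""
    return np.concatenate([win.mean(axis=0), win.std(axis=0)])

def simulate(behaviour, n_windows=80, n_samples=20):
    # Illustrative per-behaviour accelerometer statistics (g), not from the paper:
    loc = {"sitting": (0, 0, 1), "standing": (0, 0, 1), "walking": (0.3, 0.1, 1)}[behaviour]
    scale = {"sitting": 0.02, "standing": 0.1, "walking": 0.5}[behaviour]
    return np.stack([window_features(rng.normal(loc, scale, (n_samples, 3)))
                     for _ in range(n_windows)])

X = np.vstack([simulate(b) for b in ("sitting", "standing", "walking")])
y = np.repeat(["sitting", "standing", "walking"], 80)
# Interleaved train/test split keeps all three classes in both halves.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[::2], y[::2])
```

Here sitting and standing share the same mean posture and are separated only by movement variability (the SD features), which mirrors why such features matter in practice.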
Affiliation(s)
- Justine Pearce: The Royal Veterinary College, Hawkshead Lane, Brookmans Park, Hatfield AL9 7TA, UK (affiliation shared with co-authors Y.-M.C., D.X., and S.A.)
9
He P, Wu R, Liu D, Dou J, Hayat K, Shang D, Pan J, Lin H. An efficient segmentation model for abnormal chicken droppings recognition based on improved deep dual-resolution network. J Anim Sci 2024;102:skae098. PMID: 38587413; PMCID: PMC11285374. DOI: 10.1093/jas/skae098. Received 5 Jan 2024; accepted 7 Apr 2024.
Abstract
The characteristics of chicken droppings are closely linked to their health status. In prior studies, chicken droppings recognition is treated as an object detection task, leading to challenges in labeling and missed detection due to the diverse shapes, overlapping boundaries, and dense distribution of chicken droppings. Additionally, the use of intelligent monitoring equipment equipped with edge devices in farms can significantly reduce manual labor. However, the limited computational power of edge devices presents challenges in deploying real-time segmentation algorithms for field applications. Therefore, this study redefines the task as a segmentation task, with the main objective being the development of a lightweight segmentation model for the automated monitoring of abnormal chicken droppings. A total of 60 Arbor Acres broilers were housed in 5 specific pathogen-free cages for over 3 wk, and 1650 RGB images of chicken droppings were randomly divided into training and testing sets in an 8:2 ratio to develop and test the model. Firstly, by incorporating the attention mechanism, multi-loss function, and auxiliary segmentation head, the segmentation accuracy of the DDRNet was enhanced. Then, by employing the group convolution and an advanced knowledge-distillation algorithm, a lightweight segmentation model named DDRNet-s-KD was obtained, which achieved a mean Dice coefficient (mDice) of 79.43% and an inference speed of 86.10 frames per second (FPS), showing a 2.91% and 61.2% increase in mDice and FPS compared to the benchmark model. Furthermore, the DDRNet-s-KD model was quantized from 32-bit floating-point values to 8-bit integers and then converted to TensorRT format. Impressively, the weight size of the quantized model was only 13.7 MB, representing an 82.96% reduction compared to the benchmark model. This makes it well-suited for deployment on the edge device, achieving an inference speed of 137.51 FPS on Jetson Xavier NX. 
In conclusion, the methods proposed in this study show significant potential in monitoring abnormal chicken droppings and can provide an effective reference for the implementation of other agricultural embedded systems.
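The abstract names group convolution and an "advanced knowledge-distillation algorithm" without implementation details. A minimal sketch of a classic temperature-scaled logit-distillation term (Hinton-style, not necessarily the exact variant used in the paper; all names are illustrative) looks like this:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of class logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients keep a comparable magnitude.
    In segmentation this would be applied per pixel and averaged."""
    p = softmax(teacher_logits, T)  # soft targets from the large teacher
    q = softmax(student_logits, T)  # lightweight student's predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

In practice this term is added to the ordinary supervised segmentation loss with a weighting factor, so the student learns from both ground-truth masks and the teacher's softened outputs.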
Affiliation(s)
- Pengguang He
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou, Zhejiang 310058, China
- Key Laboratory of Equipment and Informatization in Environment Controlled Agriculture, Hangzhou, Zhejiang 310058, China
- Ministry of Agriculture and Rural Affairs, Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Hangzhou, Zhejiang 310058, China
- Rui Wu
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou, Zhejiang 310058, China
- Key Laboratory of Equipment and Informatization in Environment Controlled Agriculture, Hangzhou, Zhejiang 310058, China
- Ministry of Agriculture and Rural Affairs, Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Hangzhou, Zhejiang 310058, China
- Da Liu
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou, Zhejiang 310058, China
- Ministry of Agriculture and Rural Affairs, Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Hangzhou, Zhejiang 310058, China
- Jun Dou
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou, Zhejiang 310058, China
- Key Laboratory of Equipment and Informatization in Environment Controlled Agriculture, Hangzhou, Zhejiang 310058, China
- Ministry of Agriculture and Rural Affairs, Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Hangzhou, Zhejiang 310058, China
- Khawar Hayat
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou, Zhejiang 310058, China
- Key Laboratory of Equipment and Informatization in Environment Controlled Agriculture, Hangzhou, Zhejiang 310058, China
- Ministry of Agriculture and Rural Affairs, Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Hangzhou, Zhejiang 310058, China
- Dongmei Shang
- Dandong Ruifeng Animal Pharmaceutical Co., Ltd, Dandong, Liaoning 118000, China
- Jinming Pan
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou, Zhejiang 310058, China
- Key Laboratory of Equipment and Informatization in Environment Controlled Agriculture, Hangzhou, Zhejiang 310058, China
- Ministry of Agriculture and Rural Affairs, Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Hangzhou, Zhejiang 310058, China
- Hongjian Lin
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou, Zhejiang 310058, China
- Key Laboratory of Equipment and Informatization in Environment Controlled Agriculture, Hangzhou, Zhejiang 310058, China
- Ministry of Agriculture and Rural Affairs, Key Laboratory of Intelligent Equipment and Robotics for Agriculture of Zhejiang Province, Hangzhou, Zhejiang 310058, China
10
Simian C, Rossi FB, Marin RH, Barberis L, Kembro JM. High-resolution ethograms, accelerometer recordings, and behavioral time series of Japanese quail. Sci Data 2024; 11:14. [PMID: 38168115 PMCID: PMC10762143 DOI: 10.1038/s41597-023-02820-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/16/2023] [Accepted: 12/05/2023] [Indexed: 01/05/2024] Open
Abstract
Although many small vertebrates are capable of performing high-speed behaviors, most studies continue to focus on low-resolution temporal scales (>>1 s). Herein, we present video recordings, behavioral time series, and the computer software for video analysis of Japanese quail within social groups. Home-boxes were monitored using both top and side video cameras. High-resolution ethograms were developed for the analyses. Pairs of females were assigned either as controls or to one of two methods of accelerometer attachment (patch or backpack). Behavior was recorded for 1 h on each of the first 2 days, sampled at 1 s intervals (days 1 and 2). On day 8, an unfamiliar male was placed in the home-box and its behavior was recorded for the first 10 min, sampled every 1/15 s. Male accelerometer recordings were also obtained. The video recordings and the resulting detailed high-resolution behavioral time series are valuable for reuse in comparative studies of the temporal dynamics of behavior within social environments. In addition, they are necessary for assessing novel machine learning algorithms that could be used to decipher the output of accelerometer recordings.
Affiliation(s)
- Catalina Simian
- Laboratorio de Biología Reproductiva y Evolución, Instituto de Diversidad y Ecología Animal (IDEA, UNC-CONICET), Córdoba, Argentina
- Florencia Belén Rossi
- Instituto de Investigaciones Biológicas y Tecnológicas (IIByT, UNC-CONICET), Córdoba, Argentina
- Universidad Nacional de Córdoba (UNC), Facultad de Ciencias Exactas, Físicas y Naturales, Instituto de Ciencia y Tecnología de los Alimentos (ICTA), Córdoba, Argentina
- Raul Hector Marin
- Instituto de Investigaciones Biológicas y Tecnológicas (IIByT, UNC-CONICET), Córdoba, Argentina
- Universidad Nacional de Córdoba (UNC), Facultad de Ciencias Exactas, Físicas y Naturales, Instituto de Ciencia y Tecnología de los Alimentos (ICTA), Córdoba, Argentina
- Universidad Nacional de Córdoba, Facultad de Ciencias Exactas, Físicas y Naturales, Cátedra de Química Biológica, Córdoba, Argentina
- Lucas Barberis
- Universidad Nacional de Córdoba, Facultad de Matemática, Astronomía Física y Computación, Córdoba, Argentina
- Instituto de Física Enrique Gaviola (IFEG, UNC-CONICET), Córdoba, Argentina
- Jackelyn Melissa Kembro
- Instituto de Investigaciones Biológicas y Tecnológicas (IIByT, UNC-CONICET), Córdoba, Argentina
- Universidad Nacional de Córdoba (UNC), Facultad de Ciencias Exactas, Físicas y Naturales, Instituto de Ciencia y Tecnología de los Alimentos (ICTA), Córdoba, Argentina
- Universidad Nacional de Córdoba, Facultad de Ciencias Exactas, Físicas y Naturales, Cátedra de Química Biológica, Córdoba, Argentina
11
Doornweerd JE, Veerkamp RF, de Klerk B, van der Sluis M, Bouwman AC, Ellen ED, Kootstra G. Tracking individual broilers on video in terms of time and distance. Poult Sci 2024; 103:103185. [PMID: 37980741 PMCID: PMC10663953 DOI: 10.1016/j.psj.2023.103185] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/19/2023] [Revised: 10/06/2023] [Accepted: 10/06/2023] [Indexed: 11/21/2023] Open
Abstract
Tracking group-housed individual broilers using video can provide valuable information on their health, welfare, and performance, allowing breeders to identify novel or indicator traits that aid genetic improvement. However, their similar appearances make tracking individual broilers in a group-housed setting challenging. This study aimed to analyze broiler tracking on video (number of ID-switches, tracking time, and distance) and examined potential tracking errors (ID-losses: location, proximity, kinematics) in an experimental pen to enable broiler locomotion phenotyping. This comprehensive analysis provided insights into the potential and challenges of tracking group-housed broilers on video with regard to phenotyping broiler locomotion. Thirty-nine broilers, of which 35 were non-color-marked, were housed in an experimental pen (1.80 × 2.61 m), and only data from 18 d of age were used. A YOLOv7-tiny model was trained (n = 140), validated (n = 30), and tested (n = 30) on 200 annotated frames to detect the broilers. On the test set, YOLOv7-tiny had a precision, recall, and average precision (@0.5 Intersection over Union threshold) of 0.99. A multi-object tracker (SORT) was implemented and evaluated on ground-truth trajectories of thirteen white broilers based on 136 min of video data (1-min intervals). The number of ID-switches varied from 5 to 20 (mean: 9.92) per ground-truth trajectory, tracking times ranged from 1 (by definition) to 51 min (mean: 12.36), and tracking distances ranged from 0.01 to 17.07 m (mean: 1.89) per tracklet. Tracking errors occurred primarily when broilers were occluded by the drinker, and relatively frequently when broilers were in close proximity (within 10 cm); velocity and acceleration appeared to have a lesser impact on tracking errors. The study establishes a baseline for future research and identifies the potential and challenges of tracking group-housed individual broilers.
The results highlighted the importance of addressing ID-switches, identified potential tracking algorithm improvements, and emphasized the need for an external animal identification system to enable objective, simultaneous and semi-continuous locomotion phenotyping of group-housed individual broilers.
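The per-trajectory ID-switch count reported above can be sketched as follows. This is a simplified definition (a switch is counted whenever the tracker ID assigned to the bird changes), not the exact MOT-challenge metric, and the function name is illustrative:

```python
def count_id_switches(track_ids):
    """Count identity switches along one ground-truth trajectory.

    track_ids: per-interval tracker ID assigned to the bird
    (None = bird not tracked in that interval). A switch is counted
    whenever the assigned ID differs from the last non-None ID.
    """
    switches = 0
    last = None
    for tid in track_ids:
        if tid is None:  # occlusion or missed detection: no ID this interval
            continue
        if last is not None and tid != last:
            switches += 1
        last = tid
    return switches
```

For example, a bird assigned IDs [1, 1, 2, 2, None, 2, 3] over seven intervals yields two switches (1→2 and 2→3); the None gap itself does not count as a switch.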
Affiliation(s)
- J E Doornweerd
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands.
- R F Veerkamp
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- B de Klerk
- Research & Development, Cobb Europe BV, 5831 GH Boxmeer, the Netherlands
- M van der Sluis
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- A C Bouwman
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- E D Ellen
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- G Kootstra
- Farm Technology, Wageningen University & Research, 6700 AA Wageningen, the Netherlands
12
Nakanishi K, Goto H. A New Index for the Quantitative Evaluation of Surgical Invasiveness Based on Perioperative Patients' Behavior Patterns: Machine Learning Approach Using Triaxial Acceleration. JMIR Perioper Med 2023; 6:e50188. [PMID: 37962919 PMCID: PMC10685283 DOI: 10.2196/50188] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2023] [Revised: 09/12/2023] [Accepted: 10/11/2023] [Indexed: 11/15/2023] Open
Abstract
BACKGROUND The minimally invasive nature of thoracoscopic surgery is well recognized; however, the absence of a reliable evaluation method remains challenging. We hypothesized that the speed of postoperative recovery is closely linked to surgical invasiveness, where recovery signifies the patient's behavioral transition back to the preoperative state during the perioperative period. OBJECTIVE This study aims to determine whether machine learning using triaxial acceleration data can effectively capture perioperative behavior changes and establish a quantitative index of surgical invasiveness. METHODS We trained 7 distinct machine learning models using a publicly available human acceleration dataset as supervised data. The 3 top-performing models, as determined by Matthews correlation coefficient scores, were selected to predict patient actions. Two patients who underwent thoracoscopic surgeries of different invasiveness were selected as participants. Acceleration data were collected via chest sensors for 8 hours on preoperative and postoperative hospitalization days. These data were categorized into 4 actions (walking, standing, sitting, and lying down) using the selected models; the actions predicted by the model with intermediate results were adopted as the participants' actions. The daily appearance probability was calculated for each action. Two differences between appearance probabilities (sitting vs standing and lying down vs walking) served as coordinates on the x- and y-axes, and the resulting 2D vector was defined as the index of behavior pattern (iBP) for the day. All daily iBPs were graphed, and the enclosed area and distance between points were calculated and compared between participants to assess the relationship between changes in the indices and invasiveness. RESULTS Patients 1 and 2 underwent lung lobectomy and incisional tumor biopsy, respectively. 
The selected predictive model was a light-gradient boosting model (mean Matthews correlation coefficient 0.98, SD 0.0027; accuracy: 0.98). The acceleration data yielded 548,466 points for patient 1 and 466,407 points for patient 2. The iBPs of patient 1 were [(0.32, 0.19), (-0.098, 0.46), (-0.15, 0.13), (-0.049, 0.22)] and those of patient 2 were [(0.55, 0.30), (0.77, 0.21), (0.60, 0.25), (0.61, 0.31)]. The enclosed areas were 0.077 and 0.0036 for patients 1 and 2, respectively. Notably, the distances for patient 1 were greater than those for patient 2 ({0.44, 0.46, 0.37, 0.26} vs {0.23, 0.0065, 0.059}; P=.03 [Mann-Whitney U test]). CONCLUSIONS The selected machine learning model effectively predicted the actions of the surgical patients with high accuracy. The temporal distribution of action times revealed changes in behavior patterns during the perioperative phase. The proposed index may facilitate the recognition and visualization of perioperative changes in patients and differences in surgical invasiveness.
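The iBP construction and the two derived quantities (enclosed area and inter-point distance) can be sketched as below. The sign convention of the differences is an assumption (the abstract does not specify which probability is subtracted from which), and the enclosed area is computed with the shoelace formula over the daily points in order:

```python
import math

def ibp(p_sit, p_stand, p_lie, p_walk):
    """Index of behavior pattern for one day: a 2D vector built from
    differences of daily appearance probabilities (sign convention assumed)."""
    return (p_sit - p_stand, p_lie - p_walk)

def polygon_area(points):
    """Enclosed area of the polygon traced by the daily iBPs
    (shoelace formula; points taken in day order)."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def step_distances(points):
    """Euclidean distance between consecutive daily iBPs."""
    return [math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)]
```

With the published iBPs, these helpers would reproduce the kind of area and distance comparisons the authors report (a larger enclosed area and larger day-to-day jumps indicating a bigger behavioral excursion from baseline).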
Affiliation(s)
- Kozo Nakanishi
- Department of General Thoracic Surgery, National Hospital Organization Saitama Hospital, Wako Saitama, Japan
- Hidenori Goto
- Department of General Thoracic Surgery, National Hospital Organization Saitama Hospital, Wako Saitama, Japan
13
Adler CAB, Duhra D, Shynkaruk T, Schwean-Lardner K. Research Note: Validation of a low-cost accelerometer to measure physical activity in 30 to 32-d-old male Ross 708 broilers. Poult Sci 2023; 102:102966. [PMID: 37566965 PMCID: PMC10440556 DOI: 10.1016/j.psj.2023.102966] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2023] [Revised: 07/21/2023] [Accepted: 07/22/2023] [Indexed: 08/13/2023] Open
Abstract
Poultry activity measurements are often associated with expensive equipment or time-consuming behavior observations. Since low-cost accelerometers are available, the current study validated the FitBark (FitBark 2, FitBark Inc., Kansas City, MO) accelerometer for use on 30 to 32-d-old male Ross 708 broilers. The FitBark provides aggregated activity levels based on tri-axial accelerometer technology. Broilers were housed in 5 rooms, each divided into twelve 2 × 2.3 m pens (60 birds per pen, 31 kg/m² final density). From 30 to 32 d, 1 broiler per room (n = 5) was randomly selected and equipped with a 13 g FitBark. Elastic loops were placed around the wings to secure the FitBark medially on the back. During the same time, validity was assessed via ceiling-mounted video cameras. The video recordings were analyzed using 20-min continuous sampling during the photo phase at 8 time periods per bird. Behavior was assessed every second using an ethogram (9,600 data points per bird). In the first step, the FitBark data were matched and correlated with the corresponding video-based observed activity (OA) data. The FitBark and OA data were not normally distributed (1-sample KS test, all n = 800, ZFitBark = 0.21, ZOA = 0.24, all P < 0.001). Therefore, data were transformed, and a repeated-measures correlation was performed for each bird, showing a positive correlation between the FitBark and OA data (rrm = 0.76, 95% CI = 0.72-0.78, df = 794, P < 0.001). In the second step, sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy were calculated. The FitBark correctly identified 91% (sensitivity) of the active and 74% (specificity) of the inactive birds. When the FitBark detected an active or inactive bird, there was a probability of 89% (PPV) and 78% (NPV), respectively, that the bird was observed to be active or inactive based on the OA data. Accuracy was 86%. Overall, the FitBark is useful for 1-min interval activity measurements in 30 to 32-d-old male Ross 708 broilers. Further research should focus on validating the FitBark at other ages and in different poultry species.
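The five validation metrics used in the second step follow directly from a 2 × 2 confusion matrix of predicted vs. observed activity; a small helper (names illustrative) might look like:

```python
def confusion_metrics(tp, fp, tn, fn):
    """Validation metrics for a binary active/inactive classifier,
    given true/false positive and negative counts."""
    return {
        "sensitivity": tp / (tp + fn),  # active birds correctly flagged
        "specificity": tn / (tn + fp),  # inactive birds correctly flagged
        "ppv": tp / (tp + fp),          # P(observed active | predicted active)
        "npv": tn / (tn + fn),          # P(observed inactive | predicted inactive)
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }
```

For instance, `confusion_metrics(8, 2, 6, 4)` gives a sensitivity of 0.67, specificity of 0.75, PPV of 0.80, NPV of 0.60, and accuracy of 0.70, mirroring how the abstract's 91%/74%/89%/78%/86% figures are derived from the second-by-second agreement counts.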
Affiliation(s)
- C A B Adler
- Department of Animal and Poultry Science, University of Saskatchewan, Saskatoon, Saskatchewan S7N 5A8, Canada
- D Duhra
- Department of Animal and Poultry Science, University of Saskatchewan, Saskatoon, Saskatchewan S7N 5A8, Canada
- T Shynkaruk
- Department of Animal and Poultry Science, University of Saskatchewan, Saskatoon, Saskatchewan S7N 5A8, Canada
- K Schwean-Lardner
- Department of Animal and Poultry Science, University of Saskatchewan, Saskatoon, Saskatchewan S7N 5A8, Canada
14
Zhao S, Bai Z, Meng L, Han G, Duan E. Pose Estimation and Behavior Classification of Jinling White Duck Based on Improved HRNet. Animals (Basel) 2023; 13:2878. [PMID: 37760278 PMCID: PMC10525901 DOI: 10.3390/ani13182878] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/19/2023] [Revised: 09/03/2023] [Accepted: 09/05/2023] [Indexed: 09/29/2023] Open
Abstract
In breeding ducks, obtaining pose information is vital for perceiving their physiological health, ensuring welfare in breeding, and monitoring environmental comfort. This paper proposes a pose estimation method combining HRNet and CBAM to achieve automatic and accurate detection of multiple duck poses. Through comparison, HRNet-32 is identified as the optimal backbone for duck pose estimation. On this basis, multiple CBAM modules are densely embedded into the HRNet-32 network to obtain a pose estimation model based on HRNet-32-CBAM, realizing accurate detection and association of eight keypoints across six different behaviors. Furthermore, the model's generalization ability is tested under different illumination conditions, and its comprehensive detection abilities are evaluated on Cherry Valley ducklings of 12 and 24 days of age. Moreover, the model is compared with mainstream pose estimation methods to reveal its advantages and disadvantages, and its real-time performance is tested using images of 256 × 256, 512 × 512, and 728 × 728 pixels. The experimental results indicate that, on the duck pose estimation dataset, the proposed method achieves an average precision (AP) of 0.943, has strong generalization ability, and can estimate multiple duck poses in real time across different ages, breeds, and farming modes. This study can provide a technical reference and a basis for the intelligent farming of poultry.
Affiliation(s)
- Shida Zhao
- Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China
- Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
- Zongchun Bai
- Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China
- Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
- Lili Meng
- Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China
- Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
- School of Civil Engineering, Engineering Campus, Universiti Sains Malaysia, Nibong Tebal 14300, Malaysia
- Guofeng Han
- Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China
- Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
- Enze Duan
- Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China
- Key Laboratory of Protected Agriculture Engineering in the Middle and Lower Reaches of Yangtze River, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
15
Jiang B, Tang W, Cui L, Deng X. Precision Livestock Farming Research: A Global Scientometric Review. Animals (Basel) 2023; 13:2096. [PMID: 37443894 DOI: 10.3390/ani13132096] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2023] [Revised: 06/16/2023] [Accepted: 06/21/2023] [Indexed: 07/15/2023] Open
Abstract
Precision livestock farming (PLF) utilises information technology to continuously monitor and manage livestock in real-time, which can improve individual animal health, welfare, productivity and the environmental impact of animal husbandry, contributing to the economic, social and environmental sustainability of livestock farming. PLF has emerged as a pivotal area of multidisciplinary interest. In order to clarify the knowledge evolution and hotspot replacement of PLF research, based on the relevant data from the Web of Science database from 1973 to 2023, this study analyzed the main characteristics, research cores and hot topics of PLF research via CiteSpace. The results point to a significant increase in studies on PLF, with countries having advanced livestock farming systems in Europe and America publishing frequently and collaborating closely across borders. Universities in various countries have been leading the research, with Daniel Berckmans serving as the academic leader. Research primarily focuses on animal science, veterinary science, computer science, agricultural engineering, and environmental science. Current research hotspots center around precision dairy and cattle technology, intelligent systems, and animal behavior, with deep learning, accelerometer, automatic milking systems, lameness, estrus detection, and electronic identification being the main research directions, and deep learning and machine learning represent the forefront of current research. Research hot topics mainly include social science in PLF, the environmental impact of PLF, information technology in PLF, and animal welfare in PLF. Future research in PLF should prioritize inter-institutional and inter-scholar communication and cooperation, integration of multidisciplinary and multimethod research approaches, and utilization of deep learning and machine learning. 
Furthermore, social science issues should be given due attention in PLF, and the integration of intelligent technologies in animal management should be strengthened, with a focus on animal welfare and the environmental impact of animal husbandry, to promote its sustainable development.
Affiliation(s)
- Bing Jiang
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
- Development Research Center of Modern Agriculture, Northeast Agricultural University, Harbin 150030, China
- Wenjie Tang
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
- Lihang Cui
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
- Xiaoshang Deng
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
16
Fujinami K, Takuno R, Sato I, Shimmura T. Evaluating Behavior Recognition Pipeline of Laying Hens Using Wearable Inertial Sensors. SENSORS (BASEL, SWITZERLAND) 2023; 23:s23115077. [PMID: 37299804 DOI: 10.3390/s23115077] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/30/2023] [Revised: 05/11/2023] [Accepted: 05/23/2023] [Indexed: 06/12/2023]
Abstract
Recently, animal welfare has gained worldwide attention. The concept of animal welfare encompasses the physical and mental well-being of animals. Rearing layers in battery cages (conventional cages) may violate their instinctive behaviors and harm their health, raising animal welfare concerns. Therefore, welfare-oriented rearing systems have been explored to improve welfare while maintaining productivity. In this study, we explore a behavior recognition system using a wearable inertial sensor to improve the rearing system through continuous monitoring and quantification of behaviors. Supervised machine learning recognizes 12 hen behaviors, with various parameters in the processing pipeline considered, including the classifier, sampling frequency, window length, data-imbalance handling, and sensor modality. A reference configuration uses a multi-layer perceptron as the classifier; feature vectors are calculated from the accelerometer and angular-velocity sensor in a 1.28 s window sampled at 100 Hz; the training data are unbalanced. The accompanying results should allow for a more informed design of similar systems, estimation of the impact of specific constraints on parameters, and recognition of specific behaviors.
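The reference configuration (1.28 s windows at 100 Hz, i.e., 128 samples per window) implies a standard windowing-plus-features front end before classification. A minimal single-axis sketch, with illustrative feature choices, is:

```python
import math

def sliding_windows(samples, win_len=128, hop=64):
    """Split a signal into fixed-length windows: 1.28 s at 100 Hz
    when win_len=128; hop=64 gives 50% overlap."""
    return [samples[i:i + win_len]
            for i in range(0, len(samples) - win_len + 1, hop)]

def window_features(window):
    """Simple per-window features (mean, std, min, max) of one axis;
    real pipelines concatenate such features across all sensor axes."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return [mean, math.sqrt(var), min(window), max(window)]
```

Each feature vector would then be fed to the classifier (a multi-layer perceptron in the paper's reference configuration); varying `win_len` and the sampling rate is exactly the kind of pipeline-parameter sweep the study evaluates.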
Affiliation(s)
- Kaori Fujinami
- Division of Advanced Information Technology and Computer Science, Institute of Engineering, Tokyo University of Agriculture and Technology, Tokyo 184-8588, Japan
- Department of Bio-Functions and Systems Science, Graduate School of Bio-Applications and Systems Engineering, Tokyo University of Agriculture and Technology, Tokyo 184-8588, Japan
- Ryo Takuno
- Department of Bio-Functions and Systems Science, Graduate School of Bio-Applications and Systems Engineering, Tokyo University of Agriculture and Technology, Tokyo 184-8588, Japan
- Itsufumi Sato
- Department of Agriculture, Graduate School of Agriculture, Tokyo University of Agriculture and Technology, Tokyo 183-8509, Japan
- Tsuyoshi Shimmura
- Institute of Global Innovation Research, Tokyo University of Agriculture and Technology, Tokyo 183-8509, Japan
17
Sur M, Hall JC, Brandt J, Astell M, Poessel SA, Katzner TE. Supervised versus unsupervised approaches to classification of accelerometry data. Ecol Evol 2023; 13:e10035. [PMID: 37206689 PMCID: PMC10191777 DOI: 10.1002/ece3.10035] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/04/2022] [Revised: 03/31/2023] [Accepted: 04/05/2023] [Indexed: 05/21/2023] Open
Abstract
Sophisticated animal-borne sensor systems are increasingly providing novel insight into how animals behave and move. Despite their widespread use in ecology, the diversity and expanding quality and quantity of data they produce have created a need for robust analytical methods for biological interpretation. Machine learning tools are often used to meet this need. However, their relative effectiveness is not well known and, in the case of unsupervised tools, given that they do not use validation data, their accuracy can be difficult to assess. We evaluated the effectiveness of supervised (n = 6), semi-supervised (n = 1), and unsupervised (n = 2) approaches to analyzing accelerometry data collected from critically endangered California condors (Gymnogyps californianus). Unsupervised K-means and EM (expectation-maximization) clustering approaches performed poorly, with adequate classification accuracies of <0.8 but very low values for kappa statistics (range: -0.02 to 0.06). The semi-supervised nearest mean classifier was moderately effective at classification, with an overall classification accuracy of 0.61 but effective classification only of two of the four behavioral classes. Supervised random forest (RF) and k-nearest neighbor (kNN) machine learning models were most effective at classification across all behavior types, with overall accuracies >0.81. Kappa statistics were also highest for RF and kNN, in most cases substantially greater than for other modeling approaches. Unsupervised modeling, which is commonly used for the classification of a priori-defined behaviors in telemetry data, can provide useful information but likely is instead better suited to post hoc definition of generalized behavioral states. This work also shows the potential for substantial variation in classification accuracy among different machine learning approaches and among different metrics of accuracy. 
As such, when analyzing biotelemetry data, best practices appear to call for the evaluation of several machine learning techniques and several measures of accuracy for each dataset under consideration.
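Since the comparison above leans on the kappa statistic as much as on raw accuracy, a self-contained computation of Cohen's kappa (the chance-corrected agreement measure; function name illustrative) is:

```python
def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: agreement between predicted and true behavior
    labels, corrected for the agreement expected by chance.
    Undefined (division by zero) when chance agreement is exactly 1."""
    n = len(y_true)
    labels = sorted(set(y_true) | set(y_pred))
    po = sum(t == p for t, p in zip(y_true, y_pred)) / n  # observed agreement
    pe = sum((y_true.count(c) / n) * (y_pred.count(c) / n)
             for c in labels)                              # chance agreement
    return (po - pe) / (1 - pe)
```

This makes the paper's point concrete: a classifier can score a reasonable accuracy (high `po`) while kappa stays near zero whenever its predictions barely beat the chance agreement `pe`, which is what the unsupervised clusterings showed.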
Affiliation(s)
- Maitreyi Sur
- Conservation Science Global, Inc., West Cape May, New Jersey, USA
- Present address: Radboud Institute for Biological and Environmental Sciences (RIBES), Radboud University, Nijmegen, The Netherlands
- Jonathan C. Hall
- Department of Biology, Eastern Michigan University, Ypsilanti, Michigan, USA
- Joseph Brandt
- U.S. Fish and Wildlife Service, Hopper Mountain National Wildlife Refuge Complex, Ventura, California, USA
- Molly Astell
- U.S. Fish and Wildlife Service, Hopper Mountain National Wildlife Refuge Complex, Ventura, California, USA
- Department of Biology, Boise State University, Boise, Idaho, USA
- Sharon A. Poessel
- U.S. Geological Survey, Forest and Rangeland Ecosystem Science Center, Boise, Idaho, USA
- Todd E. Katzner
- U.S. Geological Survey, Forest and Rangeland Ecosystem Science Center, Boise, Idaho, USA
18
Horna F, Leandro GDS, Bícego KC, Macari M, Reis MP, Cerrate S, Sakomura NK. Energy cost of physical activities in growing broilers. Br Poult Sci 2023:1-8. [PMID: 36947419 DOI: 10.1080/00071668.2023.2191309] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 03/23/2023]
Abstract
The time-energy budget method estimates the energy used for physical activity (PA) based on recorded daily patterns and indirect calorimetry. Nevertheless, information about the individual energy cost of each type of PA is not available, so this study estimated the energy cost of PA for growing broilers. An indirect calorimetry system for single birds was constructed to measure the variation in the rate of O2 consumption (V̇O2, L/min) and the rate of CO2 production (V̇CO2, L/min) produced by these PAs. A total of five birds, with body weights (BW) ranging from 1.5 to 2.5 kg, were used in a replicated trial to measure the increase in heat production (HP) above resting levels as a result of PA. The procedure in the chamber was divided into five steps: (1) initial baselining, (2) resting metabolic rate, (3) PA such as feeding, drinking, and other standing activities, (4) removal of the gas exchange produced in step 3, and (5) final baselining. The PA was recorded using a video camera fixed at the top of the chamber (and outside it). The areas under the V̇CO2 and V̇O2 curves were used to calculate the CO2 production (vCO2, L) and O2 consumption (vO2, L). The HP (cal/kg^0.75) was then calculated according to the Brouwer equation. Two observers analyzed the video records to estimate the time spent on each PA (seconds and frequency). To obtain the energetic coefficients, HP was regressed on the time spent performing each PA, allowing estimation of the energy costs of eating, drinking, and standing activities, which were 0.607, 0.352, and 0.938 cal/kg^0.75/s, respectively.
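The abstract cites the Brouwer equation without reproducing it. A sketch of the abbreviated form commonly used in poultry calorimetry (the methane and urinary-nitrogen terms of the full Brouwer equation are dropped, as is typical for birds; whether the authors used this exact abbreviation is an assumption), together with the metabolic-weight scaling, is:

```python
def brouwer_hp_kcal(v_o2, v_co2):
    """Heat production (kcal) from gas exchange via the abbreviated
    Brouwer equation; v_o2 and v_co2 are total liters of O2 consumed
    and CO2 produced. Methane and urinary-N terms are omitted."""
    return 3.866 * v_o2 + 1.200 * v_co2

def metabolic_hp(hp_kcal, bw_kg):
    """Express HP per unit metabolic body weight (kcal/kg^0.75),
    the scaling used for the activity coefficients in the study."""
    return hp_kcal / bw_kg ** 0.75
```

Dividing the activity-attributable HP (above resting) by the observed seconds spent on each activity would then yield per-second energetic coefficients of the kind reported (e.g., 0.607 cal/kg^0.75/s for eating).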
Affiliation(s)
- Freddy Horna
- São Paulo State University (UNESP), School of Agricultural and Veterinarian Sciences, Jaboticabal, São Paulo, Brazil
- Gabriela Da Silva Leandro
- São Paulo State University (UNESP), School of Agricultural and Veterinarian Sciences, Jaboticabal, São Paulo, Brazil
- Kênia Cardoso Bícego
- São Paulo State University (UNESP), School of Agricultural and Veterinarian Sciences, Jaboticabal, São Paulo, Brazil
- Marcos Macari
- São Paulo State University (UNESP), School of Agricultural and Veterinarian Sciences, Jaboticabal, São Paulo, Brazil
- Matheus Paula Reis
- São Paulo State University (UNESP), School of Agricultural and Veterinarian Sciences, Jaboticabal, São Paulo, Brazil
- Nilva Kazue Sakomura
- São Paulo State University (UNESP), School of Agricultural and Veterinarian Sciences, Jaboticabal, São Paulo, Brazil
19
Doornweerd JE, Kootstra G, Veerkamp RF, de Klerk B, Fodor I, van der Sluis M, Bouwman AC, Ellen ED. Passive radio frequency identification and video tracking for the determination of location and movement of broilers. Poult Sci 2023; 102:102412. [PMID: 36621101] [PMCID: PMC9841275] [DOI: 10.1016/j.psj.2022.102412]
Abstract
Phenotypes on individual animals are required for breeding programs to be able to select for traits. However, phenotyping individual animals can be difficult and time-consuming, especially for traits related to health, welfare, and performance. Individual broiler behavior could serve as a proxy for these traits when recorded automatically and reliably on many animals. Sensors could record individual broiler behavior, yet different sensors can differ in their assessment. In this study a comparison was made between a passive radio frequency identification (RFID) system (grid of antennas underneath the pen) and video tracking for the determination of location and movement of 3 color-marked broilers at d 18. Furthermore, a systems comparison of derived behavioral metrics such as space usage, locomotion activity and apparent feeding and drinking behavior was made. Color-marked broilers simplified the computer vision task for YOLOv5 to detect, track, and identify the animals. Animal locations derived from the RFID-system and based on video were largely in agreement. Most location differences (77.5%) were within the mean radius of the antennas' enclosing circle (≤128 px, 28.15 cm), and 95.3% of the differences were within a one antenna difference (≤256 px, 56.30 cm). Animal movement was not always registered by the RFID-system whereas video was sensitive to detection noise and the animal's behavior (e.g., pecking). The method used to determine location and the systems' sensitivities to movement led to differences in behavioral metrics. Behavioral metrics derived from video are likely more accurate than RFID-system derived behavioral metrics. However, at present, only the RFID-system can provide individual identification for non-color marked broilers. A combination of verifiable and detailed video with the unique identification of RFID could make it possible to identify, describe, and quantify a wide range of individual broiler behaviors.
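The agreement thresholds in the abstract above can be sketched as a small helper. Only the reported scale (128 px ≈ 28.15 cm, the mean radius of an antenna's enclosing circle) and the one-antenna threshold (256 px) come from the abstract; the function names and category labels are hypothetical:

```python
# Scale derived from the reported antenna geometry: 128 px corresponds to 28.15 cm.
CM_PER_PX = 28.15 / 128

def px_to_cm(diff_px):
    """Convert an RFID-vs-video location difference from pixels to centimetres."""
    return diff_px * CM_PER_PX

def classify_location_difference(diff_px):
    """Bin a location difference against the antenna geometry thresholds."""
    if diff_px <= 128:   # within the mean radius of the antenna's enclosing circle
        return "within antenna radius"
    if diff_px <= 256:   # within a one-antenna difference
        return "within one antenna difference"
    return "larger disagreement"
```

Under this scheme, the abstract's figures mean 77.5% of differences fell in the first bin and 95.3% within the first two.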
Affiliation(s)
- J E Doornweerd
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- G Kootstra
- Farm Technology, Wageningen University & Research, 6700 AA Wageningen, the Netherlands
- R F Veerkamp
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- B de Klerk
- Research & Development, Cobb Europe BV, 5831 GH Boxmeer, the Netherlands
- I Fodor
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- M van der Sluis
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- A C Bouwman
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
- E D Ellen
- Animal Breeding and Genomics, Wageningen University & Research, 6700 AH Wageningen, the Netherlands
20
EFSA AHAW Panel (EFSA Panel on Animal Health and Welfare), Nielsen SS, Alvarez J, Bicout DJ, Calistri P, Canali E, Drewe JA, Garin‐Bastuji B, Gonzales Rojas JL, Schmidt CG, Herskin MS, Miranda Chueca MÁ, Padalino B, Pasquali P, Roberts HC, Spoolder H, Stahl K, Velarde A, Viltrop A, Winckler C, Tiemann I, de Jong I, Gebhardt‐Henrich SG, Keeling L, Riber AB, Ashe S, Candiani D, García Matas R, Hempen M, Mosbach‐Schulz O, Rojo Gimeno C, Van der Stede Y, Vitali M, Bailly‐Caumette E, Michel V. Welfare of broilers on farm. EFSA J 2023; 21:e07788. [PMID: 36824680] [PMCID: PMC9941850] [DOI: 10.2903/j.efsa.2023.7788]
Abstract
This Scientific Opinion considers the welfare of domestic fowl (Gallus gallus) related to the production of meat (broilers) and includes the keeping of day-old chicks, broiler breeders, and broiler chickens. Currently used husbandry systems in the EU are described. Overall, 19 highly relevant welfare consequences (WCs) were identified based on severity, duration and frequency of occurrence: 'bone lesions', 'cold stress', 'gastro-enteric disorders', 'group stress', 'handling stress', 'heat stress', 'isolation stress', 'inability to perform comfort behaviour', 'inability to perform exploratory or foraging behaviour', 'inability to avoid unwanted sexual behaviour', 'locomotory disorders', 'prolonged hunger', 'prolonged thirst', 'predation stress', 'restriction of movement', 'resting problems', 'sensory under- and overstimulation', 'soft tissue and integument damage' and 'umbilical disorders'. These WCs and their animal-based measures (ABMs) that can identify them are described in detail. A variety of hazards related to the different husbandry systems were identified as well as ABMs for assessing the different WCs. Measures to prevent or correct the hazards and/or mitigate each of the WCs are listed. Recommendations are provided on quantitative or qualitative criteria to answer specific questions on the welfare of broilers and related to genetic selection, temperature, feed and water restriction, use of cages, light, air quality and mutilations in breeders such as beak trimming, de-toeing and comb dubbing. In addition, minimal requirements (e.g. stocking density, group size, nests, provision of litter, perches and platforms, drinkers and feeders, of covered veranda and outdoor range) for an enclosure for keeping broiler chickens (fast-growing, slower-growing and broiler breeders) are recommended. Finally, 'total mortality', 'wounds', 'carcass condemnation' and 'footpad dermatitis' are proposed as indicators for monitoring at slaughter the welfare of broilers on-farm.
21
Keypoint Detection for Injury Identification during Turkey Husbandry Using Neural Networks. Sensors 2022; 22:s22145188. [PMID: 35890870] [PMCID: PMC9319281] [DOI: 10.3390/s22145188]
Abstract
Injurious pecking against conspecifics is a serious problem in turkey husbandry. Bloody injuries act as a trigger mechanism that induces further pecking, and timely detection and intervention can prevent severe animal welfare impairments and costly losses. Thus, the overarching aim is to develop a camera-based system to monitor the flock and detect injuries using neural networks. In a preliminary study, images of turkeys were annotated by labelling potential injuries, and these were used to train a network for injury detection. Here, we applied a keypoint detection model to provide more information on animal position and to indicate injury location. To this end, seven turkey keypoints were defined, and 244 images (showing 7,660 birds) were manually annotated. Two state-of-the-art approaches for pose estimation were adjusted, and their results were compared. Subsequently, the better-performing keypoint detection model (HRNet-W48) was combined with the segmentation model for injury detection, so that individual injuries could be classified using, for example, "near tail" or "near head" labels. In summary, the keypoint detection showed good results and could clearly differentiate between individual animals even in crowded situations.
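A nearest-keypoint assignment like the "near head"/"near tail" labelling described above can be sketched as follows. The keypoint names, coordinates, and function are hypothetical illustrations; the paper's actual pipeline combines HRNet-W48 keypoints with a segmentation model:

```python
import math

def nearest_keypoint_label(injury_xy, keypoints_xy):
    """Return 'near <keypoint>' for the detected keypoint closest to an injury.

    injury_xy: (x, y) centre of a detected injury region.
    keypoints_xy: dict mapping keypoint name -> (x, y) for one bird.
    """
    name = min(keypoints_xy, key=lambda k: math.dist(injury_xy, keypoints_xy[k]))
    return f"near {name}"

# Hypothetical coordinates for one bird: injury at (10, 12), head at (9, 11).
label = nearest_keypoint_label((10, 12), {"head": (9, 11), "tail": (40, 30)})
```

Tying each injury to the nearest per-bird keypoint is what lets the system report injury locations per individual animal even when birds crowd together.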