1. Pooley CM, Marion G, Prentice J, Pong-Wong R, Bishop SC, Doeschl-Wilson A. SIRE 2.0: a novel method for estimating polygenic host effects underlying infectious disease transmission, and analytical expressions for prediction accuracies. Genet Sel Evol 2025; 57:17. PMID: 40169992; PMCID: PMC11963337; DOI: 10.1186/s12711-025-00956-4.
Abstract
BACKGROUND Genetic selection of individuals that are less susceptible to infection, less infectious once infected, and faster to recover offers an effective and long-lasting solution for reducing the incidence and impact of infectious diseases in farmed animals. However, computational methods for simultaneously estimating genetic parameters for host susceptibility, infectivity and recoverability from real-world data have been lacking. Our previously developed methodology and software tool SIRE 1.0 (Susceptibility, Infectivity and Recoverability Estimator) allows estimation of host genetic effects of a single nucleotide polymorphism (SNP), or of other fixed effects (e.g. breed, vaccination status), on these three host traits using individual disease data typically available from field studies and challenge experiments. SIRE 1.0, however, cannot estimate genetic parameters for these traits in the likely case of underlying polygenic control. RESULTS This paper introduces novel Bayesian methodology and a new software tool, SIRE 2.0, for estimating polygenic contributions (i.e. variance components and additive genetic effects) for host susceptibility, infectivity and recoverability from temporal epidemic data, assuming that pedigree or genomic relationships are known. Analytical expressions for prediction accuracies (PAs) for these traits are derived for simplified scenarios, revealing their dependence on genetic and phenotypic variances and on the distribution of related individuals within and between contact groups. PAs for infectivity are found to be critically dependent on the size of contact groups. Validation of the methodology with data from simulated epidemics demonstrates good agreement between numerically generated PAs and analytical predictions. Genetic correlations between infectivity and other traits substantially increase trait PAs. Incomplete data (e.g. time censoring or infrequent sampling) generally yield only small reductions in PAs, except when infection times are completely unknown, which results in a substantial reduction. CONCLUSIONS The method presented can estimate genetic parameters for host susceptibility, infectivity and recoverability from individual disease records. The freely available SIRE 2.0 software provides a valuable extension to SIRE 1.0 for estimating host polygenic effects underlying infectious disease transmission. This tool will open up new possibilities for the analysis and quantification of genetic determinants of disease dynamics.
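The prediction accuracies discussed above are conventionally defined as the correlation between true and estimated breeding values. A minimal sketch of that computation (illustrative only; the function name and simulated values are assumptions, not from the paper):

```python
import numpy as np

def prediction_accuracy(true_bv, estimated_bv):
    """Prediction accuracy (PA): Pearson correlation between true and
    estimated breeding values for a trait."""
    return np.corrcoef(np.asarray(true_bv, float),
                       np.asarray(estimated_bv, float))[0, 1]

# Simulated example: estimates are true breeding values plus estimation noise.
rng = np.random.default_rng(0)
true_bv = rng.normal(size=500)
estimated_bv = true_bv + rng.normal(scale=0.5, size=500)
pa = prediction_accuracy(true_bv, estimated_bv)
```

With estimation noise at half the genetic standard deviation, the expected PA is 1/sqrt(1 + 0.25) ≈ 0.89, which the simulated value approximates.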
Affiliation(s)
- Christopher M Pooley
  - Biomathematics and Statistics Scotland, James Clerk Maxwell Building, The King's Buildings, Peter Guthrie Tait Road, Edinburgh, EH9 3FD, UK
  - The Roslin Institute, The University of Edinburgh, Easter Bush Campus, Midlothian, EH25 9RG, UK
- Glenn Marion
  - Biomathematics and Statistics Scotland, James Clerk Maxwell Building, The King's Buildings, Peter Guthrie Tait Road, Edinburgh, EH9 3FD, UK
- Jamie Prentice
  - The Roslin Institute, The University of Edinburgh, Easter Bush Campus, Midlothian, EH25 9RG, UK
- Ricardo Pong-Wong
  - The Roslin Institute, The University of Edinburgh, Easter Bush Campus, Midlothian, EH25 9RG, UK
- Stephen C Bishop
  - The Roslin Institute, The University of Edinburgh, Easter Bush Campus, Midlothian, EH25 9RG, UK
- Andrea Doeschl-Wilson
  - The Roslin Institute, The University of Edinburgh, Easter Bush Campus, Midlothian, EH25 9RG, UK

2. Liao Y, Qiu Y, Liu B, Qin Y, Wang Y, Wu Z, Xu L, Feng A. YOLOv8A-SD: A Segmentation-Detection Algorithm for Overlooking Scenes in Pig Farms. Animals (Basel) 2025; 15:1000. PMID: 40218393; PMCID: PMC11987837; DOI: 10.3390/ani15071000.
Abstract
A refined YOLOv8A-SD model is introduced to address pig detection challenges in aerial surveillance of pig farms. The model incorporates the ADown attention mechanism and a dual-task strategy combining detection and segmentation. Testing used top-view footage from a large-scale pig farm in Sichuan, with 924 images for detection training and 216 for detection validation, plus 2985 images for segmentation training and 1512 for segmentation validation. The model achieved 96.1% precision and 96.3% mAP50 in detection while maintaining strong segmentation performance (IoU: 83.1%). A key finding is that training on original images while applying segmentation preprocessing at test time provides optimal results, achieving high counting accuracy (a mean of 25.05 vs. an actual 25.09 pigs) and simplifying practical deployment. The research demonstrates YOLOv8A-SD's effectiveness in complex farming environments, providing reliable monitoring capabilities for intelligent farm management applications.
Affiliation(s)
- Yiran Liao
  - College of Mechanical and Electrical Engineering, Sichuan Agricultural University, Ya’an 625000, China
- Yipeng Qiu
  - College of Information Engineering, Sichuan Agricultural University, Ya’an 625000, China
- Bo Liu
  - Sichuan Academy of Agricultural Mechanisation Sciences, Ya’an 610000, China
- Yibin Qin
  - College of Mechanical and Electrical Engineering, Sichuan Agricultural University, Ya’an 625000, China
- Yuchao Wang
  - College of Mechanical and Electrical Engineering, Sichuan Agricultural University, Ya’an 625000, China
- Zhijun Wu
  - College of Mechanical and Electrical Engineering, Sichuan Agricultural University, Ya’an 625000, China
- Lijia Xu
  - College of Mechanical and Electrical Engineering, Sichuan Agricultural University, Ya’an 625000, China
- Ao Feng
  - College of Mechanical and Electrical Engineering, Sichuan Agricultural University, Ya’an 625000, China

3. Xiang R, Zhang Y, Lin H, Fu Y, Rao X, Pan J, Pan C. Body Temperature Detection of Group-Housed Pigs Based on the Pairing of Left and Right Ear Roots in Thermal Images. Animals (Basel) 2025; 15:642. PMID: 40075925; PMCID: PMC11898202; DOI: 10.3390/ani15050642.
Abstract
Body temperature is a critical indicator of pig health. This study proposes a non-contact method for detecting the body temperature of group-housed pigs by extracting temperature data from thermal images of ear roots. Thermal images of the drinking trough area were captured using a thermal camera, with real-time data transmitted to a monitoring room via optical fibers. The YOLO v11m-OBB model was used to detect ear root areas with oriented bounding boxes, while a novel two-stage left and right ear root pairing algorithm (YOLO TEPA-OBB) paired the ear roots of individual pigs using center distance clustering and angular relationships in a polar coordinate system. The maximum temperature of the ear roots was extracted to represent body temperature. Experimental results based on 749 ear roots show that YOLO TEPA-OBB achieves 98.7% precision, 98.4% recall, and 98.7% mean average precision (mAP) in detecting ear roots, with an ear root pairing accuracy of 98.1%. The Pearson correlation coefficient (r) between predicted and reference temperatures is 0.989, with a mean bias of 0.014 °C and a standard deviation of 0.103 °C. This research facilitates real-time body temperature monitoring and precise health management for group-housed pigs.
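The final step of the pipeline above, taking the ear-root maximum as the body-temperature proxy, can be sketched as follows. Axis-aligned boxes stand in for the paper's oriented boxes, and the function name, frame, and values are illustrative assumptions:

```python
import numpy as np

def ear_root_body_temperature(thermal_frame, left_box, right_box):
    """Return the maximum radiometric temperature over the paired
    ear-root regions as a body-temperature proxy. Boxes are
    (x0, y0, x1, y1) in pixel coordinates; axis-aligned here for
    simplicity, whereas the paper uses oriented bounding boxes."""
    temps = []
    for x0, y0, x1, y1 in (left_box, right_box):
        roi = thermal_frame[y0:y1, x0:x1]  # crop the ear-root region
        temps.append(roi.max())            # hottest pixel in the region
    return max(temps)

# Hypothetical 8x8 radiometric frame in deg C with one warm ear-root pixel.
frame = np.full((8, 8), 35.0)
frame[2, 3] = 38.7
t = ear_root_body_temperature(frame, (2, 1, 5, 4), (5, 5, 8, 8))  # 38.7
```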
Affiliation(s)
- Rong Xiang
  - College of Quality and Standardization, China Jiliang University, Hangzhou 310018, China
- Yi Zhang
  - College of Quality and Standardization, China Jiliang University, Hangzhou 310018, China
- Hongjian Lin
  - College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Yingchun Fu
  - College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Xiuqin Rao
  - College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Jinming Pan
  - College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Chenghao Pan
  - College of Quality and Standardization, China Jiliang University, Hangzhou 310018, China

4. Reza MN, Lee KH, Habineza E, Samsuzzaman, Kyoung H, Choi YK, Kim G, Chung SO. RGB-based machine vision for enhanced pig disease symptoms monitoring and health management: a review. J Anim Sci Technol 2025; 67:17-42. PMID: 39974778; PMCID: PMC11833201; DOI: 10.5187/jast.2024.e111.
Abstract
The growing demands of sustainable, efficient, and welfare-conscious pig husbandry have necessitated the adoption of advanced technologies. Among these, RGB imaging and machine vision technology may offer a promising solution for early disease detection and proactive disease management in advanced pig husbandry practices. This review explores innovative applications for monitoring disease symptoms by assessing features that directly or indirectly indicate disease risk, as well as for tracking body weight and overall health. Machine vision and image processing algorithms enable the real-time detection of subtle changes in pig appearance and behavior that may signify potential health issues. Key indicators include skin lesions, inflammation, ocular and nasal discharge, and deviations in posture and gait, each of which can be detected non-invasively using RGB cameras. Moreover, when integrated with thermal imaging, RGB systems can detect fever, a reliable indicator of infection, while behavioral monitoring systems can track abnormal posture, reduced activity, and altered feeding and drinking habits, which are often precursors to illness. The technology also facilitates the analysis of respiratory symptoms such as coughing or sneezing (enabling early identification of respiratory diseases, one of the most significant challenges in pig farming) and the assessment of fecal consistency and color (providing valuable insights into digestive health). Early detection of disease or poor health supports proactive interventions, reducing mortality and improving treatment outcomes. Beyond direct symptom monitoring, RGB imaging and machine vision can indirectly assess disease risk by monitoring body weight, feeding behavior, and environmental factors such as overcrowding and temperature. However, further research is needed to refine the accuracy and robustness of the algorithms in diverse farming environments. Ultimately, integrating RGB-based machine vision into existing farm management systems could provide continuous, automated surveillance, generating real-time alerts and actionable insights; these can support data-driven disease prevention strategies, reducing the need for mass medication and the development of antimicrobial resistance.
Affiliation(s)
- Md Nasim Reza
  - Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
  - Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Kyu-Ho Lee
  - Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
  - Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Eliezel Habineza
  - Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Samsuzzaman
  - Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Hyunjin Kyoung
  - Division of Animal and Dairy Science, Chungnam National University, Daejeon 34134, Korea
- Gookhwan Kim
  - National Institute of Agricultural Sciences, Rural Development Administration, Jeonju 54875, Korea
- Sun-Ok Chung
  - Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
  - Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea

5. Tu S, Ou H, Mao L, Du J, Cao Y, Chen W. Behavior Tracking and Analyses of Group-Housed Pigs Based on Improved ByteTrack. Animals (Basel) 2024; 14:3299. PMID: 39595351; PMCID: PMC11591442; DOI: 10.3390/ani14223299.
Abstract
Daily behavioral analysis of group-housed pigs provides critical insights for early warning systems addressing pig health issues and animal welfare in smart pig farming. Our main objective in this study was to develop an automated method for monitoring and analyzing the behavior of group-reared pigs in order to promptly detect health problems and improve animal welfare. We developed a method named Pig-ByteTrack that addresses target detection, Multi-Object Tracking (MOT), and behavioral time computation for each pig. The YOLOX-X detection model is employed for pig detection and behavior recognition, followed by the Pig-ByteTrack tracker for tracking behavioral information. On 1 min videos, the Pig-ByteTrack algorithm achieved Higher Order Tracking Accuracy (HOTA) of 72.9%, Multi-Object Tracking Accuracy (MOTA) of 91.7%, identification F1 score (IDF1) of 89.0%, and 41 ID switches (IDs). Compared with ByteTrack and TransTrack, Pig-ByteTrack achieved significant improvements in HOTA, IDF1, MOTA, and IDs. On 10 min videos, Pig-ByteTrack achieved 59.3% HOTA, 89.6% MOTA, 53.0% IDF1, and 198 IDs. Experiments on video datasets demonstrate the method's efficacy in behavior recognition and tracking, offering technical support for health and welfare monitoring of pig herds.
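The behavioral time computation step described above can be illustrated with a short sketch: given per-frame (pig ID, behavior) outputs from a tracker, durations follow by counting frames and dividing by the frame rate. The function name, behavior labels, and 25 fps rate are assumptions, not from the paper:

```python
from collections import defaultdict

def behavior_durations(track_records, fps=25):
    """Accumulate per-pig time (seconds) spent in each behavior from
    per-frame tracker output given as (pig_id, behavior) tuples."""
    frame_counts = defaultdict(int)
    for pig_id, behavior in track_records:
        frame_counts[(pig_id, behavior)] += 1  # one frame observed
    return {key: count / fps for key, count in frame_counts.items()}

# 50 frames of pig 1 lying and 25 frames standing, at 25 fps:
records = [(1, "lying")] * 50 + [(1, "standing")] * 25
durations = behavior_durations(records)  # {(1, 'lying'): 2.0, (1, 'standing'): 1.0}
```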
Affiliation(s)
- Shuqin Tu
  - College of Mathematics and Informatics, South China Agricultural University, Guangzhou 510642, China
- Haoxuan Ou
  - College of Mathematics and Informatics, South China Agricultural University, Guangzhou 510642, China
- Liang Mao
  - Institute of Applied Artificial Intelligence of the Guangdong-Hong Kong-Macao Greater Bay Area, Shenzhen Polytechnic University, Shenzhen 518055, China
- Jiaying Du
  - College of Mathematics and Informatics, South China Agricultural University, Guangzhou 510642, China
- Yuefei Cao
  - College of Mathematics and Informatics, South China Agricultural University, Guangzhou 510642, China
- Weidian Chen
  - College of Mathematics and Informatics, South China Agricultural University, Guangzhou 510642, China

6. Eddicks M, Feicht F, Beckjunker J, Genzow M, Alonso C, Reese S, Ritzmann M, Stadler J. Monitoring of Respiratory Disease Patterns in a Multimicrobially Infected Pig Population Using Artificial Intelligence and Aggregate Samples. Viruses 2024; 16:1575. PMID: 39459909; PMCID: PMC11512249; DOI: 10.3390/v16101575.
Abstract
A 24/7 AI sound-based coughing monitoring system was applied in combination with oral fluids (OFs)- and bioaerosol (AS)-based screening for respiratory pathogens in a conventional pig nursery. The objective was to assess the additional value of the AI in identifying disease patterns, in association with molecular diagnostics, to gain information on the etiology of respiratory distress in a multimicrobially infected pig population. Respiratory distress was measured 24/7 by the AI and compared to human observations. Screening for swine influenza A virus (swIAV), porcine reproductive and respiratory syndrome virus (PRRSV), Mycoplasma (M.) hyopneumoniae, Actinobacillus (A.) pleuropneumoniae, and porcine circovirus 2 (PCV2) was conducted using qPCR. Except for M. hyopneumoniae, all of the investigated pathogens were detected within the study period. High swIAV-RNA loads in OFs and AS were significantly associated with a decrease in respiratory health, expressed by a respiratory health score calculated by the AI. The odds of detecting PRRSV or A. pleuropneumoniae were significantly higher for OFs than for AS. qPCR examination of OFs revealed significantly lower Ct-values for swIAV and A. pleuropneumoniae than for AS. In addition to acting as an early warning system, respiratory health data generated by the AI, combined with laboratory diagnostics, can indicate the etiology of respiratory distress.
Affiliation(s)
- Matthias Eddicks
  - Clinic for Swine at the Centre for Clinical Veterinary Medicine, Ludwig-Maximilians-University München, 85764 München, Germany
- Franziska Feicht
  - Clinic for Swine at the Centre for Clinical Veterinary Medicine, Ludwig-Maximilians-University München, 85764 München, Germany
- Jochen Beckjunker
  - Boehringer Ingelheim Vetmedica GmbH, 55216 Ingelheim am Rhein, Germany
- Marika Genzow
  - Boehringer Ingelheim Vetmedica GmbH, 55216 Ingelheim am Rhein, Germany
- Carmen Alonso
  - Boehringer Ingelheim Vetmedica GmbH, 55216 Ingelheim am Rhein, Germany
- Sven Reese
  - Institute for Anatomy, Histology and Embryology, LMU Munich, 80539 Munich, Germany
- Mathias Ritzmann
  - Clinic for Swine at the Centre for Clinical Veterinary Medicine, Ludwig-Maximilians-University München, 85764 München, Germany
- Julia Stadler
  - Clinic for Swine at the Centre for Clinical Veterinary Medicine, Ludwig-Maximilians-University München, 85764 München, Germany

7. García-Vázquez FA. Artificial intelligence and porcine breeding. Anim Reprod Sci 2024; 269:107538. PMID: 38926001; DOI: 10.1016/j.anireprosci.2024.107538.
Abstract
Livestock management is evolving into a new era, characterized by the analysis of vast quantities of data (Big Data) collected from both traditional breeding methods and new technologies such as sensors, automated monitoring systems, and advanced analytics. Artificial intelligence (A-In), which refers to the capability of machines to mimic human intelligence and includes subfields such as machine learning and deep learning, is playing a pivotal role in this transformation. A wide array of A-In techniques, successfully employed in various industrial and scientific contexts, are now being integrated into mainstream livestock management practices. In the case of swine breeding, while traditional methods have yielded considerable success, the increasing amount of information requires the adoption of new technologies such as A-In to drive productivity, enhance animal welfare, and reduce environmental impact. Current findings suggest that these techniques have the potential to match or exceed the performance of traditional methods, often being more scalable in terms of efficiency and sustainability within the breeding industry. This review provides insights into the application of A-In in porcine breeding, from the perspectives of both sows (including welfare and reproductive management) and boars (including semen quality and health), and explores new approaches already being applied in other species.
Affiliation(s)
- Francisco A García-Vázquez
  - Departamento de Fisiología, Facultad de Veterinaria, Campus de Excelencia Mare Nostrum, Universidad de Murcia, Murcia 30100, Spain
  - Instituto Murciano de Investigación Biosanitaria (IMIB-Arrixaca), Murcia, Spain

8. Sonalio K, Boyen F, Devriendt B, Chantziaras I, Beuckelaere L, Biebaut E, Haesebrouck F, Santamarta I, de Oliveira LG, Maes D. Rationalizing the use of common parameters and technological tools to follow up Mycoplasma hyopneumoniae infections in pigs. Porcine Health Manag 2024; 10:31. PMID: 39180129; PMCID: PMC11342468; DOI: 10.1186/s40813-024-00381-x.
Abstract
BACKGROUND Mycoplasma (M.) hyopneumoniae is associated with respiratory disease in pigs and is the primary agent of enzootic pneumonia. Quantification of M. hyopneumoniae-related outcome parameters can be difficult, expensive, and time-consuming, in both research and field settings. In addition to well-established methods, technological tools are becoming available to monitor various aspects of relevant animal- and environment-related features, often in real time. Therefore, this study aimed to assess whether certain parameters, such as animal movement and body temperature measured using microchips (IMT), correlate with established parameters, and whether the currently used parameters can be rationalized. RESULTS The percentage of movement was significantly reduced by M. hyopneumoniae infection (p < 0.05): the M. hyopneumoniae-infected group showed a lower percentage of movement (1.9%) than the negative control group (6.9%). In contrast, macroscopic (MLCL) and microscopic (MLL) lung lesions, respiratory disease score (RDS), M. hyopneumoniae-DNA load, and anti-M. hyopneumoniae antibody levels increased significantly in the M. hyopneumoniae-infected group 28 days post-inoculation (p < 0.05). Moderate (r > 0.30) to very strong (r > 0.80) correlations were observed between the abovementioned parameters (p < 0.05), except for IMT. A significant, moderate correlation was found between IMT and rectal temperature (r = 0.49; p < 0.05). Finally, average daily weight gain and the percentage of air in the lung were not affected by M. hyopneumoniae infection (p > 0.05). CONCLUSIONS M. hyopneumoniae infection significantly reduced the movement of piglets and increased lung lesions, M. hyopneumoniae-DNA load, and anti-M. hyopneumoniae antibody levels, and good correlations were observed between most parameters, indicating a direct relationship between them. We therefore suggest that changes in movement may be a reliable indicator of M. hyopneumoniae infection in pigs, and that a selected group of parameters (RDS, MLCL, MLL, M. hyopneumoniae-DNA load, anti-M. hyopneumoniae antibody levels, and movement) is optimal for assessing M. hyopneumoniae infection under experimental conditions.
Affiliation(s)
- Karina Sonalio
  - Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium
  - Department of Veterinary Clinic and Surgery, School of Agricultural and Veterinarian Sciences, São Paulo State University (Unesp), Jaboticabal, Brazil
- Filip Boyen
  - Department of Pathobiology, Pharmacology and Zoological Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium
- Bert Devriendt
  - Department of Translational Physiology, Infectiology and Public Health, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium
- Ilias Chantziaras
  - Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium
- Lisa Beuckelaere
  - Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium
- Evelien Biebaut
  - Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium
- Freddy Haesebrouck
  - Department of Pathobiology, Pharmacology and Zoological Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium
- Luís Guilherme de Oliveira
  - Department of Veterinary Clinic and Surgery, School of Agricultural and Veterinarian Sciences, São Paulo State University (Unesp), Jaboticabal, Brazil
- Dominiek Maes
  - Department of Internal Medicine, Reproduction and Population Medicine, Faculty of Veterinary Medicine, Ghent University, Merelbeke, Belgium

9. Knoll M, Gygax L, Hillmann E. Pinpointing pigs: performance and challenges of an ultra-wideband real-time location system for tracking growing-finishing pigs under practical conditions. Animal 2024; 18:101163. PMID: 38744229; DOI: 10.1016/j.animal.2024.101163.
Abstract
Real-Time Location Systems (RTLSs) are promising precision livestock farming tools and have been employed in behavioural studies across various farm animal species. However, their application in research with fattening pigs is so far unexplored. The implementation of these systems has great potential to give insight into pigs' spatial behaviour, such as the use of functional areas and pigs' proximity to each other as indicators of social relationships. The aim of this study was therefore to validate the accuracy, precision, and data quality of the commercial TrackLab system (Noldus Information Technology BV). We conducted different measurement sets: first, we performed static measurements in 12 pens, at four different locations in each pen and three heights per location, using a single ultra-wideband (UWB) tag. We recorded unfiltered x- and y-coordinates at 1 Hz. We repeated these measurements with six tags aligned in a 2 × 3 grid with varied spacing to test interference between the tags. We also tested dynamic performance by moving the tags along the centre line of the pens. Finally, we measured data quality with 55 growing pigs in six pens, including the identification of location 'jumps' from the inside to the outside of the pen. Each pen housed ten animals fitted with a UWB tag attached to their farm ear tag. We collected data for 10 days and analysed seven 24-h periods of raw and filtered data. The mean accuracy of the RTLS measurements was 0.53 m (precision: 0.14 m) for single tags and 0.46 m (precision: 0.07 m) for grouped tags. Accuracy improved with increasing measurement height for single tags but less clearly for grouped tags (P [height single] = 0.01; P [height grouped] = 0.22). When tags were fitted to animals, 63.3% of the filtered data was lost and 21.8% of the filtered location estimates fell outside the pens. Altogether, the TrackLab system was capable of fairly accurate and precise assignment of the functional areas where individual animals were located, but was insufficient for the analysis of social relationships. Furthermore, the frequent gaps in signal transmission and the overall high data loss rates presented significant limitations. Additionally, the challenging hardware requirements for attaching sensors to the animals underline the need for further technological advances in RTLSs for application with growing-finishing pigs.
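One common convention for the static accuracy and precision figures reported in such validations (the study's exact definitions may differ) is the mean and standard deviation of the Euclidean error between the location estimates and a surveyed reference point. A minimal sketch with hypothetical coordinates:

```python
import numpy as np

def static_accuracy_precision(estimates, true_point):
    """Accuracy = mean Euclidean error of RTLS estimates (metres) to the
    surveyed reference point; precision = standard deviation of those
    errors (one common convention for static validation)."""
    diffs = np.asarray(estimates, float) - np.asarray(true_point, float)
    errors = np.linalg.norm(diffs, axis=1)  # per-sample distance error
    return errors.mean(), errors.std()

# Three hypothetical 1 Hz estimates for a tag fixed at the origin:
est = [(0.4, 0.0), (0.0, 0.6), (0.5, 0.0)]
acc, prec = static_accuracy_precision(est, (0.0, 0.0))  # acc = 0.5 m
```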
Affiliation(s)
- M Knoll
  - Humboldt-Universität zu Berlin, Department of Life Sciences, Albrecht Daniel Thaer Institute of Agricultural and Horticultural Sciences, Animal Husbandry and Ethology, Unter den Linden 6, 10099 Berlin, Germany
- L Gygax
  - Humboldt-Universität zu Berlin, Department of Life Sciences, Albrecht Daniel Thaer Institute of Agricultural and Horticultural Sciences, Animal Husbandry and Ethology, Unter den Linden 6, 10099 Berlin, Germany
- E Hillmann
  - Humboldt-Universität zu Berlin, Department of Life Sciences, Albrecht Daniel Thaer Institute of Agricultural and Horticultural Sciences, Animal Husbandry and Ethology, Unter den Linden 6, 10099 Berlin, Germany

10. Sharifuzzaman M, Mun HS, Ampode KMB, Lagua EB, Park HR, Kim YH, Hasan MK, Yang CJ. Technological Tools and Artificial Intelligence in Estrus Detection of Sows: A Comprehensive Review. Animals (Basel) 2024; 14:471. PMID: 38338113; PMCID: PMC10854728; DOI: 10.3390/ani14030471.
Abstract
In animal farming, timely estrus detection and prediction of the best moment for insemination are crucial. Traditional sow estrus detection depends on the expertise of a farm attendant, which can be inconsistent, time-consuming, and labor-intensive. Researchers have explored attempts and trials at developing and implementing technological tools to detect estrus. The objective of this review is to assess the automatic estrus recognition methods in operation for sows and to point out their strengths and weaknesses, to assist the development of new and improved detection systems. Real-time methods using body and vulvar temperature, posture recognition, and activity measurements show higher precision. Incorporating artificial intelligence with multiple estrus-related parameters is expected to enhance accuracy. Further development of new systems relies mostly on improved algorithms and accurate data. Future systems should be designed to minimize the misclassification rate, so that better detection is achieved.
Affiliation(s)
- Md Sharifuzzaman
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Department of Animal Science and Veterinary Medicine, Bangabandhu Sheikh Mujibur Rahman Science and Technology University, Gopalganj 8100, Bangladesh
- Hong-Seok Mun
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Department of Multimedia Engineering, Sunchon National University, Suncheon 57922, Republic of Korea
- Keiven Mark B. Ampode
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Department of Animal Science, College of Agriculture, Sultan Kudarat State University, Tacurong 9800, Philippines
- Eddiemar B. Lagua
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Interdisciplinary Program in IT-Bio Convergence System (BK21 Plus), Sunchon National University, Suncheon 57922, Republic of Korea
- Hae-Rang Park
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Interdisciplinary Program in IT-Bio Convergence System (BK21 Plus), Sunchon National University, Suncheon 57922, Republic of Korea
- Young-Hwa Kim
- Interdisciplinary Program in IT-Bio Convergence System (BK21 Plus), Chonnam National University, Gwangju 61186, Republic of Korea
- Md Kamrul Hasan
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Department of Poultry Science, Sylhet Agricultural University, Sylhet 3100, Bangladesh
- Chul-Ju Yang
- Animal Nutrition and Feed Science Laboratory, Department of Animal Science and Technology, Sunchon National University, Suncheon 57922, Republic of Korea
- Interdisciplinary Program in IT-Bio Convergence System (BK21 Plus), Sunchon National University, Suncheon 57922, Republic of Korea
11. Mora M, Piles M, David I, Rosa GJM. Integrating computer vision algorithms and RFID system for identification and tracking of group-housed animals: an example with pigs. J Anim Sci 2024; 102:skae174. PMID: 38908015; PMCID: PMC11245691; DOI: 10.1093/jas/skae174.
Abstract
Precision livestock farming aims to individually and automatically monitor animal activity to ensure health, well-being, and productivity. Computer vision has emerged as a promising tool for this purpose. However, accurately tracking individuals using imaging remains challenging, especially in group housing, where animals may have similar appearances. Close interaction or crowding among animals can lead to lost or swapped animal IDs, compromising tracking accuracy. To address this challenge, we implemented a framework combining a tracking-by-detection method with a radio frequency identification (RFID) system. We tested this approach using twelve pigs in a single pen as an illustrative example. Three of the pigs had distinctive natural coat markings, enabling their visual identification within the group. The remaining pigs either shared similar coat color patterns or were entirely white, making them visually indistinguishable from each other. We employed the latest version of You Only Look Once (YOLOv8) and the BoT-SORT algorithm for detection and tracking, respectively. YOLOv8 was fine-tuned on a dataset of 3,600 images to detect and classify the different pig classes, achieving a mean average precision of 99% across all classes. The fine-tuned YOLOv8 model and the BoT-SORT tracker were then applied to a 166.7-min video comprising 100,018 frames. Results showed that pigs with distinguishable coat color markings could be tracked 91% of the time on average. For pigs with similar coat color, the RFID system was used to identify individual animals when they entered the feeding station, and this RFID identification was linked to the image trajectory of each pig, both backward and forward in time. The two pigs with similar markings could be tracked for an average of 48.6 min, while the seven white pigs could be tracked for an average of 59.1 min. In all cases, the tracking time assigned to each pig matched the ground truth 90% of the time or more.
Thus, our proposed framework enabled reliable tracking of group-housed pigs for extended periods, offering a promising alternative to the independent use of image or RFID approaches alone. This approach represents a significant step forward in combining multiple devices for animal identification, tracking, and traceability, particularly when homogeneous animals are kept in groups.
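The RFID-to-track linking idea in this abstract — assign the tag ID read at the feeding station to whichever image track is at the station at that moment, then propagate the identity along the track — can be sketched as follows. This is a hypothetical illustration, not the authors' implementation; the function name, time tolerance, and data layout are assumptions:

```python
def link_rfid_to_tracks(rfid_reads, tracks, station_box):
    """Assign RFID identities to image tracks.

    rfid_reads  : list of (time, tag_id) events at the feeding station
    tracks      : dict track_id -> list of (time, (x1, y1, x2, y2)) boxes
    station_box : (x1, y1, x2, y2) region covering the feeding station
    Returns dict track_id -> tag_id.
    """
    def inside(box, region):
        # check whether the box centre falls inside the station region
        cx = (box[0] + box[2]) / 2
        cy = (box[1] + box[3]) / 2
        return region[0] <= cx <= region[2] and region[1] <= cy <= region[3]

    identity = {}
    for t, tag in rfid_reads:
        for tid, obs in tracks.items():
            # find the track observation closest in time to the RFID read
            time_, box = min(obs, key=lambda o: abs(o[0] - t))
            if abs(time_ - t) < 1.0 and inside(box, station_box):
                identity[tid] = tag
    return identity
```

Once a track is labelled this way, the identity applies to the whole trajectory, both before and after the station visit.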
Affiliation(s)
- Mónica Mora
- Institute of Agrifood Research and Technology (IRTA) – Animal Breeding and Genetics, Barcelona 08140, Spain
- Department of Animal and Dairy Sciences, University of Wisconsin-Madison, Madison, WI 53706, USA
- Miriam Piles
- Institute of Agrifood Research and Technology (IRTA) – Animal Breeding and Genetics, Barcelona 08140, Spain
- Ingrid David
- GenPhySE, Université de Toulouse, INRAE, ENVT, Castanet Tolosan 31326, France
- Guilherme J M Rosa
- Department of Animal and Dairy Sciences, University of Wisconsin-Madison, Madison, WI 53706, USA
12. Reza MN, Ali MR, Samsuzzaman, Kabir MSN, Karim MR, Ahmed S, Kyoung H, Kim G, Chung SO. Thermal imaging and computer vision technologies for the enhancement of pig husbandry: a review. J Anim Sci Technol 2024; 66:31-56. PMID: 38618025; PMCID: PMC11007457; DOI: 10.5187/jast.2024.e4.
Abstract
Pig farming, a vital industry, necessitates proactive measures for early disease detection and crush symptom monitoring to ensure optimum pig health and safety. This review explores advanced thermal sensing technologies and computer vision-based thermal imaging techniques employed for pig disease and piglet crush symptom monitoring on pig farms. Infrared thermography (IRT) is a non-invasive and efficient technology for measuring pig body temperature, providing advantages such as non-destructive, long-distance, and high-sensitivity measurements. Unlike traditional methods, IRT offers a quick and labor-saving approach to acquiring physiological data impacted by environmental temperature, crucial for understanding pig body physiology and metabolism. IRT aids in early disease detection, respiratory health monitoring, and evaluating vaccination effectiveness. Challenges include body surface emissivity variations affecting measurement accuracy. Thermal imaging and deep learning algorithms are used for pig behavior recognition, with the dorsal plane effective for stress detection. Remote health monitoring through thermal imaging, deep learning, and wearable devices facilitates non-invasive assessment of pig health, minimizing medication use. Integration of advanced sensors, thermal imaging, and deep learning shows potential for disease detection and improvement in pig farming, but challenges and ethical considerations must be addressed for successful implementation. This review summarizes the state-of-the-art technologies used in the pig farming industry, including computer vision algorithms such as object detection, image segmentation, and deep learning techniques. It also discusses the benefits and limitations of IRT technology, providing an overview of the current research field. This study provides valuable insights for researchers and farmers regarding IRT application in pig production, highlighting notable approaches and the latest research findings in this field.
Affiliation(s)
- Md Nasim Reza
- Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Md Razob Ali
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Samsuzzaman
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Md Shaha Nur Kabir
- Department of Agricultural Industrial Engineering, Faculty of Engineering, Hajee Mohammad Danesh Science and Technology University, Dinajpur 5200, Bangladesh
- Md Rejaul Karim
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Farm Machinery and Post-harvest Processing Engineering Division, Bangladesh Agricultural Research Institute, Gazipur 1701, Bangladesh
- Shahriar Ahmed
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Hyunjin Kyoung
- Division of Animal and Dairy Science, Chungnam National University, Daejeon 34134, Korea
- Gookhwan Kim
- National Institute of Agricultural Sciences, Rural Development Administration, Jeonju 54875, Korea
- Sun-Ok Chung
- Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
13. Zhou H, Chung S, Kakar JK, Kim SC, Kim H. Pig Movement Estimation by Integrating Optical Flow with a Multi-Object Tracking Model. Sensors (Basel) 2023; 23:9499. PMID: 38067875; PMCID: PMC10708576; DOI: 10.3390/s23239499.
Abstract
Pig husbandry constitutes a significant segment of livestock farming, and porcine well-being is a paramount concern because of its direct implications for pig breeding and production. An easily observable proxy for assessing pig health lies in daily movement patterns: more active pigs are usually healthier than inactive ones, so movement gives farmers a way to identify a pig's health state before it becomes sick or its condition becomes life-threatening. However, conventional means of estimating pig mobility largely rely on manual observation by farmers, which is impractical in contemporary centralized and extensive pig farming operations. In response, multi-object tracking and pig behavior methods have been adopted to monitor pig health and welfare closely. Unfortunately, these existing methods frequently fall short of providing precise, quantified measurements of movement distance, yielding only a rudimentary metric of pig health. This paper proposes a novel approach that integrates optical flow with a multi-object tracking algorithm to more accurately gauge pig movement, based on both qualitative and quantitative analyses of the shortcomings of relying solely on tracking algorithms. Optical flow records accurate movement between two consecutive frames, and the multi-object tracking algorithm provides an individual track for each pig; by combining the two, the approach can accurately estimate each pig's movement. Moreover, incorporating optical flow makes it possible to discern partial movements, such as instances where only the pig's head is in motion while the rest of its body remains stationary. The experimental results show that the proposed method is superior to using tracking results (i.e., bounding boxes) alone: movement calculated from bounding boxes is easily affected by fluctuations in box size, whereas optical flow avoids this drawback and provides more fine-grained motion information. The proposed method thus yields more accurate and comprehensive information, enhancing decision-making and management in pig farming.
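A minimal sketch of the flow-plus-tracking combination: average the dense optical-flow magnitude inside each tracked bounding box to obtain a per-pig movement value in pixels per frame. This is a generic illustration under assumed data layouts, not the paper's implementation:

```python
import numpy as np

def per_pig_movement(flow, boxes):
    """Estimate per-pig movement for one pair of consecutive frames.

    flow  : (H, W, 2) dense optical flow field (dx, dy per pixel)
    boxes : dict track_id -> (x1, y1, x2, y2) boxes from the tracker
    Returns dict track_id -> mean flow magnitude inside the box
    (pixels/frame).
    """
    mag = np.linalg.norm(flow, axis=2)          # per-pixel displacement
    return {tid: float(mag[y1:y2, x1:x2].mean())
            for tid, (x1, y1, x2, y2) in boxes.items()}
```

Because the value is driven by the flow field rather than box coordinates, it is insensitive to frame-to-frame fluctuations in box size, and a pig moving only its head still produces a non-zero (but small) score.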
Affiliation(s)
- Heng Zhou
- Department of Electronics and Information Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Seyeon Chung
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Junaid Khan Kakar
- Department of Electronics and Information Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Sang Cheol Kim
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Hyongsuk Kim
- Core Research Institute of Intelligent Robots, Jeonbuk National University, Jeonju 54896, Republic of Korea
- Department of Electronics Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
14. An L, Ren J, Yu T, Hai T, Jia Y, Liu Y. Three-dimensional surface motion capture of multiple freely moving pigs using MAMMAL. Nat Commun 2023; 14:7727. PMID: 38001106; PMCID: PMC10673844; DOI: 10.1038/s41467-023-43483-w.
Abstract
Understanding the three-dimensional social behaviors of freely moving large mammals is valuable for both agriculture and life science, yet challenging due to occlusions during close interactions. Although existing animal pose estimation methods capture keypoint trajectories, they ignore deformable surfaces, which contain geometric information essential for predicting social interactions and for dealing with occlusions. In this study, we develop a Multi-Animal Mesh Model Alignment (MAMMAL) system based on an articulated surface mesh model. The MAMMAL algorithms automatically align multi-view images to the mesh model and capture the 3D surface motions of multiple animals, performing better under severe occlusions than traditional triangulation and enabling complex social analysis. Using MAMMAL, we quantitatively analyze the locomotion, postures, animal-scene interactions, social interactions, and detailed tail motions of pigs. Furthermore, experiments on mice and Beagle dogs demonstrate the generalizability of MAMMAL across different environments and mammal species.
Affiliation(s)
- Liang An
- Department of Automation, Tsinghua University, Beijing, China
- Jilong Ren
- State Key Laboratory of Stem Cell and Reproductive Biology, Institute of Zoology, Chinese Academy of Sciences, Beijing, China
- Beijing Farm Animal Research Center, Institute of Zoology, Chinese Academy of Sciences, Beijing, China
- Tao Yu
- Department of Automation, Tsinghua University, Beijing, China
- Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing, China
- Tang Hai
- State Key Laboratory of Stem Cell and Reproductive Biology, Institute of Zoology, Chinese Academy of Sciences, Beijing, China
- Beijing Farm Animal Research Center, Institute of Zoology, Chinese Academy of Sciences, Beijing, China
- Yichang Jia
- School of Medicine, Tsinghua University, Beijing, China
- IDG/McGovern Institute for Brain Research at Tsinghua, Beijing, China
- Tsinghua Laboratory of Brain and Intelligence, Beijing, China
- Yebin Liu
- Department of Automation, Tsinghua University, Beijing, China
- Institute for Brain and Cognitive Sciences, Tsinghua University, Beijing, China
15. Voogt AM, Schrijver RS, Temürhan M, Bongers JH, Sijm DTHM. Opportunities for Regulatory Authorities to Assess Animal-Based Measures at the Slaughterhouse Using Sensor Technology and Artificial Intelligence: A Review. Animals (Basel) 2023; 13:3028. PMID: 37835634; PMCID: PMC10571985; DOI: 10.3390/ani13193028.
Abstract
Animal-based measures (ABMs) are the preferred way to assess animal welfare. However, manual scoring of ABMs is very time-consuming during meat inspection. Automatic scoring using sensor technology and artificial intelligence (AI) may offer a solution. Based on review papers, an overview was made of ABMs recorded at the slaughterhouse for poultry, pigs and cattle, and of applications of sensor technology to measure the identified ABMs. Relevant legislation and work instructions of the Dutch Regulatory Authority (RA) were also scanned for applied ABMs. Applications of sensor technology in a research setting, on farm or at the slaughterhouse were reported for 10 of the 37 ABMs identified for poultry, 4 of 32 for cattle and 13 of 41 for pigs. Several applications relate to aspects of meat inspection. However, by European law, meat inspection must be performed by an official veterinarian, although there are exceptions for the post mortem inspection of poultry. The examples in this study show that there are opportunities for the RA to use sensor technology to support inspection and to gain more insight into animal welfare risks. The lack of external validation for multiple commercially available systems remains a point of attention.
Affiliation(s)
- Annika M. Voogt
- Office for Risk Assessment & Research (BuRO), Netherlands Food and Consumer Product Safety Authority (NVWA), P.O. Box 43006, 3540 AA Utrecht, The Netherlands
16. Wang S, Jiang H, Qiao Y, Jiang S. A Method for Obtaining 3D Point Cloud Data by Combining 2D Image Segmentation and Depth Information of Pigs. Animals (Basel) 2023; 13:2472. PMID: 37570282; PMCID: PMC10417003; DOI: 10.3390/ani13152472.
Abstract
This paper proposes a method for automatic pig detection and segmentation using RGB-D data for precision livestock farming. The proposed method combines an enhanced YOLOv5s model with the Res2Net bottleneck structure, improving fine-grained feature extraction and ultimately enhancing the precision of pig detection and segmentation in 2D images. Additionally, the method enables simpler and more efficient acquisition of 3D point cloud data of pigs by combining the pig mask obtained from 2D detection and segmentation with depth information. To evaluate the effectiveness of the proposed method, two datasets were constructed. The first consists of 5,400 images captured in various pig pens under diverse lighting conditions, while the second was obtained from the UK. The experimental results demonstrated that the improved YOLOv5s_Res2Net achieved a mAP@0.5:0.95 of 89.6% for pig detection and 84.8% for segmentation on our dataset, and 93.4% and 89.4%, respectively, on the Edinburgh pig behaviour dataset. This approach provides valuable insights for improving pig management, conducting welfare assessments, and estimating weight accurately.
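The mask-plus-depth step can be illustrated with standard pinhole back-projection: every depth pixel inside the segmentation mask is lifted to a 3D point in camera coordinates. This is a generic sketch, not the paper's code; the function name, intrinsics, and array layout are assumptions:

```python
import numpy as np

def mask_depth_to_pointcloud(mask, depth, fx, fy, cx, cy):
    """Back-project masked depth pixels to 3D camera coordinates.

    mask  : (H, W) boolean segmentation mask of one pig
    depth : (H, W) depth map in metres
    fx, fy, cx, cy : pinhole camera intrinsics
    Returns an (N, 3) array of XYZ points.
    """
    v, u = np.nonzero(mask)          # pixel rows (v) and columns (u) in the mask
    z = depth[v, u]
    valid = z > 0                    # drop pixels with missing depth readings
    u, v, z = u[valid], v[valid], z[valid]
    x = (u - cx) * z / fx            # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)
```

The resulting per-animal point cloud is what downstream tasks such as weight estimation would consume.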
Affiliation(s)
- Shunli Wang
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Honghua Jiang
- College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
- Yongliang Qiao
- Australian Institute for Machine Learning (AIML), The University of Adelaide, Adelaide, SA 5005, Australia
- Shuzhen Jiang
- Key Laboratory of Efficient Utilisation of Non-Grain Feed Resources (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Department of Animal Science and Technology, Shandong Agricultural University, Tai’an 271018, China
17. Kühnemund A, Götz S, Recke G. Automatic Detection of Group Recumbency in Pigs via AI-Supported Camera Systems. Animals (Basel) 2023; 13:2205. PMID: 37444003; DOI: 10.3390/ani13132205.
Abstract
The resting behavior of rearing pigs provides information about their perception of the current temperature. A pen that is too cold or too warm can impair the well-being of the animals as well as their physical development. Previous studies that automatically recorded animal behavior often relied on body posture; however, this method is error-prone because hidden animals (so-called false positives) strongly influence the results. In the present study, a method was developed for the automated identification of periods in which all pigs are lying down, using video recordings from an AI-supported camera system. We used the velocity data (measured by the camera) of pigs in the pen to identify these periods. To determine the threshold value for images with the highest probability of containing only recumbent pigs, a dataset with 9,634 images and velocity values was used. The resulting velocity threshold (0.0006020622 m/s) yielded an accuracy of 94.1%. Analysis of the testing dataset confirmed that recumbent pigs were correctly identified based on velocity values derived from video recordings. This represents an advance over the previous manual detection method toward fully automated detection.
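The decision rule itself is simple: a frame is classed as group recumbency when every tracked pig moves slower than the velocity threshold. The threshold value below is the one reported in the abstract; the function around it is a hypothetical sketch, not the authors' code:

```python
# Velocity threshold (m/s) reported in the study for images most likely
# to contain only recumbent pigs.
THRESHOLD_M_PER_S = 0.0006020622

def all_recumbent(velocities, threshold=THRESHOLD_M_PER_S):
    """Return True if every pig's speed (m/s) in this frame is below
    the threshold, i.e. the whole group is likely lying down."""
    return all(v < threshold for v in velocities)
```

Stretches of consecutive True frames then mark the lying periods the study set out to identify.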
Affiliation(s)
- Alexander Kühnemund
- Hochschule Osnabrück, Fachbereich Landwirtschaftliche Betriebswirtschaftslehre, Oldenburger Landstraße 24, 49090 Osnabrück, Germany
- Sven Götz
- VetVise GmbH, Bünteweg 2, 30559 Hannover, Germany
- Guido Recke
- Hochschule Osnabrück, Fachbereich Landwirtschaftliche Betriebswirtschaftslehre, Oldenburger Landstraße 24, 49090 Osnabrück, Germany
18. Jiang B, Tang W, Cui L, Deng X. Precision Livestock Farming Research: A Global Scientometric Review. Animals (Basel) 2023; 13:2096. PMID: 37443894; DOI: 10.3390/ani13132096.
Abstract
Precision livestock farming (PLF) utilises information technology to continuously monitor and manage livestock in real time, which can improve individual animal health, welfare and productivity and reduce the environmental impact of animal husbandry, contributing to the economic, social and environmental sustainability of livestock farming. PLF has emerged as a pivotal area of multidisciplinary interest. To clarify the knowledge evolution and shifting hotspots of PLF research, this study analyzed the main characteristics, research cores and hot topics of PLF research via CiteSpace, based on data from the Web of Science database from 1973 to 2023. The results point to a significant increase in studies on PLF, with countries that have advanced livestock farming systems in Europe and America publishing frequently and collaborating closely across borders. Universities in various countries have led the research, with Daniel Berckmans serving as the academic leader. Research primarily spans animal science, veterinary science, computer science, agricultural engineering, and environmental science. Current research hotspots center on precision dairy and cattle technology, intelligent systems, and animal behavior; deep learning, accelerometers, automatic milking systems, lameness, estrus detection, and electronic identification are the main research directions, with deep learning and machine learning representing the forefront of current research. Hot topics include social science in PLF, the environmental impact of PLF, information technology in PLF, and animal welfare in PLF. Future research in PLF should prioritize inter-institutional and inter-scholar communication and cooperation, integrate multidisciplinary and multimethod research approaches, and utilize deep learning and machine learning. Furthermore, social science issues should be given due attention, and the integration of intelligent technologies in animal management should be strengthened, with a focus on animal welfare and the environmental impact of animal husbandry, to promote sustainable development.
Affiliation(s)
- Bing Jiang
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
- Development Research Center of Modern Agriculture, Northeast Agricultural University, Harbin 150030, China
- Wenjie Tang
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
- Lihang Cui
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
- Xiaoshang Deng
- College of Economics and Management, Northeast Agricultural University, Harbin 150030, China
19. Double-Camera Fusion System for Animal-Position Awareness in Farming Pens. Foods 2022; 12:84. PMID: 36613301; PMCID: PMC9818956; DOI: 10.3390/foods12010084.
Abstract
In livestock breeding, continuous and objective monitoring of animals is manually unfeasible because of the large scale of breeding and expensive labour. Computer vision technology can generate accurate, real-time information on individual animals or animal groups from video surveillance. However, frequent occlusion between animals and changes in appearance features caused by varying lighting conditions make single-camera systems less attractive. We propose a double-camera system and image registration algorithms that spatially fuse the information from different viewpoints to solve these issues. This paper presents a deformable learning-based registration framework in which the input image pairs are first linearly pre-registered. An unsupervised convolutional neural network is then employed to fit the mapping from one view to the other, using a large number of unlabelled samples for training. The learned parameters are subsequently used in a semi-supervised network and fine-tuned with a small number of manually annotated landmarks. The actual pixel displacement error is introduced as a complement to an image similarity measure. The performance of the proposed fine-tuned method is evaluated on real farming datasets and shows significantly lower registration errors than commonly used feature-based and intensity-based methods. The approach also reduces the registration time for an unseen image pair to less than 0.5 s. The proposed method provides a high-quality reference processing step for improving subsequent tasks such as multi-object tracking and behaviour recognition of animals for further analysis.
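The pixel displacement error mentioned in the abstract is, in its usual form, the mean Euclidean distance between predicted and ground-truth landmark positions after registration. A minimal sketch of that metric (the function name and array layout are assumptions, not the paper's code):

```python
import numpy as np

def mean_displacement_error(pred_pts, true_pts):
    """Mean Euclidean distance (in pixels) between registered landmark
    predictions and their ground-truth positions. Both inputs: (N, 2)."""
    return float(np.linalg.norm(pred_pts - true_pts, axis=1).mean())
```

Used alongside an intensity-based similarity measure, this landmark-level error directly reflects geometric registration quality rather than appearance agreement.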
20. Song H, Zhao B, Hu J, Sun H, Zhou Z. Research on Improved DenseNets Pig Cough Sound Recognition Model Based on SENets. Electronics 2022; 11:3562. DOI: 10.3390/electronics11213562.
Abstract
To monitor the health status of pigs in real time during breeding and to provide early warning of swine respiratory diseases, an SE-DenseNet-121 recognition model was established to recognize pig cough sounds. The 13-dimensional MFCC, ΔMFCC and Δ²MFCC features were concatenated to obtain six groups of parameters reflecting the static, dynamic and mixed characteristics of the pig sound signals, and the DenseNet-121 recognition model was used to compare the performance of the six parameter sets to find the optimal one. The DenseNet-121 model was then improved with the SENets attention module to enhance its ability to extract effective features from the pig sound signals. The results showed that the optimal parameter set was the 26-dimensional MFCC + ΔMFCC, and that the SE-DenseNet-121 model achieved accuracy, recall, precision and F1 scores of 93.8%, 98.6%, 97% and 97.8%, respectively, for pig cough sounds. These results can be used to develop a pig cough sound recognition system for early warning of pig respiratory diseases.
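The feature construction step — concatenating 13-dimensional MFCCs with their first-order deltas to form the 26-dimensional set the study found optimal — can be sketched as below. Note the delta here is a simple first-order difference, a stand-in for the standard regression-based delta (e.g. as computed by `librosa.feature.delta`); the function name and frame layout are assumptions:

```python
import numpy as np

def add_deltas(mfcc):
    """Concatenate 13-dim MFCC frames with first-order deltas (26-dim).

    mfcc : (n_frames, 13) array of MFCC feature vectors
    Returns (n_frames, 26) array of MFCC + ΔMFCC features.
    """
    # frame-to-frame difference; prepend the first frame so the delta of
    # frame 0 is zero and the frame count is preserved
    delta = np.diff(mfcc, axis=0, prepend=mfcc[:1])
    return np.concatenate([mfcc, delta], axis=1)
```

The same pattern extended with a second difference would yield the 39-dimensional MFCC + ΔMFCC + Δ²MFCC variant compared in the study.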
Affiliation(s)
- Hang Song
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Bin Zhao
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Jun Hu
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Haonan Sun
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
- Zheng Zhou
- College of Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, China
21. Grandin T. Practical Application of the Five Domains Animal Welfare Framework for Supply Food Animal Chain Managers. Animals (Basel) 2022; 12:2831. PMID: 36290216; PMCID: PMC9597751; DOI: 10.3390/ani12202831.
Abstract
The author has worked as a consultant with global commercial supply managers for over 20 years. The focus of this commentary is the practical application of the Five Domains Model in commercial systems. Commercial buyers of meat need simple, easy-to-use guidelines, and they have to use auditors who can be trained in a workshop lasting only a few days. Auditing of slaughter plants by major buyers has resulted in great improvements. Supply chain managers need clear guidance on conditions that would result in a failed audit. Animal-based outcome measures that can be easily assessed should be emphasized in commercial systems. Examples of these key animal welfare indicators are: the percentage of animals stunned effectively with a single application of the stunner, the percentage of lame animals, foot pad lesions on poultry, and body condition scoring. A farm that supplies a buyer must also comply with housing specifications: the farm either has the specified housing or it does not, and it will be removed from the approved supplier list if its housing does not comply. These easy-to-assess indicators can be evaluated within the four domains of nutrition, environment, health and behavioral interactions. The Five Domains Framework can also be used in a program for continuous improvement of animal welfare.
Affiliation(s)
- Temple Grandin
- Department of Animal Science, Colorado State University, Fort Collins, CO 80526, USA