1. McLean CJ, Fisher DN. Measuring the effect of RFID and marker recognition tags on cockroach (Blattodea: Blaberidae) behavior using AI-aided tracking. Journal of Insect Science 2025; 25:5. PMID: 39861965; PMCID: PMC11760971; DOI: 10.1093/jisesa/ieaf002. Received: 07/01/2024; Revised: 11/27/2024; Accepted: 01/11/2025; Indexed: 01/27/2025.
Abstract
Radio frequency identification (RFID) technology and marker recognition algorithms can offer an efficient and non-intrusive means of tracking animal positions. As such, they have become important tools for invertebrate behavioral research. Both approaches require fixing a tag or marker to the study organism, so it is useful to quantify the effects such procedures have on behavior before proceeding with further research. However, studies frequently do not report performing such tests. Here, we demonstrate a time-efficient and accessible method for quantifying the impact of tagging on individual movement using open-source automated video tracking software. We tested the effect of RFID tags and tags suitable for marker recognition algorithms on the movement of Argentinian wood roaches (Blaptica dubia, Blattodea: Blaberidae) by filming tagged and untagged roaches in laboratory conditions. We employed DeepLabCut on the resultant videos to track cockroach movement and extract measures of behavioral traits. We found no statistically significant differences between RFID-tagged and untagged groups in average speed over the trial period, the number of unique zones explored, or the number of discrete walks. However, groups tagged with labels for marker recognition had significantly higher values for all 3 metrics. We therefore support the use of RFID tags to monitor the behavior of B. dubia but note that the effect of using labels suitable for marker recognition to identify individuals should be taken into consideration when measuring B. dubia behavior. We hope that this study can provide an accessible and viable roadmap for further work investigating the effects of tagging on insect behavior.
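The three movement metrics named in this abstract (average speed, unique zones explored, discrete walks) can all be derived from the tracked coordinates that software such as DeepLabCut outputs. The sketch below shows one plausible way to compute them from an N×2 trajectory; the arena size, grid resolution, and walking-speed threshold here are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

def movement_metrics(xy, fps=25.0, arena_size=300.0, grid=4, walk_speed=5.0):
    """Summarize a tracked trajectory (N x 2 array of x, y positions in mm).

    Returns average speed (mm/s), the number of unique grid zones visited,
    and the number of discrete walks (bouts above a speed threshold).
    """
    xy = np.asarray(xy, dtype=float)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # mm moved per frame
    speed = step * fps                                  # mm per second
    avg_speed = float(speed.mean())

    # Divide the arena into a grid x grid lattice and count visited cells.
    cells = np.clip((xy / arena_size * grid).astype(int), 0, grid - 1)
    zones = len({(cx, cy) for cx, cy in map(tuple, cells)})

    # A "walk" is a contiguous run of frames faster than walk_speed.
    moving = speed > walk_speed
    walks = int(np.sum(moving[1:] & ~moving[:-1]) + moving[0])
    return avg_speed, zones, walks
```

Comparing these per-individual summaries between tagged and untagged groups is then a standard two-sample test on each metric.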
Affiliation(s)
- Callum J McLean
- School of Biological Sciences, University of Aberdeen, King’s College, Aberdeen, UK
- David N Fisher
- School of Biological Sciences, University of Aberdeen, King’s College, Aberdeen, UK
2. Reza MN, Lee KH, Habineza E, Samsuzzaman, Kyoung H, Choi YK, Kim G, Chung SO. RGB-based machine vision for enhanced pig disease symptoms monitoring and health management: a review. Journal of Animal Science and Technology 2025; 67:17-42. PMID: 39974778; PMCID: PMC11833201; DOI: 10.5187/jast.2024.e111. Received: 10/16/2024; Revised: 11/15/2024; Accepted: 11/18/2024; Indexed: 02/21/2025.
Abstract
The growing demands of sustainable, efficient, and welfare-conscious pig husbandry have necessitated the adoption of advanced technologies. Among these, RGB imaging and machine vision technology may offer a promising solution for early disease detection and proactive disease management in advanced pig husbandry practices. This review explores innovative applications for monitoring disease symptoms by assessing features that directly or indirectly indicate disease risk, as well as for tracking body weight and overall health. Machine vision and image processing algorithms enable the real-time detection of subtle changes in pig appearance and behavior that may signify potential health issues. Key indicators include skin lesions, inflammation, ocular and nasal discharge, and deviations in posture and gait, each of which can be detected non-invasively using RGB cameras. Moreover, when integrated with thermal imaging, RGB systems can detect fever, a reliable indicator of infection, while behavioral monitoring systems can track abnormal posture, reduced activity, and altered feeding and drinking habits, which are often precursors to illness. The technology also facilitates the analysis of respiratory symptoms, such as coughing or sneezing (enabling early identification of respiratory diseases, one of the most significant challenges in pig farming), and the assessment of fecal consistency and color (providing valuable insights into digestive health). Early detection of disease or poor health supports proactive interventions, reducing mortality and improving treatment outcomes. Beyond direct symptom monitoring, RGB imaging and machine vision can indirectly assess disease risk by monitoring body weight, feeding behavior, and environmental factors such as overcrowding and temperature. However, further research is needed to refine the accuracy and robustness of algorithms in diverse farming environments.
Ultimately, integrating RGB-based machine vision into existing farm management systems could provide continuous, automated surveillance, generating real-time alerts and actionable insights; these can support data-driven disease prevention strategies, reducing the need for mass medication and the development of antimicrobial resistance.
Affiliation(s)
- Md Nasim Reza
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Kyu-Ho Lee
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Eliezel Habineza
- Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Samsuzzaman
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Hyunjin Kyoung
- Division of Animal and Dairy Science, Chungnam National University, Daejeon 34134, Korea
- Gookhwan Kim
- National Institute of Agricultural Sciences, Rural Development Administration, Jeonju 54875, Korea
- Sun-Ok Chung
- Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Korea
- Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Korea
3. Fang C, Wu Z, Zheng H, Yang J, Ma C, Zhang T. MCP: Multi-Chicken Pose Estimation Based on Transfer Learning. Animals (Basel) 2024; 14:1774. PMID: 38929393; PMCID: PMC11200378; DOI: 10.3390/ani14121774. Received: 04/18/2024; Revised: 06/07/2024; Accepted: 06/10/2024; Indexed: 06/28/2024.
Abstract
Poultry managers can better understand the state of their birds through behavior analysis. As one of the key steps in behavior analysis, the accurate estimation of poultry posture is the focus of this research. This study mainly analyzes a top-down pose estimation method for multiple chickens. We propose "multi-chicken pose" (MCP), a pose estimation system for multiple chickens based on deep learning. First, we find the position of each chicken in the image via a chicken detector; then, the pose of each chicken is estimated using a pose estimation network based on transfer learning. On this basis, the pixel error (PE), root mean square error (RMSE), and image quantity distribution of keypoints are analyzed according to the improved chicken keypoint similarity (CKS). The experimental results show that the algorithm achieved a mean average precision (mAP) of 0.652, a mean average recall (mAR) of 0.742, a percentage of correct keypoints (PCK) of 0.789, and an RMSE of 17.30 pixels. To the best of our knowledge, this is the first time that transfer learning has been used for the pose estimation of multiple chickens. The method can provide a new path for future poultry behavior analysis.
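The abstract's "chicken keypoint similarity" (CKS) is presented as an improved keypoint-similarity measure; its exact formulation is not given here, but it adapts the OKS-style similarity used in keypoint evaluation, which can be sketched as follows. The per-keypoint falloff constants `kappa` below are assumed values for illustration, not the paper's.

```python
import numpy as np

def keypoint_similarity(pred, gt, scale, kappa):
    """OKS-style keypoint similarity between a predicted and a ground-truth pose.

    pred, gt : (K, 2) arrays of predicted / ground-truth keypoint coordinates
    scale    : object scale (e.g. sqrt of the bird's bounding-box area)
    kappa    : (K,) per-keypoint falloff constants
    """
    d2 = np.sum((np.asarray(pred, float) - np.asarray(gt, float)) ** 2, axis=1)
    # Each keypoint contributes a Gaussian of its squared distance, normalized
    # by object scale and its own falloff constant; average over keypoints.
    return float(np.mean(np.exp(-d2 / (2 * scale**2 * np.asarray(kappa) ** 2))))
```

Thresholding this score (e.g. at 0.5) is how matches between predicted and annotated poses are typically counted when computing mAP and mAR.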
Affiliation(s)
- Cheng Fang
- College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- Zhenlong Wu
- College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- Haikun Zheng
- College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- Jikang Yang
- College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- Chuang Ma
- College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- Tiemin Zhang
- College of Engineering, South China Agricultural University, 483 Wushan Road, Guangzhou 510642, China
- National Engineering Research Center for Breeding Swine Industry, Guangzhou 510642, China
- Guangdong Laboratory for Lingnan Modern Agriculture, Guangzhou 510642, China
4. Petso T, Jamisola RS. Wildlife conservation using drones and artificial intelligence in Africa. Sci Robot 2023; 8:eadm7008. PMID: 38117868; DOI: 10.1126/scirobotics.adm7008. Received: 11/03/2023; Accepted: 11/22/2023; Indexed: 12/22/2023.
Abstract
The use of drones and artificial intelligence may offer more reliable methods of counting populations and monitoring wildlife.
Affiliation(s)
- Tinao Petso
- Department of Mechanical, Energy, and Industrial Engineering, Botswana International University of Science and Technology, Private Bag 16, Palapye, Botswana
- Rodrigo S Jamisola
- Department of Mechanical, Energy, and Industrial Engineering, Botswana International University of Science and Technology, Private Bag 16, Palapye, Botswana
5. Nasiri A, Amirivojdan A, Zhao Y, Gan H. Estimating the Feeding Time of Individual Broilers via Convolutional Neural Network and Image Processing. Animals (Basel) 2023; 13:2428. PMID: 37570235; PMCID: PMC10416955; DOI: 10.3390/ani13152428. Received: 06/30/2023; Revised: 07/21/2023; Accepted: 07/25/2023; Indexed: 08/13/2023.
Abstract
Feeding behavior is one of the critical welfare indicators of broilers. Hence, understanding feeding behavior can provide important information regarding the usage of poultry resources and insights into farm management. Monitoring poultry behaviors is typically performed by visual human observation. Despite the successful applications of this method, its implementation in large poultry farms takes considerable time and effort. Thus, there is a need for automated approaches to overcome these challenges. Consequently, this study aimed to estimate the feeding time of individual broilers with a convolutional neural network-based model. To achieve this goal, 1500 images collected from a poultry farm were labeled for training the You Only Look Once (YOLO) model to detect the broilers' heads, and a Euclidean distance-based tracking algorithm was developed to track the detected heads. The developed algorithm estimated a broiler's feeding time by recognizing whether its head was inside the feeder. Three 1-min labeled videos were used to evaluate the proposed algorithm's performance. The algorithm achieved an overall accuracy of 87.3% in estimating each broiler's feeding time per visit to the feeding pan. In addition, the obtained results suggest that the proposed algorithm can be used as a real-time tool in poultry farms.
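The pipeline described here (detect heads, match each detection to an existing track by Euclidean distance, and accumulate time while a head is inside the feeder) can be sketched per frame as below. This is a minimal illustration of the idea, not the authors' implementation; the `max_dist` gate and the feeder-membership callable are assumptions.

```python
import numpy as np

def update_tracks(tracks, detections, frame_dt, in_feeder, max_dist=50.0):
    """One frame of a minimal Euclidean-distance tracker with feeding-time tally.

    tracks     : dict id -> {"pos": (x, y), "feed_s": float}, mutated in place
    detections : list of (x, y) head centroids from the detector (e.g. YOLO)
    frame_dt   : seconds per frame (1 / fps)
    in_feeder  : callable (x, y) -> bool, True if the point lies in the feeding pan
    """
    unmatched = list(detections)
    for tid, tr in tracks.items():
        if not unmatched:
            break
        # Greedily match each existing track to its nearest remaining detection.
        dists = [np.hypot(x - tr["pos"][0], y - tr["pos"][1]) for x, y in unmatched]
        j = int(np.argmin(dists))
        if dists[j] <= max_dist:
            tr["pos"] = unmatched.pop(j)
            if in_feeder(*tr["pos"]):
                tr["feed_s"] += frame_dt  # head inside feeder -> count as feeding
    # Any leftover detection starts a new track.
    for pos in unmatched:
        tracks[max(tracks, default=-1) + 1] = {"pos": pos, "feed_s": 0.0}
    return tracks
```

Calling this once per video frame accumulates, for each tracked bird, the seconds its head spent inside the feeder, which is the per-visit feeding time the paper evaluates.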
Affiliation(s)
- Amin Nasiri
- Department of Biosystems Engineering and Soil Science, University of Tennessee, Knoxville, TN 37996, USA
- Ahmad Amirivojdan
- Department of Biosystems Engineering and Soil Science, University of Tennessee, Knoxville, TN 37996, USA
- Yang Zhao
- Department of Animal Science, University of Tennessee, Knoxville, TN 37996, USA
- Hao Gan
- Department of Biosystems Engineering and Soil Science, University of Tennessee, Knoxville, TN 37996, USA