1
Evaluating the Activity of Pigs with Radio-Frequency Identification and Virtual Walking Distances. Animals (Basel) 2023; 13:3112. PMID: 37835719; PMCID: PMC10571748; DOI: 10.3390/ani13193112.
Abstract
Monitoring the activity of animals can help with assessing their health status. We monitored the walking activity of fattening pigs using a UHF-RFID system. Four hundred fattening pigs with UHF-RFID ear tags were recorded by RFID antennas at the troughs, playing devices and drinkers during the fattening period. A minimum walking distance, or virtual walking distance, was determined for each pig per day by calculating the distances between two consecutive reading areas. This automatically calculated value was used as an activity measure and showed differences not only between pigs but also between fattening stages. The longer the fattening period lasted, the less walking activity was detected. The virtual walking distance averaged 281 m in the first fattening stage and about 141 m in the last fattening stage in a restricted environment. The findings are in line with other studies of the walking distances of fattening pigs, but the automated approach is far less labor-intensive and time-consuming than direct observation.
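The virtual walking distance described above is essentially a sum of straight-line distances between consecutive antenna reading areas. A minimal sketch in Python; the reading-area coordinates are hypothetical, since the abstract does not give the actual pen layout:

```python
from math import dist

# Hypothetical positions (in metres) of the RFID reading areas in the pen;
# the real antenna layout in the study is not given in the abstract.
AREA_POSITIONS = {
    "trough": (0.0, 0.0),
    "drinker": (8.0, 6.0),
    "toy": (3.0, 10.0),
}

def virtual_walking_distance(reads):
    """Sum the straight-line distances between consecutive reading areas
    visited by one pig on one day -- a lower bound on the true distance."""
    total = 0.0
    for prev, curr in zip(reads, reads[1:]):
        if prev != curr:  # staying at the same antenna adds no distance
            total += dist(AREA_POSITIONS[prev], AREA_POSITIONS[curr])
    return total
```

Because only antenna visits are observed, the measure is a minimum: any wandering between two reads is collapsed to a straight line.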
2
Comparison of automated video tracking systems in the open field test: ANY-Maze versus EthoVision XT. J Neurosci Methods 2023; 397:109940. PMID: 37544382; DOI: 10.1016/j.jneumeth.2023.109940.
Abstract
BACKGROUND ANY-Maze and EthoVision XT are two commonly used automated animal tracking systems employed to produce reliable and consistent results in behavioural paradigms. Data obtained with the two tracking systems have shown differences, particularly when laboratory lighting conditions and the contrast of the mouse's coat colour against the arena background vary, in both the water maze and the tunnel maze. METHOD In this study, two fluorescent lighting conditions (58 and 295 lux), local to our laboratory, and mouse lines with different coat colours (C57BL/6J, black; CD1, agouti; C3H/HeN, white) were used to compare the reproducibility of measures from the two tracking systems (ANY-Maze versus EthoVision) in the open field test. RESULTS Differences between systems depended on the contrast between coat and background colours. Surprisingly, black animals presented the greatest differences in read-outs between tracking systems, regardless of lighting conditions. Data from the two video observation tools differed mainly in exploration-related parameters (distance travelled) and less in more static proxies (time in the thigmotaxis zone). Overall, EthoVision XT returned higher values than ANY-Maze for most parameters analysed. More inconsistencies in recording and analysis can be expected from other video recording systems. CONCLUSION Data analysis software provides an additional source of variation that needs consideration when reproducibility in behavioural neuroscience is required.
3
Automated Quantification of the Behaviour of Beef Cattle Exposed to Heat Load Conditions. Animals (Basel) 2023; 13:1125. PMID: 36978665; PMCID: PMC10044595; DOI: 10.3390/ani13061125.
Abstract
Cattle change their behaviour in response to hot temperatures, including by engaging in stepping that indicates agitation. The automated recording of these responses would be helpful for the timely diagnosis of animals experiencing heat loading. Behavioural responses of beef cattle to hot environmental conditions were studied to investigate whether behavioural responses could be assessed by video-digitised image analysis. Open-source automated behavioural quantification software was used to record pixel changes in 13 beef cattle video-recorded in a climate-controlled chamber during exposure to a simulated typical heat event in Queensland, Australia. Increased digitised movement was observed during the heat event, which was related to stepping and grooming/scratching activities in standing animals. The 13 cattle were exposed in two cohorts: the first group (n = 6) was fed a standard finisher diet based on a high percentage of cereal grains, and the second group (n = 7) received a substituted diet in which 8% of the grains were replaced by lucerne hay. The second group displayed a smaller increase in digitised movements on exposure to heat than the first, suggesting less discomfort under hot conditions. The results suggest that cattle exposed to heat display increased movement that can be detected automatically by video digitisation software, and that replacing some cereal grain with forage in the diet of feedlot cattle may reduce the measured activity responses to heat.
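The pixel-change measure used by such behavioural quantification software can be illustrated with simple frame differencing. A minimal sketch, assuming grayscale frames supplied as 2D lists of 0-255 intensities (the specific software and threshold used in the study are not named in the abstract):

```python
def movement_index(prev_frame, curr_frame, threshold=25):
    """Fraction of pixels whose grayscale intensity changed by more than
    `threshold` between two consecutive frames (frames as 2D lists, 0-255)."""
    changed = total = 0
    for row_p, row_c in zip(prev_frame, curr_frame):
        for p, c in zip(row_p, row_c):
            total += 1
            if abs(c - p) > threshold:
                changed += 1
    return changed / total
```

Summing this index over a recording gives an overall digitised-movement score; real implementations do the same arithmetic on image arrays for speed.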
4
A Deep Learning Model for Detecting Cage-Free Hens on the Litter Floor. Animals (Basel) 2022; 12:1983. PMID: 35953972; PMCID: PMC9367364; DOI: 10.3390/ani12151983.
Abstract
Real-time and automatic detection of chickens (e.g., laying hens and broilers) is the cornerstone of precision poultry farming based on image recognition. However, such identification becomes more challenging under cage-free conditions compared to caged hens. In this study, we developed a deep learning model (YOLOv5x-hens) based on YOLOv5, an advanced convolutional neural network (CNN), to monitor hens' behaviors in cage-free facilities. More than 1000 images were used to train the model and an additional 200 images were adopted to test it. One-way ANOVA and Tukey HSD analyses were conducted using JMP software (JMP Pro 16 for Mac, SAS Institute, Cary, North Carolina) to determine whether there were significant differences between the predicted and actual numbers of hens under various conditions (i.e., age, light intensity, and observational angle). The difference was considered significant at p < 0.05. Our results show that the evaluation metrics (Precision, Recall, F1 and mAP@0.5) of the YOLOv5x-hens model were 0.96, 0.96, 0.96 and 0.95, respectively, in detecting hens on the litter floor. The newly developed YOLOv5x-hens showed stable performance in detecting birds under different lighting intensities, angles, and ages over 8 weeks (i.e., birds were 8-16 weeks old). For instance, the model was tested with 95% accuracy after the birds were 8 weeks old. However, younger chicks, such as one-week-old birds, were harder to track (e.g., only 25% accuracy) due to interference from equipment such as feeders, drink lines, and perches. According to further data analysis, the model performed efficiently in real-time detection with an overall accuracy of more than 95%, which is the key step for tracking individual birds for evaluation of production and welfare. However, the current version of the model has some limitations. Detection errors arose from highly overlapping birds, uneven light intensity, and images occluded by equipment (i.e., drinking lines and feeders). Future research is needed to address those issues for higher detection accuracy. The current study established a novel CNN deep learning model in research cage-free facilities for the detection of hens, which provides a technical basis for developing a machine vision system for tracking individual birds to evaluate the animals' behaviors and welfare status in commercial cage-free houses.
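The Precision, Recall and F1 values reported for the detector follow directly from true-positive, false-positive and false-negative counts at a fixed matching threshold. A minimal sketch of that arithmetic (the counts below are illustrative, not the study's data):

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall and F1 from true/false positives and false
    negatives, as used to evaluate an object detector at a fixed
    IoU/confidence threshold."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

mAP@0.5 additionally averages precision over recall levels at an IoU threshold of 0.5, which requires the ranked detections rather than summary counts.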
5
Estimation of Resilience Parameters Following LPS Injection Based on Activity Measured With Computer Vision. Frontiers in Animal Science 2022. DOI: 10.3389/fanim.2022.883940.
Abstract
Resilience can be defined as an animal's ability to successfully adapt to a challenge, typically displayed by a quick return to initial metabolic or activity levels and behaviors. Pigs have distinct diurnal activity patterns, and deviations from these patterns could potentially be used to quantify resilience. However, human observation of activity is labor intensive and not feasible in practice on a large scale. In this study, we show the use of a computer vision tracking algorithm to quantify resilience based on individual activity patterns following a lipopolysaccharide (LPS) challenge, which induced a sickness response. We followed 121 individual pigs, housed in barren or enriched housing systems (as previous work suggests an impact of housing on resilience), for eight days. The enriched housing consisted of delayed weaning in a group farrowing system, extra space compared with the barren pens, and environmental enrichment. Enriched housed pigs were more active pre-injection of LPS, especially during peak activity times, than barren housed pigs (49.4 ± 9.9 vs. 39.1 ± 5.0 meters/hour). Four pigs per pen received an LPS injection and two pigs a saline injection. LPS-injected animals were more likely to show a dip in activity than controls (86% vs. 17%). Duration and area under the curve (AUC) of the dip were not affected by housing. However, pigs with the same AUC could have a long, shallow dip or a steep, short dip. Therefore, the AUC:duration ratio was calculated, and enriched housed pigs had a higher AUC:duration ratio than barren housed pigs (9244.1 ± 5429.8 vs. 5919.6 ± 4566.1). Enriched housed pigs might therefore have a different strategy to cope with an LPS sickness challenge. However, more research is needed on this strategy and on the use of activity to quantify resilience and its relationship to physiological parameters.
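The dip metrics described here, including the AUC:duration ratio that distinguishes long shallow dips from short steep ones, can be sketched as follows. The baseline and sampling interval are illustrative assumptions, not the study's actual values:

```python
def dip_metrics(activity, baseline, dt=1.0):
    """Quantify a post-challenge dip in activity relative to a pre-injection
    baseline: time spent below baseline, trapezoidal area between baseline
    and activity (AUC), and the AUC:duration ratio."""
    deficits = [max(baseline - a, 0.0) for a in activity]
    duration = sum(dt for d in deficits if d > 0)
    auc = sum((d0 + d1) / 2 * dt for d0, d1 in zip(deficits, deficits[1:]))
    if duration == 0:
        return 0.0, 0.0, 0.0          # no dip detected
    return duration, auc, auc / duration
```

Two pigs with identical AUC but different durations then get different ratios: the steeper, shorter dip yields the higher AUC:duration value.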
6
Biometric Physiological Responses from Dairy Cows Measured by Visible Remote Sensing Are Good Predictors of Milk Productivity and Quality through Artificial Intelligence. Sensors 2021; 21:6844. PMID: 34696059; PMCID: PMC8541531; DOI: 10.3390/s21206844.
Abstract
New and emerging technologies, especially those based on non-invasive video and thermal infrared cameras, can be readily tested on robotic milking facilities. In this research, we present non-invasive computer vision methods to estimate cows' heart rate, respiration rate, and abrupt movements captured using RGB cameras, together with machine learning modelling to predict eye temperature, milk production and quality. RGB and infrared thermal videos (IRTV) were acquired from cows using a robotic milking facility. Results from 102 different cows with replicates (n = 150) showed that an artificial neural network (ANN) model using only inputs from RGB cameras achieved high accuracy (R = 0.96) in predicting eye temperature (°C, using IRTV as ground truth), daily milk productivity (kg milk day-1), cow milk productivity (kg milk cow-1), milk fat (%) and milk protein (%), with no signs of overfitting. The ANN model was then deployed on an independent set of 132 cow samples obtained on different days, which also rendered high accuracy, similar to that of model development (R = 0.93). This model can be easily applied using affordable RGB camera systems to obtain all the proposed targets, including eye temperature, which can also be used to model animal welfare and biotic/abiotic stress. Furthermore, these models can be readily deployed in conventional dairy farms.
7
Welfare Health and Productivity in Commercial Pig Herds. Animals (Basel) 2021; 11:1176. PMID: 33924224; PMCID: PMC8074599; DOI: 10.3390/ani11041176.
Abstract
In recent years, there have been very dynamic changes in both pork production and pig breeding technology around the world. The general trend of increasing the efficiency of pig production with reduced employment requires optimisation and a comprehensive approach to herd management. One of the most important elements on the way to achieving this goal is maintaining animal welfare and health. The health of the pigs on the farm is also a key aspect of production economics. The need to maintain a high health status of pig herds, by reducing the incidence of various disease entities and the need for antimicrobial substances, is part of a broadly understood high-potential herd management strategy. Thanks to the use of sensors (cameras, microphones, accelerometers, or radio-frequency identification transponders), images, sounds, movements, and vital signs of animals are combined through algorithms and analysed for non-invasive monitoring, which allows for early detection of disease, improves welfare, and increases the productivity of breeding. Automated, innovative early warning systems based on continuous monitoring of specific physiological (e.g., body temperature) and behavioural parameters can provide an alternative to direct diagnosis and visual assessment by the veterinarian or the herd keeper.
8
Activity Time Budgets-A Potential Tool to Monitor Equine Welfare? Animals (Basel) 2021; 11:850. PMID: 33802908; PMCID: PMC8002676; DOI: 10.3390/ani11030850.
Abstract
Simple Summary Horses’ behavior is a good indicator of their welfare status. However, its complexity requires objective, quantifiable, and unambiguous evidence-based assessment criteria. As healthy, stress-free horses exhibit a highly repetitive daily routine, horses’ time budget (amount of time in a 24 h period spent on specific activities) can assist in equine welfare assessment. A systematic review of the literature yielded 12 papers that assessed equine time budgets for eating, resting and movement for a minimum of 24 continuous hours. A total of 144 horses (1–27 years old), 59 semi-feral and 85 domesticated horses, are included in this review. The reported 24 h time budgets for eating ranged from 10% to 66.6%, for resting from 8.1% to 66%, for lying from 2.7% to 27.3%, and for movement from 0.015% to 19.1%. The large variance in time budgets between studies can largely be attributed to differences in age and environmental conditions. Management interventions (free access to food, increased space, decreased population density) in domesticated horses yielded time budgets similar to semi-feral horses. The data support the importance of environmental conditions for horses’ well-being and the ability of time budgets to assist in monitoring horses’ welfare. Abstract Horses’ behavior can provide valuable insight into their subjective state and is thus a good indicator of welfare. However, its complexity requires objective, quantifiable, and unambiguous evidence-based assessment criteria. As healthy, stress-free horses exhibit a highly repetitive daily routine, temporal quantification of their behavioral activities (time budget analysis) can assist in equine welfare assessment. Therefore, the present systematic review aimed to provide an up-to-date analysis of equine time budget studies. 
A review of the literature yielded 12 papers that fulfilled the inclusion criteria: assessment of equine time budgets for eating, resting and movement for a minimum of 24 continuous hours. A total of 144 horses (1-27 years old), 59 semi-feral and 85 domesticated horses, are included in this review. The 24 h time budgets for foraging or eating (10-66.6%), resting (8.1-66%), lying (2.7-27.3%), and locomotion (0.015-19.1%) showed large variance between studies, which can largely be attributed to differences in age and environmental conditions. Management interventions in domesticated horses (ad libitum access to food, increased space, decreased population density) resulted in time budgets similar to those of their (semi-)feral conspecifics, emphasizing the importance of environmental conditions and the ability of time budgets to assist in monitoring horses' welfare.
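A time budget of the kind reviewed here is simply the share of a 24 h observation window spent in each behaviour. A minimal sketch with hypothetical bout data (the behaviour categories and durations below are illustrative only):

```python
def time_budget(bouts, total_minutes=24 * 60):
    """Percentage of a 24 h observation spent in each behaviour, from a
    list of (behaviour, duration_in_minutes) bouts."""
    minutes = {}
    for behaviour, duration in bouts:
        minutes[behaviour] = minutes.get(behaviour, 0) + duration
    return {b: 100 * m / total_minutes for b, m in minutes.items()}
```

Comparing such percentage profiles against the ranges reported for (semi-)feral horses is what allows time budgets to serve as a welfare screen.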
10
Development and Validation of an Automated Video Tracking Model for Stabled Horses. Animals (Basel) 2020; 10:2258. PMID: 33266297; PMCID: PMC7760072; DOI: 10.3390/ani10122258.
Abstract
Simple Summary Although there are some methods to detect pain in horses, they are practically challenging because of observer bias and the time they require. In recent years, rapidly developing automated tracking methods have proven that computer-based behaviour monitoring is reliable in many animal species. In this study we therefore aimed to investigate an automated video tracking model for horses in a clinical context. The findings will help to develop automated detection of daily activity, to meet the ultimate objective of objectively assessing the pain and wellbeing of horses. An initial analysis of the obtained data offers the opportunity to construct an algorithm to automatically track behaviour patterns of horses. Abstract Changes in behaviour are often caused by painful conditions. Therefore, the assessment of behaviour is important for the recognition of pain, but also for the assessment of quality of life. Automated detection of the movement and behaviour of a horse in its box stall would represent a significant advancement. In this study, videos of horses in an animal hospital were recorded using an action camera in time-lapse mode. These videos were processed using the convolutional neural network Loopy for automated prediction of body parts. Development of the model was carried out in several steps, including annotation of the key points, training of the network to generate the model, and checking the model for its accuracy. The key points nose, withers and tail are detected with a sensitivity of more than 80% and an error rate between 2 and 7%, depending on the key point. By means of a case study, the possibility of further analysis with the acquired data was investigated. The results will significantly improve pain recognition in horses and will help to develop algorithms for the automated recognition of behaviour using machine learning.
11
Image Analysis and Computer Vision Applications in Animal Sciences: An Overview. Front Vet Sci 2020; 7:551269. PMID: 33195522; PMCID: PMC7609414; DOI: 10.3389/fvets.2020.551269.
Abstract
Computer Vision, Digital Image Processing, and Digital Image Analysis can be viewed as an amalgam of terms that are very often used to describe similar processes. Most of this confusion arises because these are interconnected fields that emerged with the development of digital image acquisition. Thus, there is a need to understand the connection between these fields, how a digital image is formed, and the differences between the many sensors available, each best suited to different applications. Since the advent of the charge-coupled device marked the birth of digital imaging, the field has advanced quite fast. Sensors have evolved from grayscale to color with increasingly higher resolution and better performance. Many other sensors have also appeared, such as infrared cameras, stereo imaging, time-of-flight sensors, and satellite and hyperspectral imaging. There are also images generated by other signals, such as sound (ultrasound scanners and sonars) and radiation (standard x-ray and computed tomography), which are widely used to produce medical images. In animal and veterinary sciences, these sensors have been used in many applications, mostly under experimental conditions, with only some applications yet developed on commercial farms. Such applications can range from the assessment of beef cut composition to live animal identification, tracking, behavior monitoring, and measurement of phenotypes of interest, such as body weight, condition score, and gait. Computer vision systems (CVS) have the potential to be used in precision livestock farming and high-throughput phenotyping applications. We believe that the constant measurement of traits through CVS can reduce management costs and optimize decision-making in livestock operations, in addition to opening new possibilities in selective breeding. Applications of CVS are currently a growing research area and there are already commercial products available.
However, there are still challenges that demand research for the successful development of autonomous solutions capable of delivering critical information. This review intends to present significant developments that have been made in CVS applications in animal and veterinary sciences and to highlight areas in which further research is still needed before full deployment of CVS in breeding programs and commercial farms.
12
An open-source video tracking system for mouse locomotor activity analysis. BMC Res Notes 2020; 13:48. PMID: 32000855; PMCID: PMC6990588; DOI: 10.1186/s13104-020-4916-6.
Abstract
Objective The ability to accurately and efficiently quantify mouse locomotor activity is essential for evaluating therapeutic efficacy and phenotyping genetically modified mice, in particular in research on neuromuscular diseases. Our objective was to develop a program for video tracking of mice and locomotion analysis. Results Here we describe a MATLAB program for video tracking of mice and locomotion analysis. The system is composed of a webcam, an open field, and a computer with MATLAB installed. Animal behavior is recorded by the webcam and the video is then analyzed for mouse position on each frame by customized MATLAB code. The system has been tested for analyzing two or more mice simultaneously placed in individual chambers. The cumulative moving distance, velocity and thigmotaxis (percentage of time spent in the outer periphery of the arena, commonly used as an index of anxiety) within a test period can be readily obtained. This system can be easily implemented in any laboratory as an in vivo locomotion assay to assess the neuromuscular abnormalities of genetically modified animals and the impact of therapeutic interventions.
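The thigmotaxis measure described above can be sketched from a centroid trajectory. This Python version (the paper's implementation is in MATLAB) assumes a square arena with the origin at one corner; the arena size and wall margin below are hypothetical:

```python
def thigmotaxis_percent(trajectory, arena_size, margin):
    """Percentage of frames in which the animal's centroid lies within
    `margin` of a wall of a square arena of side `arena_size`
    (origin at one corner)."""
    outer = sum(
        1 for x, y in trajectory
        if x < margin or y < margin
        or x > arena_size - margin or y > arena_size - margin
    )
    return 100 * outer / len(trajectory)
```

Cumulative distance and velocity follow from the same trajectory by summing inter-frame displacements and dividing by elapsed time.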
13
Abstract
Pigs are often selected for large animal models, including for neuroscience and behavioral research, because their anatomy and biochemistry are similar to those of humans. However, behavioral assessments, in combination with objective long-term monitoring, are difficult. In this study, we introduced an automated video tracking system, previously used in rodent studies, for use with pig models. Locomotor behaviors (total distance, number of zone transitions, and velocity) were evaluated and their changes were validated by different 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) administration methods and dosing regimens. Three minipigs (23-29 kg) received subcutaneous or intravenous MPTP, either 1 or 3 times per week. Immediately after MPTP injection, the minipigs remained in a corner and exhibited a reduced trajectory. In addition, the total distance travelled, number of zone transitions, and velocity were greatly reduced at every MPTP administration in all the minipigs, accompanied by increased resting time. However, the MPTP-induced symptoms were reversed when MPTP administration was terminated. In conclusion, this automated video tracking system was able to monitor long-term locomotor activity and differentiate detailed alterations in large animals. It has the advantages of ease of use, higher resolution, less effort, and more precise tracking. Additionally, as our method can be applied in the animals' home pen, no habituation is needed.
14
Recording behaviour of indoor-housed farm animals automatically using machine vision technology: A systematic review. PLoS One 2019; 14:e0226669. PMID: 31869364; PMCID: PMC6927615; DOI: 10.1371/journal.pone.0226669.
Abstract
Large-scale phenotyping of animal behaviour traits is time consuming and has led to increased demand for technologies that can automate these procedures. Automated tracking of animals has been successful in controlled laboratory settings, but recording from animals in large groups in highly variable farm settings presents challenges. The aim of this review is to provide a systematic overview of the advances that have occurred in automated, high-throughput image detection of farm animal behavioural traits with welfare and production implications. Peer-reviewed publications written in English were reviewed systematically following Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. After identification, screening, and assessment for eligibility, 108 publications met these specifications and were included for qualitative synthesis. Data collected from the papers included camera specifications, housing conditions, group size, algorithm details, procedures, and results. Most studies utilized standard digital colour video cameras for data collection, with increasing use of 3D cameras in papers published after 2013. Papers including pigs (across production stages) were the most common (n = 63). The most common behaviours recorded included activity level, area occupancy, aggression, gait scores, resource use, and posture. Our review revealed many overlaps in methods applied to analysing behaviour, and most studies started from scratch instead of building upon previous work. Training and validation sample sizes were generally small (mean ± s.d. groups = 3.8 ± 5.8) and data collection and testing took place in relatively controlled environments. To advance our ability to automatically phenotype behaviour, future research should build upon existing knowledge and validate technology under commercial settings, and publications should explicitly describe recording conditions in detail to allow studies to be reproduced.
15
Computer vision and remote sensing to assess physiological responses of cattle to pre-slaughter stress, and its impact on beef quality: A review. Meat Sci 2019; 156:11-22. PMID: 31121361; DOI: 10.1016/j.meatsci.2019.05.007.
Abstract
Pre-slaughter stress is well known to affect the meat quality of beef carcasses, and methods have been developed to assess this stress. However, more practical and less invasive methods are required to assess the response of cattle to pre-slaughter stressors, which will potentially also assist with the prediction of beef quality. This review outlines the importance of pre-slaughter stress as well as existing and emerging technologies for its quantification. The review includes: (i) indicators of meat quality and how they are affected by pre-slaughter stress in cattle; (ii) contact techniques that have been commonly used to measure stress indicators in animals; (iii) remotely sensed imagery techniques recently used as non-invasive methods to monitor physiological and behavioural parameters; and (iv) potential implementation of remotely sensed imagery data to perform contactless assessment of physiological measurements, which could be related to pre-slaughter stress as well as to indicators of beef quality. Relevance to industry, conclusions and recommendations for research are included.
16
A novel automated system to acquire biometric and morphological measurements and predict body weight of pigs via 3D computer vision. J Anim Sci 2019; 97:496-508. PMID: 30371785; DOI: 10.1093/jas/sky418.
Abstract
Computer vision applications in livestock are appealing since they enable measurement of traits of interest without the need to interact directly with the animals, allowing multiple measurements with minimal animal stress. In the current study, an automated computer vision system was devised and evaluated for extraction of features of interest, such as body measurements and shape descriptors, and for prediction of body weight in pigs. Of the 655 pigs for which data were collected, 580 had more than 5 frames recorded and were used for development of the predictive models. Cross-validation of the models developed with data from nursery and finishing pigs gave an R2 ranging from 0.86 (randomly selected image) to 0.94 (median of images truncated at the third quartile), whereas with the dataset without nursery pigs, the R2 estimates ranged from 0.70 (randomly selected image) to 0.84 (median of images truncated at the third quartile). However, overall the mean absolute error was lower for the models fitted without data on nursery animals. Of the body measures extracted from the images, body volume, area, and length were the most informative for prediction of body weight. The inclusion of the remaining body measurements (width and heights) or shape descriptors promoted significant improvement of the predictions, whereas the further inclusion of sex and line effects was not significant.
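Predicting body weight from an image-derived feature such as body volume reduces, in its simplest form, to a least-squares fit. A deliberately simplified single-feature sketch (the study fits richer models over several features and aggregates multiple frames per pig; the data below are illustrative):

```python
def fit_simple_lr(x, y):
    """Ordinary least squares for weight = a + b * feature, where the
    feature could be, e.g., body volume extracted from a depth image.
    Returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

def predict(intercept, slope, feature):
    """Predicted body weight for one image-derived feature value."""
    return intercept + slope * feature
```

Aggregating per-pig predictions over frames (e.g., the median of images truncated at the third quartile, as in the study) stabilises the estimate against segmentation noise.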
|
17
|
Validation of locomotion scoring as a new and inexpensive technique to record circadian locomotor activity in large mammals. Heliyon 2018; 4:e00980. [PMID: 30582033 PMCID: PMC6287081 DOI: 10.1016/j.heliyon.2018.e00980] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/06/2018] [Revised: 11/19/2018] [Accepted: 11/23/2018] [Indexed: 11/29/2022] Open
Abstract
Background: The locomotor activity (LA) rhythm, widely studied in rodents, has not been fully investigated in large mammals, owing to the high cost and fragility of the required devices. Alternatively, a locomotion scoring method (SM), consisting of the attribution of a score to various levels of activity, could be a consistent way to assess the circadian LA rhythm in such species. New method: To test this, an SM with scores ranging from 0 to 5 was developed and used in two large domestic mammals, the camel and the goat. Scoring at one-minute intervals was performed by two evaluators through visual screening and monitoring of infrared camera video recordings. Results: The SM provides a clear daily LA rhythm, which was validated against an automated device, the Actiwatch-Mini. The obtained curves and actograms were highly similar to those acquired from the Actiwatch-Mini, and there were no statistical differences in period or acrophase. The period was exactly 24.0 h, and the acrophases occurred at 12h05 ± 00h03 and 12h14 ± 00h07 for the camel and at 13h13 ± 00h09 and 12h57 ± 00h09 for the goat, using the SM and the Actiwatch-Mini respectively. Comparison with existing methods: Compared with the automatic system, the SM is inexpensive and has the advantage of describing all types of movement performed. Conclusions: The newly developed SM is highly reliable and sufficiently accurate to conveniently assess the LA rhythm and specific behaviors in large mammals. This opens new perspectives for studying chronobiology in animal models of desert, tropical and equatorial zones.
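The period and acrophase reported above can be estimated from scored activity with a standard cosinor fit, i.e., fitting a 24-h cosine to the time series by linear least squares. This is a generic sketch of that technique, not the authors' exact analysis:

```python
import numpy as np

def cosinor(times_h, activity, period_h=24.0):
    """Fit activity ~ M + A*cos(2*pi*(t - phi)/P) by linear least squares.

    times_h:  sample times in hours.
    activity: activity score at each sample time.
    Returns (mesor, amplitude, acrophase_h), where acrophase_h is the
    clock time (hours) at which fitted activity peaks.
    """
    omega = 2 * np.pi / period_h
    # Linearise the cosine: M + b*cos(wt) + c*sin(wt)
    X = np.column_stack([np.ones_like(times_h),
                         np.cos(omega * times_h),
                         np.sin(omega * times_h)])
    (m, b, c), *_ = np.linalg.lstsq(X, activity, rcond=None)
    amplitude = np.hypot(b, c)
    acrophase = (np.arctan2(c, b) / omega) % period_h
    return m, amplitude, acrophase
```

With noisy scores the fit is no longer exact, but the same three parameters are returned and can be compared between scoring methods.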
|
18
|
Abstract
Background: Rodent models are frequently used in pain research and continue to provide valuable data on the mechanisms driving pain, although they are criticized for limited translational ability to human conditions. We have previously suggested pigs as a model for the development of drugs for neuropathic pain. In this study, we investigated the spontaneous behavior of pigs following peripheral neuritis trauma (PNT)-induced neuropathic pain. Methods: A computerized monitoring system was used to evaluate changes in the open field test, in addition to a composite behavior scoring system. Results: The data suggest that the PNT operation did not affect the animals' ability to walk, as the total distance walked by PNT animals was not significantly different from that walked by sham-operated animals. However, PNT animals showed a significant change in walking pattern, an effect unrelated to the time the animals spent in the open field. Following treatment with different drugs (morphine, buprenorphine, or gabapentin), the walking pattern of the animals in the open field changed in a drug-specific manner, and the detailed behavior score likewise revealed drug-specific changes. Pharmacokinetic analysis of drug concentrations in blood and cerebrospinal fluid correlated with the behavioral analysis. Conclusion: The data suggest that the open field test, together with the detailed behavior score applied in this model, is a powerful tool for assessing the spontaneous behavior of pigs following PNT-induced neuropathic pain.
|
19
|
A system for the real-time tracking of operant behavior as an application of 3D camera. J Exp Anal Behav 2018; 110:522-544. [PMID: 30230551 DOI: 10.1002/jeab.471] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2018] [Revised: 08/03/2018] [Accepted: 08/13/2018] [Indexed: 12/17/2022]
Abstract
The capacity of 3D cameras to measure many different aspects of behavior (e.g., velocity, pattern, and posture) could contribute to the understanding of behavior. The present article describes a system for the real-time tracking of operant behavior, which is applicable to other domains of behavioral science as well. Methods for real-time 3D tracking of animal behavior are described, along with sample C++ programs. A demonstration using one zebrafish as a subject indicated that the system successfully tracked the 3D motion of the fish. Moreover, the acquisition of a target response (i.e., approach to a corner of the aquarium) was demonstrated by arranging a reinforcement contingency at the corner, in the absence of a traditional, salient operandum. The system offers the capacity to characterize ongoing behavior in learning tasks across a range of species more completely than the performance of discrete operant responses alone. It is also capable of tracking multiple individuals simultaneously, so it is possible both to study social interactions and to arrange contingencies for engaging in social behavior. Other possible applications of 3D cameras are discussed.
|
20
|
D-Track-A semi-automatic 3D video-tracking technique to analyse movements and routines of aquatic animals with application to captive dolphins. PLoS One 2018; 13:e0201614. [PMID: 30114265 PMCID: PMC6095516 DOI: 10.1371/journal.pone.0201614] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/27/2018] [Accepted: 07/18/2018] [Indexed: 11/19/2022] Open
Abstract
Scoring and tracking animal movements manually is a time-consuming and subjective process, susceptible to errors due to fatigue. Automated and semi-automated video-based tracking methods have been developed to overcome the errors and biases of manual analyses. Here we present D-Track, an open-source, semi-automatic tracking system able to quantify the 3D trajectories of dolphins in the water, non-invasively. The software produces a three-dimensional reconstruction of the pool and tracks the animal at different depths using standard cameras. D-Track allows determination of the animals' spatial preferences, their speed and its variations, and the identification of behavioural routines. We tested the system with two captive dolphins during different periods of the day. Both animals spent around 85% of the time at the surface of the Deep Area of their pool (5 m depth). Both dolphins showed a stable average speed throughout 31 sessions, with slow speeds predominant (maximum 1.7 m/s). Circular swimming was highly variable, with significant differences in the size and duration of the "circles" between animals, within animals, and across sessions. D-Track is a novel tool for studying the behaviour of aquatic animals and represents a convenient and inexpensive solution for laboratories and marine parks to monitor the preferences and routines of their animals.
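Summary statistics such as average speed and the fraction of time near the surface follow directly from the sampled 3D positions. A minimal sketch under the assumption of a fixed frame rate (function names and the depth threshold are illustrative, not D-Track's API):

```python
import numpy as np

def speed_profile(xyz, fps):
    """Instantaneous speed (m/s) from an (n, 3) array of 3D positions
    sampled at fps frames per second."""
    steps = np.diff(xyz, axis=0)                 # per-frame displacement vectors
    return np.linalg.norm(steps, axis=1) * fps   # distance per frame -> per second

def fraction_near_surface(depth_m, threshold_m=1.0):
    """Fraction of samples shallower than threshold_m (depth positive downward)."""
    depth_m = np.asarray(depth_m)
    return np.mean(depth_m < threshold_m)
```

From `speed_profile` one can report session means and check speed stability across sessions, as the study does for its 31 sessions.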
|
21
|
Voluntary locomotor activity promotes myogenic growth potential in domestic pigs. Sci Rep 2018; 8:2533. [PMID: 29416067 PMCID: PMC5803246 DOI: 10.1038/s41598-018-20652-2] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/06/2017] [Accepted: 01/23/2018] [Indexed: 01/25/2023] Open
Abstract
Self-determined physical activity is an essential behavioural need and can vary considerably between individuals of a species. Although locomotion is considered a prerequisite for adequate skeletal muscle function, domestic pigs are usually reared under limited space allowance. The aim of our study was to investigate whether differing voluntary locomotor activity leads to altered muscle structure, biochemistry, and mRNA expression of selected genes involved in myogenesis and skeletal muscle metabolism. Using a video tracking method, we assigned pigs to three categories according to their total distance walked over five observed time points: long, medium, and short distance. The microstructural and biochemical parameters of the M. semitendinosus were unaffected by distance category. However, we found distance-dependent differences in the mRNA expression of genes encoding growth factors (IGF2, EGF, MSTN) and transcription factors (MRF4, MYOD). In particular, the IGF2/MSTN ratio appears to be a sensitive molecular-level indicator of the locomotor activity of individuals. Our results indicate that the myogenic growth potential of pigs under standard rearing conditions is triggered by their voluntary locomotor activity, but the distances covered are insufficient to induce adaptive changes at the tissue level.
|
22
|
Automated tracking to measure behavioural changes in pigs for health and welfare monitoring. Sci Rep 2017; 7:17582. [PMID: 29242594 PMCID: PMC5730557 DOI: 10.1038/s41598-017-17451-6] [Citation(s) in RCA: 63] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2017] [Accepted: 11/27/2017] [Indexed: 11/13/2022] Open
Abstract
Since animals express their internal state through behaviour, changes in behaviour may be used to detect early signs of problems, such as in animal health. Continuous observation of livestock by farm staff, to the degree required to detect behavioural changes relevant for early intervention, is impractical in a commercial setting. We developed an automated monitoring system that tracks pig movement with depth video cameras and automatically measures standing, feeding, drinking, and locomotor activities from 3D trajectories. Predictions of standing, feeding, and drinking were validated, but not locomotor activities. An artificial, disruptive challenge (the introduction of a novel object) was used to cause reproducible behavioural changes, enabling development of a system to detect those changes automatically. Validation of the automated monitoring system with the controlled challenge study provides a reproducible framework for further development of robust early-warning systems for pigs. The automated system is practical in commercial settings because it provides continuous monitoring of multiple behaviours, with metrics of behaviour that may be considered more intuitive and have diagnostic validity. The method has the potential to transform how livestock are monitored, directly impact their health and welfare, and address issues in livestock farming such as antimicrobial use.
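One common way to derive feeding and drinking measures from tracked trajectories, as described above, is to accumulate the time an animal spends near fixed pen locations. A hypothetical zone-based sketch (zone names, radius, and coordinates are assumptions for illustration, not the authors' implementation):

```python
import numpy as np

def time_in_zones(xy, zones, radius, fps):
    """Seconds spent within `radius` of each named zone centre.

    xy:    (n, 2) tracked floor positions, one row per video frame.
    zones: dict mapping zone name -> (x, y) centre,
           e.g. {"feeder": (0.2, 1.5), "drinker": (3.0, 0.4)}.
    radius: proximity threshold in the same units as xy.
    fps:    video frame rate, to convert frame counts to seconds.
    """
    xy = np.asarray(xy, dtype=float)
    out = {}
    for name, centre in zones.items():
        d = np.linalg.norm(xy - np.asarray(centre, dtype=float), axis=1)
        out[name] = np.count_nonzero(d <= radius) / fps
    return out
```

Daily totals from such a function could then be monitored for deviations, in line with the early-warning framing of the study.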
|
23
|
|
24
|
|
25
|
Simple command-line open-source software to analyse behavioural observation video recordings. ACTA ZOOL ACAD SCI H 2017. [DOI: 10.17109/azh.63.1.137.2017] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
|
26
|
Early detection of health and welfare compromises through automated detection of behavioural changes in pigs. Vet J 2016; 217:43-51. [PMID: 27810210 PMCID: PMC5110645 DOI: 10.1016/j.tvjl.2016.09.005] [Citation(s) in RCA: 125] [Impact Index Per Article: 15.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/23/2015] [Revised: 09/20/2016] [Accepted: 09/23/2016] [Indexed: 11/23/2022]
Abstract
Early detection of health and welfare compromises in commercial piggeries is essential for timely intervention to enhance treatment success, reduce impact on welfare, and promote sustainable pig production. Behavioural changes that precede or accompany subclinical and clinical signs may have diagnostic value. Often referred to as sickness behaviour, this encompasses changes in feeding, drinking, and elimination behaviours, social behaviours, and locomotion and posture. Such subtle changes in behaviour are not easy to quantify and require lengthy observation input by staff, which is impractical on a commercial scale. Automated early-warning systems may provide an alternative by objectively measuring behaviour with sensors to automatically monitor and detect behavioural changes. This paper aims to: (1) review the quantifiable changes in behaviour with potential diagnostic value; (2) identify available sensors for measuring those behaviours; and (3) describe the progress towards automating monitoring and detection, which may allow such behavioural changes to be captured, measured, and interpreted and thus lead to automation in commercial, housed piggeries. Multiple sensor modalities are available for automatic measurement and monitoring of behaviour, although these still require humans to actively identify behavioural changes. This has been demonstrated for the detection of small deviations in diurnal drinking, deviations in feeding behaviour, monitoring of coughs and vocalisation, and monitoring of thermal comfort, but not social behaviour. Current progress is in the early stages of developing fully automated detection systems that do not require humans to identify behavioural changes, e.g., through automated alerts sent to mobile phones. Challenges for achieving automation are multifaceted, and trade-offs must be considered between health, welfare, and costs, between analysis of individuals and groups, and between generic and compromise-specific behaviours.
|
27
|
Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals' Behaviour. PLoS One 2016; 11:e0158748. [PMID: 27415814 PMCID: PMC4944961 DOI: 10.1371/journal.pone.0158748] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/07/2016] [Accepted: 06/21/2016] [Indexed: 11/18/2022] Open
Abstract
Mankind directly controls the environments and lifestyles of several domestic species, for purposes ranging from production and research to conservation and companionship, and these environments and lifestyles may not offer the animals the best quality of life. Behaviour is a direct reflection of how an animal is coping with its environment, and behavioural indicators are thus among the preferred parameters for assessing welfare. However, behavioural recording (usually from video) can be very time-consuming, and the accuracy and reliability of the output depend on the experience and background of the observers. The rapid advance of video technology and computer image processing provides the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data through structured machine-learning frameworks. Depth information acquired through 3D features, body-part detection, and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel, and patterns of movement that can later be labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, deviations from normal behaviour over time or between individuals can be assessed. The software's accuracy in detecting the dogs' behaviour was checked through a validation process. An automatic behaviour recognition system, independent of human subjectivity, could add scientific knowledge on animals' quality of life in confinement as well as save time and resources. This 3D framework was designed to be invariant to the dog's shape and size and could be extended to farm, laboratory, and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non-human animal behaviour science. Further improvements and validation are needed, and future applications and limitations are discussed.
|
28
|
Abstract
The study of developmental neurotoxicity (DNT) continues to be an important component of safety evaluation of candidate therapeutic agents and of industrial and environmental chemicals. Developmental neurotoxicity is considered to be an adverse change in the central and/or peripheral nervous system during development of an organism and has been primarily evaluated by studying functional outcomes, such as changes in behavior, neuropathology, neurochemistry, and/or neurophysiology. Neurobehavioral evaluations are a component of a wide range of toxicology studies in laboratory animal models, whereas neurochemistry and neurophysiology are less commonly employed. Although the primary focus of this article is on neurobehavioral evaluation in pre- and postnatal development and juvenile toxicology studies used in pharmaceutical development, concepts may also apply to adult nonclinical safety studies and Environmental Protection Agency/chemical assessments. This article summarizes the proceedings of a symposium held during the 2015 American College of Toxicology annual meeting and includes a discussion of the current status of DNT testing as well as potential issues and recommendations. Topics include the regulatory context for DNT testing; study design and interpretation; behavioral test selection, including a comparison of core learning and memory systems; age of testing; repeated testing of the same animals; use of alternative animal models; impact of findings; and extrapolation of animal results to humans. Integration of the regulatory experience and scientific concepts presented during this symposium, as well as from subsequent discussion and input, provides a synopsis of the current state of DNT testing in safety assessment, as well as a potential roadmap for future advancement.
|
29
|
MouseMove: an open source program for semi-automated analysis of movement and cognitive testing in rodents. Sci Rep 2015; 5:16171. [PMID: 26530459 PMCID: PMC4632026 DOI: 10.1038/srep16171] [Citation(s) in RCA: 48] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2015] [Accepted: 09/30/2015] [Indexed: 12/27/2022] Open
Abstract
The Open Field (OF) test is one of the most commonly used assays for assessing exploratory behaviour and generalised locomotor activity in rodents. Nevertheless, the vast majority of researchers still rely on costly commercial systems for recording and analysing OF test results. Our aim was therefore to design a freely available program for analysing the OF test, with an accompanying protocol that is minimally invasive, rapid, and unbiased, and requires no specialised equipment or training. Similar to commercial systems, we show that our software, called MouseMove, accurately quantifies numerous parameters of movement, including travel distance, speed, turning, and curvature. To assess its utility, we used MouseMove to quantify unilateral locomotor deficits in mice following the filament-induced middle cerebral artery occlusion model of acute ischemic stroke. MouseMove can also monitor movement within defined regions of interest and is therefore suitable for analysing the Novel Object Recognition test and other field-related cognitive tests. To the best of our knowledge, MouseMove is the first open-source software capable of providing qualitative and quantitative information on mouse locomotion in a semi-automated, high-throughput fashion, and hence represents a sound alternative to commercial movement-analysis systems.
|
30
|
Illumination and Reflectance Estimation with its Application in Foreground Detection. SENSORS 2015; 15:21407-26. [PMID: 26343675 PMCID: PMC4610519 DOI: 10.3390/s150921407] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/14/2015] [Revised: 08/18/2015] [Accepted: 08/24/2015] [Indexed: 11/29/2022]
Abstract
In this paper, we introduce a novel approach to estimating the illumination and reflectance of an image, based on the illumination-reflectance model and wavelet theory. We use a homomorphic wavelet filter (HWF) and define a wavelet quotient image (WQI) model based on the dyadic wavelet transform; the illumination and reflectance components are estimated using the HWF and the WQI, respectively. Building on these estimates, we develop an algorithm to segment sows in grayscale video recordings captured in complex farrowing pens. Experimental results demonstrate that the algorithm can detect domestic animals in complex environments involving lighting changes, motionless foreground objects, and dynamic backgrounds.
|
31
|
Application of 3-D imaging sensor for tracking minipigs in the open field test. J Neurosci Methods 2014; 235:219-25. [DOI: 10.1016/j.jneumeth.2014.07.012] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/09/2014] [Revised: 07/16/2014] [Accepted: 07/16/2014] [Indexed: 11/26/2022]
|
32
|
Automated video analysis of pig activity at pen level highly correlates to human observations of behavioural activities. Livest Sci 2014. [DOI: 10.1016/j.livsci.2013.12.011] [Citation(s) in RCA: 48] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
|
33
|
Abstract
When studying animal behaviour within an open environment, movement-related data are often important for behavioural analyses. Therefore, simple and efficient techniques are needed to present and analyze the data of such movements. However, it is challenging to present both spatial and temporal information of movements within a two-dimensional image representation. To address this challenge, we developed the spectral time-lapse (STL) algorithm that re-codes an animal’s position at every time point with a time-specific color, and overlays it with a reference frame of the video, to produce a summary image. We additionally incorporated automated motion tracking, such that the animal’s position can be extracted and summary statistics such as path length and duration can be calculated, as well as instantaneous velocity and acceleration. Here we describe the STL algorithm and offer a freely available MATLAB toolbox that implements the algorithm and allows for a large degree of end-user control and flexibility.
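The STL algorithm's core idea, recoding each position with a time-specific colour over a reference frame and deriving trajectory statistics, can be sketched in a few lines. This is an illustrative NumPy re-implementation of the idea, not the authors' MATLAB toolbox:

```python
import numpy as np

def spectral_time_lapse(reference, positions):
    """Overlay a trajectory on a grayscale reference frame, colouring each
    position by its time index (early = blue, late = red).

    reference: (h, w) grayscale image with values in [0, 1].
    positions: (n, 2) integer (row, col) animal positions, one per frame.
    Returns an (h, w, 3) RGB summary image.
    """
    img = np.repeat(reference[..., None], 3, axis=2).astype(float)
    n = len(positions)
    for t, (r, c) in enumerate(positions):
        frac = t / max(n - 1, 1)              # 0 at start, 1 at end
        img[r, c] = (frac, 0.0, 1.0 - frac)   # time-specific colour
    return img

def path_length(positions):
    """Total distance travelled along the trajectory, in pixels."""
    p = np.asarray(positions, dtype=float)
    return float(np.linalg.norm(np.diff(p, axis=0), axis=1).sum())
```

Dividing `path_length` by the trajectory duration gives the mean speed; per-frame differences of the same displacements give instantaneous velocity and acceleration, as the toolbox reports.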
|
34
|
|
35
|
ETHOWATCHER: validation of a tool for behavioral and video-tracking analysis in laboratory animals. Comput Biol Med 2012; 42:257-64. [DOI: 10.1016/j.compbiomed.2011.12.002] [Citation(s) in RCA: 86] [Impact Index Per Article: 7.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/20/2010] [Revised: 07/11/2011] [Accepted: 12/02/2011] [Indexed: 11/30/2022]
|
36
|
An infrared range camera-based approach for three-dimensional locomotion tracking and pose reconstruction in a rodent. J Neurosci Methods 2011; 201:116-23. [PMID: 21835202 DOI: 10.1016/j.jneumeth.2011.07.019] [Citation(s) in RCA: 37] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/17/2011] [Revised: 07/20/2011] [Accepted: 07/21/2011] [Indexed: 11/18/2022]
Abstract
We introduce an automated three-dimensional (3D) locomotion tracking and pose reconstruction system for rodents with superior robustness, speed, reliability, resolution, simplicity, and cost. An off-the-shelf composite infrared (IR) range camera was adopted to capture high-resolution depth images (640 × 480 × 2048 pixels at 20 Hz) for automated behavior analysis. Exploiting the inherent 3D structure of the depth images, we developed a compact algorithm to reconstruct locomotion and body behavior with high temporal and spatial resolution. Since the range camera operates in the IR spectrum, interference from the visible spectrum did not affect tracking performance. The accuracy of our system was 98.1 ± 3.2%, and validation showed a strong correlation between automated and manual tracking. The system also replicates a detailed dynamic rat model in virtual space, demonstrating in detail the movements of the extremities and locomotion on varied terrain.
|
37
|
|
38
|
Automatic system for analysis of locomotor activity in rodents—A reproducibility study. J Neurosci Methods 2011; 195:216-21. [DOI: 10.1016/j.jneumeth.2010.12.016] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/12/2010] [Revised: 12/11/2010] [Accepted: 12/13/2010] [Indexed: 11/28/2022]
|
39
|
Application of a Disposable Screen-Printed Electrode to Depression Diagnosis for Laboratory Rats Based on Blood Serotonin Detection. ANAL SCI 2011; 27:839-43. [DOI: 10.2116/analsci.27.839] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022]
|
40
|
The utility of the minipig as an animal model in regulatory toxicology. J Pharmacol Toxicol Methods 2010; 62:196-220. [DOI: 10.1016/j.vascn.2010.05.009] [Citation(s) in RCA: 309] [Impact Index Per Article: 22.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/30/2009] [Revised: 05/21/2010] [Accepted: 05/24/2010] [Indexed: 11/26/2022]
|
41
|
|
42
|
The precision of video and photocell tracking systems and the elimination of tracking errors with infrared backlighting. J Neurosci Methods 2010; 188:45-52. [PMID: 20138914 DOI: 10.1016/j.jneumeth.2010.01.035] [Citation(s) in RCA: 30] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2009] [Revised: 01/20/2010] [Accepted: 01/29/2010] [Indexed: 11/19/2022]
Abstract
Automated tracking offers a number of advantages over both manual and photocell tracking methodologies, including increased reliability, validity, and flexibility of application. Despite the advantages that video offers, our experience has been that video systems cannot track a mouse consistently when its coat color is in low contrast with the background. Furthermore, the local lab lighting can influence how well results are quantified. To test the effect of lighting, we built devices that provide a known path length for any given trial duration, at a velocity close to the average speed of a mouse in the open-field and the circular water maze. We found that the validity of results from two commercial video tracking systems (ANY-maze and EthoVision XT) depends greatly on the level of contrast and the quality of the lighting. A photocell detection system was immune to lighting problems but yielded a path length that deviated from the true length. Excellent precision was achieved consistently, however, with video tracking using infrared backlighting in both the open field and water maze. A high correlation (r=0.98) between the two software systems was observed when infrared backlighting was used with live mice.
|
43
|
Antidepressant effects of ginseng total saponins in the forced swimming test and chronic mild stress models of depression. Prog Neuropsychopharmacol Biol Psychiatry 2009; 33:1417-24. [PMID: 19632285 DOI: 10.1016/j.pnpbp.2009.07.020] [Citation(s) in RCA: 159] [Impact Index Per Article: 10.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/10/2009] [Revised: 07/14/2009] [Accepted: 07/17/2009] [Indexed: 01/19/2023]
Abstract
Ginseng total saponins (GTS) are the major active components of Panax ginseng C.A. Meyer, which has been used as a popular tonic herb for 2000 years in Far East countries. In the present study, two classic animal models, the forced swimming test (FST) and the chronic mild stress (CMS) model, were used to evaluate the antidepressant-like activities of GTS. GTS at doses of 50 and 100 mg/kg significantly reduced immobility time in the FST in mice after 7-day treatment. GTS also reversed the reduction in the sucrose preference index, the decrease in locomotor activity, and the prolonged latency to feed in a novel environment displayed by CMS rats. In addition, HPLC-ECD and immunohistochemical staining analyses indicated that the CMS-induced decreases in monoamine neurotransmitter concentrations and brain-derived neurotrophic factor (BDNF) expression in the hippocampus were almost completely reversed by GTS. In conclusion, GTS exerts antidepressant-like effects in two highly specific and predictive animal models of depression; this antidepressant activity may be mediated partly through enhancement of monoamine neurotransmitter concentrations and BDNF expression in the hippocampus.
|
44
|
|
45
|
Video imaging system for automated shaping and analysis of complex locomotory behavior. J Neurosci Methods 2009; 182:34-42. [PMID: 19501618 DOI: 10.1016/j.jneumeth.2009.05.016] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/03/2009] [Revised: 05/20/2009] [Accepted: 05/25/2009] [Indexed: 10/20/2022]
Abstract
Although many observational technologies have been developed for the study of behavior, most suffer from an inability to engender highly reproducible behaviors that can be observed and modified. We have developed ACROBAT (Automated Control in Real-Time of Operant Behavior and Training), a video imaging system and associated computer algorithms that allow fully automated shaping and analysis of complex locomotory behaviors. While this operant conditioning system is particularly useful for measuring the acquisition and maintenance of complex topographies, it also provides a more general and user-friendly platform on which to develop novel paradigms for the study of learning and memory in animals. In this paper we describe the instrumentation and software, demonstrate the use of ACROBAT to shape a specific topography, and show how the system can facilitate the study of arthritic pain in mice.
|
46
|
Experimental model of lacunar infarction in the gyrencephalic brain of the miniature pig: neurological assessment and histological, immunohistochemical, and physiological evaluation of dynamic corticospinal tract deformation. Stroke 2007; 39:205-12. [PMID: 18048856 DOI: 10.1161/strokeaha.107.489906] [Citation(s) in RCA: 67] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/07/2023]
Abstract
BACKGROUND AND PURPOSE Lacunar infarction accounts for 25% of ischemic strokes, but its pathological characteristics have not been investigated systematically. A new experimental model of lacunar infarction in the miniature pig was developed to investigate pathophysiological changes in the corticospinal tract from the acute to the chronic phase. METHODS Thirty-five miniature pigs underwent transcranial surgery for permanent anterior choroidal artery occlusion. Animals recovered for 24 hours (n=7); 2 (n=5), 3 (n=2), 4 (n=2), 6 (n=1), 7 (n=7), 8 (n=2), or 9 days (n=1); 2 weeks (n=2); 4 weeks (n=3); or more than 4 weeks (n=3). Neurological, electrophysiological, histological, and MRI assessments were performed. Seven additional miniature pigs underwent transient anterior choroidal artery occlusion to study muscle motor-evoked potentials and evaluate corticospinal tract function during occlusion. RESULTS The protocol had a 91.4% success rate in inducing internal capsule infarction of 286±153 mm³ (mean±SD). Motor-evoked potentials revealed the presence of penumbral tissue in the internal capsule after 6 to 15 minutes of anterior choroidal artery occlusion. Total neurological deficit scores of 15.0 (95% CI, 13.5 to 16.4) and 3.4 (95% CI, 0.3 to 6.4) were recorded for the permanent occlusion and sham groups, respectively (P<0.001; maximum score 25), with motor deficit scores of 3.4 (95% CI, 2.9 to 4.0) and 0.0 (95% CI, 0.0 to 0.0), respectively (P<0.001; maximum score 9). Histology revealed that the internal capsule lesion expands gradually from the acute to the chronic phase. CONCLUSIONS This new model of lacunar infarction induces a reproducible infarct in subcortical white matter with a measurable functional deficit and evidence of penumbral tissue acutely.
Collapse
|
47
|
Assessment of neonatal rat's activity by the automated registration of the animal entries in the squares of a testing arena. J Neurosci Methods 2007; 164:299-303. [PMID: 17548112 DOI: 10.1016/j.jneumeth.2007.04.014] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2007] [Revised: 04/11/2007] [Accepted: 04/22/2007] [Indexed: 11/29/2022]
Abstract
Automated registration of neonatal rat entries into the squares of a testing chamber is proposed for assessing locomotion. The method detects paddling and pivoting activities that are not accompanied by forward movement of the animal. The technique is also relatively insensitive to nonlocomotor changes in a pup's body position, such as breathing and shaking, and thus offers selective detection of locomotor-related activity. The method permits the evaluation of spontaneous and stimulated motor activity of neonatal rats using relatively short test durations and a minimal number of animals.
Collapse
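The square-entry measure described in the abstract above can be approximated from any position trace: an entry is counted each time the tracked point crosses into a different grid square, which also registers pivoting that produces little net forward displacement. A minimal sketch in Python; the grid size and coordinates are illustrative assumptions, not values from the paper:

```python
# Count entries into grid squares from a sequence of (x, y) centroid
# positions. Square size and the example trace are illustrative only.

def square_entries(positions, square_size):
    """Return the number of times the tracked point enters a new square."""
    entries = 0
    prev_square = None
    for x, y in positions:
        square = (int(x // square_size), int(y // square_size))
        if square != prev_square:
            if prev_square is not None:
                entries += 1  # crossed into a different square
            prev_square = square
    return entries

# A pivoting animal sweeps through adjacent squares without net forward
# movement, so entries accumulate even at low overall displacement.
trace = [(5, 5), (5, 5), (11, 5), (11, 11), (5, 11), (5, 5)]
print(square_entries(trace, square_size=10))  # 4
```

Because only square transitions are counted, small in-place jitter (breathing, shaking) that stays within one square contributes nothing, matching the selectivity the abstract claims.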
|
48
|
The effect of the inter-phase delay interval in the spontaneous object recognition test for pigs. Behav Brain Res 2007; 181:210-7. [PMID: 17524499 DOI: 10.1016/j.bbr.2007.04.007] [Citation(s) in RCA: 24] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/05/2007] [Revised: 04/09/2007] [Accepted: 04/15/2007] [Indexed: 10/23/2022]
Abstract
Interest in using the pig is growing in the neuroscience community. Several disease models have been developed, creating a need for validated behavioural paradigms in these animals. Here, we report the effect of different inter-phase delay intervals on the performance of Göttingen minipigs in the spontaneous object recognition test. The test consisted of a sample phase and a test phase. First, the pigs explored two identical objects. After a 10-min, 1-h, or 24-h delay, two different objects were presented: one familiar from the sample phase and one novel. A difference in exploration time between the novel and the familiar object was interpreted as recognition of the familiar object. We scored the exploration times both manually and automatically and compared the methods. In our set-up, the pigs showed strong discrimination between novel and familiar objects after a 10-min inter-phase delay and no discrimination after 24 h. After a 1-h delay, the pigs still showed significant habituation to the familiar object, but no discrimination was observed. Discrimination between the two objects was mainly confined to the first half of the test phase, and we observed high between-subject variation. Furthermore, automatic tracking was valid for determining habituation and discrimination parameters but led to an overestimation of individual measurements. We conclude that the spontaneous object recognition test for pigs is sensitive to increasing inter-phase delay intervals and that automatic data acquisition can be applied.
Collapse
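Recognition in tests like the one above is commonly summarized by a discrimination index computed from novel- and familiar-object exploration times. The sketch below uses the standard (novel − familiar)/(novel + familiar) ratio as an illustration; it is not necessarily the exact statistic used in the paper:

```python
def discrimination_index(novel_s, familiar_s):
    """(novel - familiar) / (novel + familiar).
    0 means no discrimination; positive values indicate a preference
    for the novel object, i.e. recognition of the familiar one."""
    total = novel_s + familiar_s
    if total == 0:
        raise ValueError("no exploration recorded")
    return (novel_s - familiar_s) / total

# Illustrative numbers: clear discrimination vs. none.
print(discrimination_index(30.0, 10.0))  # 0.5
print(discrimination_index(15.0, 15.0))  # 0.0
```

Restricting the index to the first half of the test phase, as the abstract suggests, amounts to computing it on exploration times accumulated only over that window.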
|
49
|
The use of pigs in neuroscience: Modeling brain disorders. Neurosci Biobehav Rev 2007; 31:728-51. [PMID: 17445892 DOI: 10.1016/j.neubiorev.2007.02.003] [Citation(s) in RCA: 359] [Impact Index Per Article: 21.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/27/2006] [Revised: 02/05/2007] [Accepted: 02/18/2007] [Indexed: 11/22/2022]
Abstract
The use of pigs in neuroscience research has increased in the past decade, reflecting broader recognition of the pig's potential for experimental modeling of human brain disorders. The volume of available background data on pig brain anatomy and neurochemistry has grown considerably in recent years. The pig brain, which is gyrencephalic, resembles the human brain in anatomy, growth, and development more closely than do the brains of commonly used small laboratory animals. The size of the pig brain permits the identification of cortical and subcortical structures by imaging techniques. Furthermore, the pig is an increasingly popular laboratory animal for transgenic manipulation of neural genes. The present paper evaluates the potential for modeling symptoms, phenomena, or constructs of human brain diseases in pigs, neuropsychiatric disorders in particular. Important practical and ethical aspects of using pigs as experimental animals, as they pertain to relevant in vivo experimental brain techniques, are reviewed. Finally, current knowledge of behavioral processes, including learning and memory, is reviewed to complete the summary of the pig's suitability for experimental models of diverse human brain disorders.
Collapse
|
50
|
A simple webcam-based approach for the measurement of rodent locomotion and other behavioural parameters. J Neurosci Methods 2006; 157:91-7. [PMID: 16701901 DOI: 10.1016/j.jneumeth.2006.04.005] [Citation(s) in RCA: 33] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/24/2005] [Revised: 03/14/2006] [Accepted: 04/06/2006] [Indexed: 11/28/2022]
Abstract
We describe a simple and inexpensive approach for evaluating the position and locomotion of rodents in an arena. The system is based on webcam recording of animal behaviour with subsequent analysis by customized software. Based on black/white differentiation, it provides rapid evaluation of animal position over time and can be used in a myriad of behavioural tasks in which locomotion, velocity, or place preference are variables of interest. A brief review of the results obtained so far with this system and a discussion of other possible applications in behavioural neuroscience are also included. The system can be easily implemented in most laboratories and can significantly reduce the time and costs involved in behavioural analysis, especially in developing countries.
Collapse
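The black/white differentiation described above can be sketched with plain NumPy: threshold each grayscale frame, take the centroid of the dark (animal) pixels, and accumulate centroid-to-centroid distance as locomotion. The threshold value, frame size, and synthetic frames below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def track_dark_blob(frames, threshold=50):
    """Return one centroid per frame for below-threshold (dark) pixels."""
    centroids = []
    for frame in frames:
        ys, xs = np.nonzero(frame < threshold)
        if len(xs) == 0:
            centroids.append(None)  # animal not detected in this frame
        else:
            centroids.append((xs.mean(), ys.mean()))
    return centroids

def path_length(centroids):
    """Sum of Euclidean distances between consecutive detected centroids."""
    total = 0.0
    prev = None
    for c in centroids:
        if c is not None and prev is not None:
            total += float(np.hypot(c[0] - prev[0], c[1] - prev[1]))
        if c is not None:
            prev = c
    return total

# Two synthetic 20x20 frames with a dark 2x2 blob moving 3 px rightward.
f1 = np.full((20, 20), 255, dtype=np.uint8); f1[5:7, 5:7] = 0
f2 = np.full((20, 20), 255, dtype=np.uint8); f2[5:7, 8:10] = 0
print(path_length(track_dark_blob([f1, f2])))  # 3.0
```

Velocity follows by dividing the per-frame displacement by the frame interval, and place preference by testing which region of the arena each centroid falls in.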
|