1. Akbarein H, Taaghi MH, Mohebbi M, Soufizadeh P. Applications and Considerations of Artificial Intelligence in Veterinary Sciences: A Narrative Review. Vet Med Sci 2025; 11:e70315. PMID: 40173266; PMCID: PMC11964155; DOI: 10.1002/vms3.70315.
Abstract
In recent years, artificial intelligence (AI) has brought about a significant transformation in healthcare, streamlining manual tasks and allowing professionals to focus on critical responsibilities while AI handles complex procedures. This shift is not limited to human healthcare; it extends to veterinary medicine as well, where AI's predictive analytics and diagnostic abilities are improving standards of animal care. Consequently, healthcare systems stand to gain notable advantages, such as enhanced accessibility, treatment efficacy, and optimized resource allocation, owing to the seamless integration of AI. This article presents a comprehensive review of the manifold applications of AI within the domain of veterinary science, categorizing them into four domains: clinical practice, biomedical research, public health, and administration. It also examines the primary machine learning algorithms used in relevant studies, highlighting emerging trends in the field. The research serves as a valuable resource for scholars, offering insights into current trends and serving as a starting point for those new to the field.
Affiliation(s)
- Hesameddin Akbarein: Department of Food Hygiene & Quality Control, Faculty of Veterinary Medicine, University of Tehran, Tehran, Iran
- Mahyar Mohebbi: Department of Surgery and Radiology, Faculty of Veterinary Medicine, University of Tehran, Tehran, Iran
- Parham Soufizadeh: Faculty of Veterinary Medicine, University of Tehran, Tehran, Iran; Department of Research and Development, Intellia Agency, Tehran, Iran

2. Phelipon R, Lansade L, Razzaq M. Using deep learning models to decode emotional states in horses. Sci Rep 2025; 15:13154. PMID: 40269006; PMCID: PMC12018932; DOI: 10.1038/s41598-025-95853-7.
Abstract
In this study, we explore machine learning models for predicting emotional states in ridden horses. We manually label the images to train the models in a supervised manner. We perform data exploration and use different cropping methods, mainly based on Yolo and Faster R-CNN, to create two new datasets: 1) the cropped body, and 2) the cropped head dataset. We train various convolutional neural network (CNN) models on both cropped and uncropped datasets and compare their performance in emotion prediction of ridden horses. Despite the cropped head dataset lacking important regions like the tail (commonly annotated by experts), it yields the best results with an accuracy of 87%, precision of 79%, and recall of 97%. Furthermore, we update our models using various techniques, such as transfer learning and fine-tuning, to further improve their performance. Finally, we employ three interpretation methods to analyze the internal workings of our models, finding that LIME effectively identifies features similar to those used by experts for annotation.
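The cropping, CNN training, and transfer-learning steps summarised in this abstract are not reproduced here as published code; the snippet below is only a minimal sketch of the fine-tuning stage, assuming cropped horse-head images already sorted into per-emotion folders. The folder layout (data/train/<emotion>/*.jpg), the ResNet-18 backbone, and all hyperparameters are illustrative choices, not the authors'; the LIME interpretation step is omitted.

# Hedged sketch: fine-tune a pretrained CNN on cropped horse-head images.
# Assumes images are organised as data/train/<emotion_class>/*.jpg (illustrative layout).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),  # ImageNet stats
])
train_ds = datasets.ImageFolder("data/train", transform=tfm)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True, num_workers=2)

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))  # one output per emotion class
model = model.to(device)

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):  # illustrative number of epochs
    model.train()
    for x, y in train_dl:
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()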
Affiliation(s)
- Romane Phelipon: INRAE, CNRS, Université de Tours, PRC, 37380, Nouzilly, France
- Lea Lansade: INRAE, CNRS, Université de Tours, PRC, 37380, Nouzilly, France
- Misbah Razzaq: INRAE, CNRS, Université de Tours, PRC, 37380, Nouzilly, France

3. Kschonek J, Twele L, Deters K, Miller M, Reinmold J, Emmerich I, Hennig-Pauka I, Kemper N, Kreienbrock L, Wendt M, Kästner S, Grosse Beilage E. Part I: understanding pain in pigs-basic knowledge about pain assessment, measures and therapy. Porcine Health Manag 2025; 11:12. PMID: 40069905; PMCID: PMC11895375; DOI: 10.1186/s40813-025-00421-0.
Abstract
BACKGROUND Pigs can suffer from pain due to spontaneously occurring diseases, wounds, injuries, trauma, and physiological conditions such as the farrowing process; however, this pain is often neglected. To increase knowledge and awareness about this phenomenon, the current article presents a scoping review of basic and new approaches for identifying, evaluating, and treating pain in pigs. METHODS A scoping review was conducted with results from a search of the electronic databases VetSearch and CABI. With regard to eligibility criteria, 49 out of 725 publications between 2015 and the end of March 2023 were included. The findings are narratively synthesized and reported in accordance with the PRISMA-ScR guideline. RESULTS The results of this review showed that practitioners need to consider pain not only as a sign of a disease but also as a critical aspect of welfare. If both the symptoms of pain and the underlying reasons remain unassessed, the longevity and prosperity of pigs may be at risk. In this respect, veterinarians are obliged to know about the intricacies of pain and pain mechanisms and to provide adequate treatment for their patients. CONCLUSION It is pivotal to increase knowledge about pain mechanisms, the reasons for heterogeneity in behavioural signs of pain, and methods for evaluating whether a pig is experiencing pain. This article will help practitioners update their knowledge of this topic and discuss the implications for everyday practice.
Affiliation(s)
- Julia Kschonek: Institute for Biometry, Epidemiology and Information Processing (IBEI), University of Veterinary Medicine, Foundation, Hannover, Bünteweg 2, 30559, Hannover, Germany
- Lara Twele: Clinic for Horses, University of Veterinary Medicine, Foundation, Hannover, Bünteweg 9, 30559, Hannover, Germany
- Kathrin Deters: Field Station for Epidemiology, University of Veterinary Medicine, Foundation, Hannover, Büscheler Str. 9, 49456, Bakum, Germany
- Moana Miller: Institute for Animal Hygiene, Animal Welfare and Farm Animal Behavior, University of Veterinary Medicine, Foundation, Hannover, Bischofsholer Damm 15, 30173, Hannover, Germany
- Jennifer Reinmold: Field Station for Epidemiology, University of Veterinary Medicine, Foundation, Hannover, Büscheler Str. 9, 49456, Bakum, Germany
- Ilka Emmerich: Institute of Pharmacology, Pharmacy and Toxicology, Faculty of Veterinary Medicine, University Leipzig, An den Tierkliniken 39, 04103, Leipzig, Germany
- Isabel Hennig-Pauka: Field Station for Epidemiology, University of Veterinary Medicine, Foundation, Hannover, Büscheler Str. 9, 49456, Bakum, Germany
- Nicole Kemper: Institute for Animal Hygiene, Animal Welfare and Farm Animal Behavior, University of Veterinary Medicine, Foundation, Hannover, Bischofsholer Damm 15, 30173, Hannover, Germany
- Lothar Kreienbrock: Institute for Biometry, Epidemiology and Information Processing (IBEI), University of Veterinary Medicine, Foundation, Hannover, Bünteweg 2, 30559, Hannover, Germany
- Michael Wendt: Clinic for Swine and Small Ruminants, Forensic Medicine and Ambulatory Service, University of Veterinary Medicine, Foundation, Hannover, Bischofsholer Damm 15, 30173, Hannover, Germany
- Sabine Kästner: Clinic for Small Animals, University of Veterinary Medicine, Foundation, Hannover, Bünteweg 2, 30559, Hannover, Germany
- Elisabeth Grosse Beilage: Field Station for Epidemiology, University of Veterinary Medicine, Foundation, Hannover, Büscheler Str. 9, 49456, Bakum, Germany

4. Bhave A, Kieson E, Hafner A, Gloor PA. Identifying Novel Emotions and Wellbeing of Horses from Videos Through Unsupervised Learning. Sensors (Basel) 2025; 25:859. PMID: 39943498; PMCID: PMC11819734; DOI: 10.3390/s25030859.
Abstract
This research applies unsupervised learning on a large original dataset of horses in the wild to identify previously unidentified horse emotions. We construct a novel, high-quality, diverse dataset of 3929 images consisting of five wild horse breeds worldwide at different geographical locations. We base our analysis on the seven Panksepp emotions of mammals "Exploring", "Sadness", "Playing", "Rage", "Fear", "Affectionate" and "Lust", along with one additional emotion "Pain" which has been shown to be highly relevant for horses. We apply the contrastive learning framework MoCo (Momentum Contrast for Unsupervised Visual Representation Learning) on our dataset to predict the seven Panksepp emotions and "Pain" using unsupervised learning. We significantly modify the MoCo framework, building a custom downstream classifier network that connects with a frozen CNN encoder that is pretrained using MoCo. Our method allows the encoder network to learn similarities and differences within image groups on its own without labels. The clusters thus formed are indicative of deeper nuances and complexities within a horse's mood, which can possibly hint towards the existence of novel and complex equine emotions.
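The "frozen MoCo-pretrained encoder plus custom downstream classifier" design described above can be illustrated with the generic pattern below. The MoCo pretraining itself is omitted, an ImageNet-pretrained ResNet-50 merely stands in for the MoCo encoder, and the head sizes and class count are assumptions rather than the authors' architecture.

# Hedged sketch of a frozen-encoder + trainable classifier head, as used after
# MoCo-style self-supervised pretraining. An ImageNet ResNet-50 stands in for
# the MoCo-pretrained encoder; in the paper the encoder weights come from MoCo.
import torch
import torch.nn as nn
from torchvision import models

class FrozenEncoderClassifier(nn.Module):
    def __init__(self, n_classes: int = 8):  # e.g. 7 Panksepp emotions + "Pain"
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
        backbone.fc = nn.Identity()           # keep the 2048-d pooled features
        for p in backbone.parameters():
            p.requires_grad = False           # encoder stays frozen
        self.encoder = backbone
        self.head = nn.Sequential(            # small trainable downstream classifier
            nn.Linear(2048, 256), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(256, n_classes),
        )

    def forward(self, x):
        with torch.no_grad():
            feats = self.encoder(x)
        return self.head(feats)

model = FrozenEncoderClassifier()
# Only the head's parameters are handed to the optimiser; the encoder never updates.
opt = torch.optim.Adam(model.head.parameters(), lr=1e-3)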
Affiliation(s)
- Aarya Bhave: Massachusetts Institute of Technology, System Design & Management, Cambridge, MA 02142, USA
- Alina Hafner: TUM School of Computation, Information and Technology, Technical University of Munich, Arcisstraße 21, 80333 Munich, Germany
- Peter A. Gloor: Massachusetts Institute of Technology, System Design & Management, Cambridge, MA 02142, USA

5. Feighelstein M, Luna SP, Silva NO, Trindade PE, Shimshoni I, van der Linden D, Zamansky A. Comparison between AI and human expert performance in acute pain assessment in sheep. Sci Rep 2025; 15:626. PMID: 39754012; PMCID: PMC11698723; DOI: 10.1038/s41598-024-83950-y.
Abstract
This study explores the question whether Artificial Intelligence (AI) can outperform human experts in animal pain recognition using sheep as a case study. It uses a dataset of N = 48 sheep undergoing surgery with video recordings taken before (no pain) and after (pain) surgery. Four veterinary experts used two types of pain scoring scales: the sheep facial expression scale (SFPES) and the Unesp-Botucatu composite behavioral scale (USAPS), which is the 'golden standard' in sheep pain assessment. The developed AI pipeline based on CLIP encoder significantly outperformed human facial scoring (AUC difference = 0.115, p < 0.001) when having access to the same visual information (front and lateral face images). It further effectively equaled human USAPS behavioral scoring (AUC difference = 0.027, p = 0.163), but the small improvement was not statistically significant. The fact that the machine can outperform human experts in recognizing pain in sheep when exposed to the same visual information has significant implications for clinical practice, which warrant further scientific discussion.
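Comparisons such as the reported AUC differences between the AI pipeline and the human scales can be made with a paired bootstrap over cases; the snippet below is a generic illustration of that idea using synthetic scores, not the statistical procedure actually used in the paper.

# Hedged sketch: paired bootstrap comparison of two AUCs (e.g. model vs. human scores)
# computed on the same animals. All numbers below are synthetic placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=96)                 # e.g. 48 sheep x 2 conditions (pain / no pain)
model_scores = y_true * 0.6 + rng.normal(0, 0.30, 96)
human_scores = y_true * 0.5 + rng.normal(0, 0.35, 96)

def auc_diff(idx):
    return (roc_auc_score(y_true[idx], model_scores[idx])
            - roc_auc_score(y_true[idx], human_scores[idx]))

observed = auc_diff(np.arange(len(y_true)))
boots = []
for _ in range(2000):                                # resample cases with replacement
    idx = rng.integers(0, len(y_true), len(y_true))
    if len(np.unique(y_true[idx])) == 2:             # both classes needed for an AUC
        boots.append(auc_diff(idx))
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"AUC difference = {observed:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")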
Affiliation(s)
- Stelio P Luna: School of Veterinary Medicine and Animal Science, São Paulo State University (Unesp), São Paulo, Brazil
- Nuno O Silva: School of Veterinary Medicine and Animal Science, São Paulo State University (Unesp), São Paulo, Brazil
- Pedro E Trindade: Department of Population Pathobiology, North Carolina State University, Raleigh, USA
- Ilan Shimshoni: Department of Information Systems, University of Haifa, Haifa, Israel
- Dirk van der Linden: Department of Computer and Information Sciences, Northumbria University, Newcastle upon Tyne, UK
- Anna Zamansky: Department of Information Systems, University of Haifa, Haifa, Israel

6. Merkies K, Trudel K. How well can you tell? Success of human categorisation of horse behavioural responses depicted in media. Anim Welf 2024; 33:e50. PMID: 39600357; PMCID: PMC11589072; DOI: 10.1017/awf.2024.55.
Abstract
Horses employ a range of subtle to overt behaviours to communicate their current affective state. Humans who are more cognisant of their own bodily sensations may be more attuned to recognising affective states in horses (Equus caballus) thereby promoting positive human-horse interactions. This study investigated human ability to categorise human-horse interactions depicted in media relative to equine behaviour experts and compared participant scores to their level of interoception. Using an online survey, participants (n = 534) categorised 31 photographs and videos as (overt) positive, likely (subtle) positive, neutral, likely (subtle) negative or (overt) negative human-horse interactions from the horse's point of view and completed the Multidimensional Assessment of Interoceptive Awareness questionnaire (MAIA-2) to assess their level of interoception. Demographic information was also collected (age, gender, education, level of experience with horses, location). Participants differed from expert categorisations of horse affective states across all categories, exactly matching experts only 52.5% of the time and approximately matching experts for positive and negative valence 78.5% of the time. The MAIA-2 did not predict participant ability to accurately categorise human-horse interactions. Women outperformed men in categorising overt positive, overt negative and subtle negative human-horse interactions. Increased levels of education and greater experience with horses were associated with improved categorisation of certain human-horse interactions. More training or awareness is needed to recognise behavioural indicators of horse affect to guide appropriate human-horse activities that impact horse welfare.
Affiliation(s)
- Katrina Merkies: Department of Animal Biosciences and Campbell Centre for the Study of Animal Welfare, University of Guelph, Guelph, ON N1G 2W1, Canada
- Katelyn Trudel: Department of Animal Biosciences and Campbell Centre for the Study of Animal Welfare, University of Guelph, Guelph, ON N1G 2W1, Canada

7. Chiavaccini L, Gupta A, Anclade N, Chiavaccini G, De Gennaro C, Johnson AN, Portela DA, Romano M, Vettorato E, Luethy D. Automated acute pain prediction in domestic goats using deep learning-based models on video-recordings. Sci Rep 2024; 14:27104. PMID: 39511381; PMCID: PMC11543859; DOI: 10.1038/s41598-024-78494-0.
Abstract
Facial expressions are essential in animal communication, and facial expression-based pain scales have been developed for different species. Automated pain recognition offers a valid alternative to manual annotation with growing evidence across species. This study applied machine learning (ML) methods, using a pre-trained VGG-16 base and a Support Vector Machine classifier to automate pain recognition in caprine patients in hospital settings, evaluating different frame extraction rates and validation techniques. The study included goats of different breed, age, sex, and varying medical conditions presented to the University of Florida's Large Animal Hospital. Painful status was determined using the UNESP-Botucatu Goat Acute Pain Scale. The final dataset comprised images from 40 goats (20 painful, 20 non-painful), with 2,253 'non-painful' and 3,154 'painful' images at 1 frame per second (FPS) extraction rate and 7,630 'non-painful' and 9,071 'painful' images at 3 FPS. Images were used to train deep learning-based models with different approaches. The model input was raw images, and pain presence was the target attribute (model output). For the single train-test split and 5-fold cross-validation, the models achieved approximately 80% accuracy, while the subject-wise 10-fold cross-validation showed mean accuracies above 60%. These findings suggest ML's potential in goat pain assessment.
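A minimal sketch of the "pre-trained VGG-16 features plus Support Vector Machine" pipeline with subject-wise cross-validation follows. The 4096-dimensional feature layer, the RBF-SVM settings, and the random placeholder arrays are illustrative assumptions; the grouping by goat only demonstrates how subject-wise validation keeps frames from one animal out of both training and test folds.

# Hedged sketch: frozen VGG-16 as a feature extractor + SVM classifier,
# evaluated subject-wise so frames from one goat never appear in both folds.
import numpy as np
import torch
from torchvision import models
from sklearn.svm import SVC
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
vgg.classifier = torch.nn.Sequential(*list(vgg.classifier.children())[:-1])  # 4096-d output
vgg.eval()

@torch.no_grad()
def extract_features(batch):             # batch: float tensor [N, 3, 224, 224]
    return vgg(batch).numpy()

# Placeholders: in practice X would come from extract_features over the video frames.
X = np.random.rand(200, 4096)            # one row per extracted frame
y = np.random.randint(0, 2, 200)         # 1 = painful, 0 = non-painful
goat_id = np.random.randint(0, 40, 200)  # which goat each frame belongs to

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, groups=goat_id, cv=GroupKFold(n_splits=10))
print("subject-wise 10-fold accuracy:", scores.mean())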
Affiliation(s)
- Ludovica Chiavaccini, Anjali Gupta, Nicole Anclade, Chiara De Gennaro, Alanna N Johnson, Diego A Portela, Marta Romano, and Enzo Vettorato: Department of Comparative, Diagnostic and Population Medicine, College of Veterinary Medicine, University of Florida, 2015 SW 16th Avenue, PO BOX 100123, Gainesville, FL, 32610-0123, USA
- Daniela Luethy: Department of Clinical Studies - New Bolton Center, School of Veterinary Medicine, University of Pennsylvania, Philadelphia, PA, USA

8. Zimmermann B, Castro ANC, Lendez PA, Carrica Illia M, Carrica Illia MP, Teyseyre AR, Toloza JM, Ghezzi MD, Mota-Rojas D. Anatomical and functional basis of facial expressions and their relationship with emotions in horses. Res Vet Sci 2024; 180:105418. PMID: 39303445; DOI: 10.1016/j.rvsc.2024.105418.
Abstract
An emotion is defined as the affective response to a stimulus that leads to specific bodily changes, enabling individuals to react to positive or negative environmental conditions. In the absence of speech, emotions in animals are primarily studied by observing expressive components, such as facial expressions. This review aims to analyze the available literature on the influence of environmental stimuli on measurable behaviors in horses, describing the anatomical components involved in perception at the central nervous system level and the efferent pathways that trigger facial muscle contraction or relaxation, thus altering facial expressions. Additionally, articles addressing the function of facial expressions in communication are discussed, emphasizing their role in social interactions in this species. While there is limited research on equine neurophysiology, considering the common structure of the limbic system in most mammals, studies conducted on canines and primates were taken into account. In conclusion, the article underscores the importance of understanding equine facial expressions to assess their emotional states and, by extension, their welfare.
Affiliation(s)
- Barbara Zimmermann: Animal Welfare Area, Faculty of Veterinary Sciences (FCV), Universidad Nacional del Centro de la Provincia de Buenos Aires (UNCPBA), University Campus, Tandil 7000, Argentina; Anatomy Area, FCV, UNCPBA, University Campus, Tandil 7000, Argentina
- Alejandra Nelly Cristina Castro: Anatomy Area, Faculty of Veterinary Sciences (FCV), Universidad Nacional del Centro de la Provincia de Buenos Aires (UNCPBA), University Campus, Tandil 7000, Argentina
- Pamela Anahí Lendez: Anatomy Area, FCV, UNCPBA, University Campus, Tandil 7000, Argentina; CIVETAN, UNCPBA-CICPBA-CONICET, Tandil, Buenos Aires, Argentina
- Mariano Carrica Illia: Anatomy Area, FCV, UNCPBA, University Campus, Tandil 7000, Argentina
- María Paula Carrica Illia: Anatomy Area, FCV, UNCPBA, University Campus, Tandil 7000, Argentina
- Alfredo Raúl Teyseyre: Higher Institute of Software Engineering of Tandil (ISISTAN), Faculty of Exact Sciences (FCExa), UNCPBA - Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), University Campus, Tandil 7000, Argentina
- Juan Manuel Toloza: Higher Institute of Software Engineering of Tandil (ISISTAN), Faculty of Exact Sciences (FCExa), UNCPBA - Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), University Campus, Tandil 7000, Argentina
- Marcelo Daniel Ghezzi: Animal Welfare Area, FCV, UNCPBA, University Campus, Tandil 7000, Argentina; Anatomy Area, FCV, UNCPBA, University Campus, Tandil 7000, Argentina
- Daniel Mota-Rojas: Neurophysiology, Behaviour and Animal Welfare Assessment, DPAA, Xochimilco Campus, Universidad Autónoma Metropolitana, Ciudad de México 04960, Mexico

9. van Loon JPAM, de Grauw JC, van Dierendonck MC, Burden F, Rickards K. Objective assessment of chronic pain in donkeys using the Donkey Chronic Pain Scale. Vet Anaesth Analg 2024; 51:531-538. PMID: 39142979; DOI: 10.1016/j.vaa.2024.05.011.
Abstract
OBJECTIVE To clinically evaluate previously developed pain scales [Donkey Chronic Pain Composite Pain Scale (DCP-CPS), Donkey Chronic Pain Facial Assessment of Pain (DCP-FAP) and combined Donkey Chronic Pain Scale (DCPS)], including behavioural and facial expression-based variables, for the assessment of chronic pain in donkeys. STUDY DESIGN Prospective, blinded clinical study. ANIMAL A group of 77 donkeys (34 patients and 43 healthy control animals). METHODS Animals were assessed by two observers that were blinded to the condition of the animals. RESULTS Both DCP-CPS and DCP-FAP, and resulting combined DCPS scores, showed good interobserver reliability [intraclass correlation coefficient (ICC) = 0.91, 95% confidence interval (CI) = 0.86-0.95, p < 0.001; ICC = 0.71, CI = 0.50-0.83, p < 0.001 and ICC = 0.84, CI = 0.72-0.91, p < 0.001, respectively]. All scores (DCP-CPS, DCP-FAP and the resulting combined DCPS) were significantly higher for patients than for controls at all time points (p < 0.001 for all three scales). Sensitivity and specificity for identification of pain (cut-off value >3) was 73.0% and 65.1% for DCP-CPS, and 60.9% and 83.3% for DCP-FAP, respectively. For the combined DCPS, sensitivity was 87.0% and specificity 90.9% (cut-off value >6). CONCLUSIONS AND CLINICAL RELEVANCE Based on behavioural and facial expression-based variables, DCPS proved a promising and reproducible tool to assess different types of chronic pain in donkeys. The combination of behavioural and facial expression-based variables showed the best discriminatory characteristics in the current study. Further studies are needed for refinement of these tools.
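The sensitivity and specificity values above follow from a 2x2 table built at the stated cut-off; the short sketch below shows that calculation on made-up DCPS scores, not the study data.

# Hedged sketch: sensitivity / specificity of a pain scale at a cut-off value,
# here ">6" as used for the combined DCPS. Scores below are made-up examples.
import numpy as np

def sens_spec(scores, is_patient, cutoff):
    scores = np.asarray(scores)
    is_patient = np.asarray(is_patient, dtype=bool)
    flagged = scores > cutoff                      # scale calls the animal "painful"
    tp = np.sum(flagged & is_patient)
    fn = np.sum(~flagged & is_patient)
    tn = np.sum(~flagged & ~is_patient)
    fp = np.sum(flagged & ~is_patient)
    sensitivity = tp / (tp + fn)                   # detected painful / all painful animals
    specificity = tn / (tn + fp)                   # cleared controls / all control animals
    return sensitivity, specificity

scores     = [9, 7, 4, 8, 12, 3, 2, 5, 1, 6]       # combined DCPS scores (illustrative)
is_patient = [1, 1, 1, 1, 1,  0, 0, 0, 0, 0]       # 1 = patient, 0 = healthy control
print(sens_spec(scores, is_patient, cutoff=6))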
Affiliation(s)
- Johannes P A M van Loon: Department of Clinical Sciences, Faculty of Veterinary Medicine, Utrecht University, Utrecht, The Netherlands; Sporthorse Medical Diagnostic Centre (SMDC), Heesch, The Netherlands
- Janny C de Grauw: Department of Clinical Sciences, Faculty of Veterinary Medicine, Utrecht University, Utrecht, The Netherlands; Department of Clinical Sciences and Services, Royal Veterinary College, Hatfield, United Kingdom
- Machteld C van Dierendonck: Veterinary Faculty, Ghent University, Ghent, Belgium; Department of Veterinary Sciences, Antwerp University, Antwerp, Belgium; Equus Research & Therapy, Stroe, The Netherlands
- Faith Burden: The Donkey Sanctuary, Sidmouth, Devon, United Kingdom

10. Chiavaccini L, Gupta A, Chiavaccini G. From facial expressions to algorithms: a narrative review of animal pain recognition technologies. Front Vet Sci 2024; 11:1436795. PMID: 39086767; PMCID: PMC11288915; DOI: 10.3389/fvets.2024.1436795.
Abstract
Facial expressions are essential for communication and emotional expression across species. Despite the improvements brought by tools like the Horse Grimace Scale (HGS) in pain recognition in horses, their reliance on human identification of characteristic traits presents drawbacks such as subjectivity, training requirements, costs, and potential bias. Despite these challenges, the development of facial expression pain scales for animals has been making strides. To address these limitations, Automated Pain Recognition (APR) powered by Artificial Intelligence (AI) offers a promising advancement. Notably, computer vision and machine learning have revolutionized our approach to identifying and addressing pain in non-verbal patients, including animals, with profound implications for both veterinary medicine and animal welfare. By leveraging the capabilities of AI algorithms, we can construct sophisticated models capable of analyzing diverse data inputs, encompassing not only facial expressions but also body language, vocalizations, and physiological signals, to provide precise and objective evaluations of an animal's pain levels. While the advancement of APR holds great promise for improving animal welfare by enabling better pain management, it also brings forth the need to overcome data limitations, ensure ethical practices, and develop robust ground truth measures. This narrative review aimed to provide a comprehensive overview, tracing the journey from the initial application of facial expression recognition for the development of pain scales in animals to the recent application, evolution, and limitations of APR, thereby contributing to understanding this rapidly evolving field.
Affiliation(s)
- Ludovica Chiavaccini: Department of Comparative, Diagnostic, and Population Medicine, College of Veterinary Medicine, University of Florida, Gainesville, FL, United States
- Anjali Gupta: Department of Comparative, Diagnostic, and Population Medicine, College of Veterinary Medicine, University of Florida, Gainesville, FL, United States

11. Feighelstein M, Riccie-Bonot C, Hasan H, Weinberg H, Rettig T, Segal M, Distelfeld T, Shimshoni I, Mills DS, Zamansky A. Automated recognition of emotional states of horses from facial expressions. PLoS One 2024; 19:e0302893. PMID: 39008504; PMCID: PMC11249218; DOI: 10.1371/journal.pone.0302893.
Abstract
Animal affective computing is an emerging new field, which has so far mainly focused on pain, while other emotional states remain uncharted territories, especially in horses. This study is the first to develop AI models to automatically recognize horse emotional states from facial expressions using data collected in a controlled experiment. We explore two types of pipelines: a deep learning one which takes as input video footage, and a machine learning one which takes as input EquiFACS annotations. The former outperforms the latter, with 76% accuracy in separating between four emotional states: baseline, positive anticipation, disappointment and frustration. Anticipation and frustration were difficult to separate, with only 61% accuracy.
Affiliation(s)
- Hana Hasan: Information Systems Department, University of Haifa, Haifa, Israel
- Hallel Weinberg: Information Systems Department, University of Haifa, Haifa, Israel
- Tidhar Rettig: Information Systems Department, University of Haifa, Haifa, Israel
- Maya Segal: Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Tomer Distelfeld: Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Ilan Shimshoni: Information Systems Department, University of Haifa, Haifa, Israel
- Daniel S. Mills: Department of Life Sciences, Joseph Banks Laboratories, University of Lincoln, Lincoln, United Kingdom
- Anna Zamansky: Information Systems Department, University of Haifa, Haifa, Israel

12. Onuma K, Watanabe M, Sasaki N. The grimace scale: a useful tool for assessing pain in laboratory animals. Exp Anim 2024; 73:234-245. PMID: 38382945; PMCID: PMC11254488; DOI: 10.1538/expanim.24-0010.
Abstract
Accurately and promptly assessing pain in experimental animals is extremely important to avoid unnecessary suffering of the animals and to enhance the reproducibility of experiments. This is a key concern for veterinarians, animal caretakers, and researchers from the perspectives of veterinary care and animal welfare. Various methods including ethology, immunohistochemistry, electrophysiology, and molecular biology are used for pain assessment. However, the grimace scale, which was developed by taking cues from interpreting pain through facial expressions of non-verbal infants, has become recognized as a very simple and practical method for objectively evaluating pain levels by scoring changes in an animal's expressions. This method, which was first implemented with mice approximately 10 years ago, is now being applied to various experimental animals and is widely used in research settings. This review focuses on the usability of the grimace scale from the "cage-side" perspective, aiming to make it a more user-friendly tool for those involved in animal experiments. Differences in facial expressions in response to pain in various animals, examples of applying the grimace scale, current automated analytical methods, and future prospects are discussed.
Affiliation(s)
- Kenta Onuma: Laboratory of Laboratory Animal Science and Medicine, School of Veterinary Medicine, Kitasato University, 35-1 Higashi-23, Towada, Aomori 034-0021, Japan
- Masaki Watanabe: Laboratory of Laboratory Animal Science and Medicine, School of Veterinary Medicine, Kitasato University, 35-1 Higashi-23, Towada, Aomori 034-0021, Japan
- Nobuya Sasaki: Laboratory of Laboratory Animal Science and Medicine, School of Veterinary Medicine, Kitasato University, 35-1 Higashi-23, Towada, Aomori 034-0021, Japan

13. Gris VN, Crespo TR, Kaneko A, Okamoto M, Suzuki J, Teramae JN, Miyabe-Nishiwaki T. Deep Learning for Face Detection and Pain Assessment in Japanese macaques (Macaca fuscata). J Am Assoc Lab Anim Sci 2024; 63:403-411. PMID: 38428929; PMCID: PMC11270042; DOI: 10.30802/aalas-jaalas-23-000056.
Abstract
Facial expressions have increasingly been used to assess emotional states in mammals. The recognition of pain in research animals is essential for their well-being and leads to more reliable research outcomes. Automating this process could contribute to early pain diagnosis and treatment. Artificial neural networks have become a popular option for image classification tasks in recent years due to the development of deep learning. In this study, we investigated the ability of a deep learning model to detect pain in Japanese macaques based on their facial expression. Thirty to 60 min of video footage from Japanese macaques undergoing laparotomy was used in the study. Macaques were recorded undisturbed in their cages before surgery (No Pain) and one day after the surgery before scheduled analgesia (Pain). Videos were processed for facial detection and image extraction with the algorithms RetinaFace (adding a bounding box around the face for image extraction) or Mask R-CNN (contouring the face for extraction). ResNet50 used 75% of the images to train systems; the other 25% were used for testing. Test accuracy varied from 48 to 54% after box extraction. The low accuracy of classification after box extraction was likely due to the incorporation of features that were not relevant for pain (for example, background, illumination, skin color, or objects in the enclosure). However, using contour extraction, preprocessing the images, and fine-tuning, the network resulted in 64% appropriate generalization. These results suggest that Mask R-CNN can be used for facial feature extractions and that the performance of the classifying model is relatively accurate for nonannotated single-frame images.
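The face-extraction step described above (bounding box versus contour) can be illustrated with torchvision's generic COCO-pretrained Mask R-CNN. This is only a sketch of the cropping mechanics with an assumed file path; the study used detectors set up for macaque faces (RetinaFace and Mask R-CNN), not the off-the-shelf weights shown here.

# Hedged sketch of detection-based cropping before classification. torchvision's
# COCO-pretrained Mask R-CNN is used only to show the mechanics of box vs. contour
# extraction; the study's detectors were prepared for macaque faces.
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn, MaskRCNN_ResNet50_FPN_Weights
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

weights = MaskRCNN_ResNet50_FPN_Weights.DEFAULT
detector = maskrcnn_resnet50_fpn(weights=weights).eval()

img = convert_image_dtype(read_image("frame_0001.jpg"), torch.float)  # illustrative path
with torch.no_grad():
    pred = detector([img])[0]            # dict with 'boxes', 'scores', 'masks', ...

keep = pred["scores"] > 0.8              # keep confident detections only
if keep.any():
    x1, y1, x2, y2 = pred["boxes"][keep][0].round().int().tolist()
    face_crop = img[:, y1:y2, x1:x2]     # box-based extraction ("bounding box" variant)
    mask = pred["masks"][keep][0, 0] > 0.5
    contoured = img * mask               # contour-based extraction (background zeroed)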
Affiliation(s)
- Vanessa N Gris: Primate Research Institute and Center for the Evolutionary Origins of Human Behavior, Kyoto University, Inuyama, Japan
- Thomás R Crespo: Department of Advanced Mathematical Sciences, Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Akihisa Kaneko: Primate Research Institute and Center for the Evolutionary Origins of Human Behavior, Kyoto University, Inuyama, Japan
- Munehiro Okamoto: Primate Research Institute and Center for the Evolutionary Origins of Human Behavior, Kyoto University, Inuyama, Japan
- Jun-nosuke Teramae: Department of Advanced Mathematical Sciences, Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Takako Miyabe-Nishiwaki: Primate Research Institute and Center for the Evolutionary Origins of Human Behavior, Kyoto University, Inuyama, Japan

14. Arnold B, Ramakrishnan R, Wright A, Wilson K, VandeVord PJ. An automated rat grimace scale for the assessment of pain. Sci Rep 2023; 13:18859. PMID: 37914795; PMCID: PMC10620195; DOI: 10.1038/s41598-023-46123-x.
Abstract
Pain is a complex neuro-psychosocial experience that is internal and private, making it difficult to assess in both humans and animals. In pain research, animal models are prominently used, with rats among the most commonly studied. The rat grimace scale (RGS) measures four facial action units to quantify the pain behaviors of rats. However, manual recording of RGS scores is a time-consuming process that requires training. While computer vision models have been developed and utilized for various grimace scales, there are currently no models for RGS. To address this gap, this study worked to develop an automated RGS system which can detect facial action units in rat images and predict RGS scores. The automated system achieved an action unit detection precision and recall of 97%. Furthermore, the action unit RGS classifiers achieved a weighted accuracy of 81-93%. The system's performance was evaluated using a blast traumatic brain injury study, where it was compared to trained human graders. The results showed an intraclass correlation coefficient of 0.82 for the total RGS score, indicating that the system was comparable to human graders. The automated tool could enhance pain research by providing a standardized and efficient method for the assessment of RGS.
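Agreement between the automated total RGS score and trained graders is reported as an intraclass correlation coefficient; one common way to compute an ICC for such paired ratings is sketched below, assuming the pingouin package, a long-format table, and synthetic scores rather than the authors' analysis script.

# Hedged sketch: ICC between automated and human total RGS scores,
# using the pingouin package on a long-format table (synthetic values).
import pandas as pd
import pingouin as pg

data = pd.DataFrame({
    "image": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "rater": ["auto", "human"] * 5,
    "score": [4, 5, 2, 2, 7, 6, 1, 1, 5, 5],   # total RGS per image (made up)
})

icc = pg.intraclass_corr(data=data, targets="image", raters="rater", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])           # ICC2 corresponds to two-way random, absolute agreement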
Affiliation(s)
- Brendan Arnold: School of Biomedical Engineering and Sciences, Virginia Tech, Blacksburg, VA, USA
- Amirah Wright: School of Biomedical Engineering and Sciences, Virginia Tech, Blacksburg, VA, USA
- Kelsey Wilson: School of Biomedical Engineering and Sciences, Virginia Tech, Blacksburg, VA, USA
- Pamela J VandeVord: School of Biomedical Engineering and Sciences, Virginia Tech, Blacksburg, VA, USA; Veterans Affairs Medical Center, Salem, VA, USA; Department of Biomedical Engineering and Mechanics, Virginia Tech, 440 Kelly Hall, 325 Stanger St., Blacksburg, VA, 24060, USA

15. Feighelstein M, Ehrlich Y, Naftaly L, Alpin M, Nadir S, Shimshoni I, Pinho RH, Luna SPL, Zamansky A. Deep learning for video-based automated pain recognition in rabbits. Sci Rep 2023; 13:14679. PMID: 37674052; PMCID: PMC10482887; DOI: 10.1038/s41598-023-41774-2.
Abstract
Despite the wide range of uses of rabbits (Oryctolagus cuniculus) as experimental models for pain, as well as their increasing popularity as pets, pain assessment in rabbits is understudied. This study is the first to address automated detection of acute postoperative pain in rabbits. Using a dataset of video footage of n = 28 rabbits before (no pain) and after surgery (pain), we present an AI model for pain recognition using both the facial area and the body posture and reaching accuracy of above 87%. We apply a combination of 1 sec interval sampling with the Grayscale Short-Term stacking (GrayST) to incorporate temporal information for video classification at frame level and a frame selection technique to better exploit the availability of video data.
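Grayscale Short-Term stacking (GrayST) places three consecutive grayscale frames into the three colour channels of a single CNN input so that short-range motion becomes visible to a frame-level model. A minimal sketch of that stacking step follows; the video path, sampling interval, and OpenCV usage are illustrative assumptions, not the authors' preprocessing code.

# Hedged sketch of Grayscale Short-Term stacking (GrayST): three consecutive
# grayscale frames become the R, G and B channels of one CNN input image.
import cv2
import numpy as np

def grayst_stack(video_path: str, start_frame: int, step: int = 1) -> np.ndarray:
    """Return an HxWx3 uint8 image whose channels are frames t, t+step, t+2*step."""
    cap = cv2.VideoCapture(video_path)
    channels = []
    for i in range(3):
        cap.set(cv2.CAP_PROP_POS_FRAMES, start_frame + i * step)
        ok, frame = cap.read()
        if not ok:
            raise ValueError("could not read frame")
        channels.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    cap.release()
    return np.stack(channels, axis=-1)    # temporal information lives in the channel axis

# Example (hypothetical file): frames ~1 s apart at 30 fps
# stacked = grayst_stack("rabbit_clip.mp4", start_frame=0, step=30)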
Affiliation(s)
- Yamit Ehrlich: Information Systems Department, University of Haifa, Haifa, Israel
- Li Naftaly: Information Systems Department, University of Haifa, Haifa, Israel
- Miriam Alpin: Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Shenhav Nadir: Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Ilan Shimshoni: Information Systems Department, University of Haifa, Haifa, Israel
- Renata H Pinho: Faculty of Veterinary Medicine, University of Calgary, Calgary, Canada
- Stelio P L Luna: School of Veterinary Medicine and Animal Science, São Paulo State University (UNESP), São Paulo, Brazil
- Anna Zamansky: Information Systems Department, University of Haifa, Haifa, Israel

16. Alexeenko V, Jeevaratnam K. Artificial intelligence: Is it wizardry, witchcraft, or a helping hand for an equine veterinarian? Equine Vet J 2023; 55:719-722. PMID: 37551620; DOI: 10.1111/evj.13969.
Affiliation(s)
- Vadim Alexeenko: School of Veterinary Medicine, University of Surrey, Surrey, UK

17. Feighelstein M, Henze L, Meller S, Shimshoni I, Hermoni B, Berko M, Twele F, Schütter A, Dorn N, Kästner S, Finka L, Luna SPL, Mills DS, Volk HA, Zamansky A. Explainable automated pain recognition in cats. Sci Rep 2023; 13:8973. PMID: 37268666; DOI: 10.1038/s41598-023-35846-6.
Abstract
Manual tools for pain assessment from facial expressions have been suggested and validated for several animal species. However, facial expression analysis performed by humans is prone to subjectivity and bias, and in many cases also requires special expertise and training. This has led to an increasing body of work on automated pain recognition, which has been addressed for several species, including cats. Even for experts, cats are a notoriously challenging species for pain assessment. A previous study compared two approaches to automated 'pain'/'no pain' classification from cat facial images: a deep learning approach, and an approach based on manually annotated geometric landmarks, reaching comparable accuracy results. However, the study included a very homogeneous dataset of cats and thus further research to study generalizability of pain recognition to more realistic settings is required. This study addresses the question of whether AI models can classify 'pain'/'no pain' in cats in a more realistic (multi-breed, multi-sex) setting using a more heterogeneous and thus potentially 'noisy' dataset of 84 client-owned cats. Cats were a convenience sample presented to the Department of Small Animal Medicine and Surgery of the University of Veterinary Medicine Hannover and included individuals of different breeds, ages, sex, and with varying medical conditions/medical histories. Cats were scored by veterinary experts using the Glasgow composite measure pain scale in combination with the well-documented and comprehensive clinical history of those patients; the scoring was then used for training AI models using two different approaches. We show that in this context the landmark-based approach performs better, reaching accuracy above 77% in pain detection as opposed to only above 65% reached by the deep learning approach. Furthermore, we investigated the explainability of such machine recognition in terms of identifying facial features that are important for the machine, revealing that the region of nose and mouth seems more important for machine pain classification, while the region of ears is less important, with these findings being consistent across the models and techniques studied here.
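The landmark-based approach referred to above turns annotated facial landmarks into geometric features for a conventional classifier. The sketch below illustrates that general idea only: the synthetic landmarks, the simple pairwise-distance features, and the random forest are assumptions chosen for illustration, not the study's actual features or model.

# Hedged sketch of a landmark-based 'pain'/'no pain' classifier: facial landmarks
# are turned into pairwise-distance features and fed to a standard ML model.
import numpy as np
from itertools import combinations
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_images, n_landmarks = 300, 48                            # e.g. 48 facial landmarks per cat face
landmarks = rng.normal(size=(n_images, n_landmarks, 2))    # (x, y) per landmark, synthetic
labels = rng.integers(0, 2, n_images)                      # 1 = pain, 0 = no pain (synthetic)

def pairwise_distance_features(pts):
    """All inter-landmark distances: a simple geometric descriptor of face shape."""
    pairs = combinations(range(pts.shape[0]), 2)
    return np.array([np.linalg.norm(pts[i] - pts[j]) for i, j in pairs])

X = np.array([pairwise_distance_features(p) for p in landmarks])
clf = RandomForestClassifier(n_estimators=300, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())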
Affiliation(s)
- Lea Henze: Department of Small Animal Medicine and Surgery, University of Veterinary Medicine Hannover, Hannover, Germany
- Sebastian Meller: Department of Small Animal Medicine and Surgery, University of Veterinary Medicine Hannover, Hannover, Germany
- Ilan Shimshoni: Information Systems Department, University of Haifa, Haifa, Israel
- Ben Hermoni: Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Michael Berko: Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Friederike Twele: Department of Small Animal Medicine and Surgery, University of Veterinary Medicine Hannover, Hannover, Germany
- Alexandra Schütter: Department of Small Animal Medicine and Surgery, University of Veterinary Medicine Hannover, Hannover, Germany
- Nora Dorn: Department of Small Animal Medicine and Surgery, University of Veterinary Medicine Hannover, Hannover, Germany
- Sabine Kästner: Department of Small Animal Medicine and Surgery, University of Veterinary Medicine Hannover, Hannover, Germany
- Lauren Finka: Cats Protection, National Cat Centre, Chelwood Gate, Sussex, UK
- Stelio P L Luna: School of Veterinary Medicine and Animal Science, São Paulo State University (Unesp), São Paulo, Brazil
- Daniel S Mills: School of Life Sciences, Joseph Bank Laboratories, University of Lincoln, Lincoln, UK
- Holger A Volk: Department of Small Animal Medicine and Surgery, University of Veterinary Medicine Hannover, Hannover, Germany
- Anna Zamansky: Information Systems Department, University of Haifa, Haifa, Israel

18. Explainable automated recognition of emotional states from canine facial expressions: the case of positive anticipation and frustration. Sci Rep 2022; 12:22611. PMID: 36585439; PMCID: PMC9803655; DOI: 10.1038/s41598-022-27079-w.
Abstract
In animal research, automation of affective states recognition has so far mainly addressed pain in a few species. Emotional states remain uncharted territories, especially in dogs, due to the complexity of their facial morphology and expressions. This study contributes to fill this gap in two aspects. First, it is the first to address dog emotional states using a dataset obtained in a controlled experimental setting, including videos from (n = 29) Labrador Retrievers assumed to be in two experimentally induced emotional states: negative (frustration) and positive (anticipation). The dogs' facial expressions were measured using the Dogs Facial Action Coding System (DogFACS). Two different approaches are compared in relation to our aim: (1) a DogFACS-based approach with a two-step pipeline consisting of (i) a DogFACS variable detector and (ii) a positive/negative state Decision Tree classifier; (2) An approach using deep learning techniques with no intermediate representation. The approaches reach accuracy of above 71% and 89%, respectively, with the deep learning approach performing better. Secondly, this study is also the first to study explainability of AI models in the context of emotion in animals. The DogFACS-based approach provides decision trees, that is a mathematical representation which reflects previous findings by human experts in relation to certain facial expressions (DogFACS variables) being correlates of specific emotional states. The deep learning approach offers a different, visual form of explainability in the form of heatmaps reflecting regions of focus of the network's attention, which in some cases show focus clearly related to the nature of particular DogFACS variables. These heatmaps may hold the key to novel insights on the sensitivity of the network to nuanced pixel patterns reflecting information invisible to the human eye.
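The second step of the DogFACS-based pipeline, classifying positive versus negative state from detected DogFACS variables with a decision tree, can be illustrated as below. The variable names and the tiny table are fabricated examples, not the study's detector output or its learned tree.

# Hedged sketch: decision-tree classification of emotional state from binary
# DogFACS-style variables. Variable names and values are illustrative only.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

df = pd.DataFrame({
    "AD19_tongue_show":        [1, 0, 0, 1, 0, 1, 0, 0],
    "AU101_inner_brow_raiser": [0, 1, 1, 0, 1, 0, 0, 1],
    "EAD102_ears_adductor":    [1, 0, 0, 1, 0, 1, 1, 0],
    "state": ["positive", "negative", "negative", "positive",
              "negative", "positive", "positive", "negative"],
})

X, y = df.drop(columns="state"), df["state"]
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))   # human-readable rules, as in the paper's trees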
19. Fischer-Tenhagen C, Meier J, Pohl A. "Do not look at me like that": Is the facial expression score reliable and accurate to evaluate pain in large domestic animals? A systematic review. Front Vet Sci 2022; 9:1002681. PMID: 36561394; PMCID: PMC9763617; DOI: 10.3389/fvets.2022.1002681.
Abstract
Introduction Facial expression scoring has proven to be useful for pain evaluation in humans. In the last decade, equivalent scales have been developed for various animal species, including large domestic animals. The research question of this systematic review was as follows: is facial expression scoring (intervention) a valid method to evaluate pain (the outcome) in large domestic animals (population)? Method We searched two databases for relevant articles using the search string: "grimace scale" OR "facial expression" AND animal OR "farm animal" NOT "mouse" NOT "rat" NOT "laboratory animal." The risk of bias was estimated by adapting the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) checklist. Results The search strategy extracted 30 articles, with the major share on equids and a considerable number on cows, pigs, and sheep. Most studies evaluated facial action units (FAUs), including the eye region, the orbital region, the cheek or the chewing muscles, the lips, the mouth, and the position of the ears. Interobserver reliability was tested in 21 studies. Overall FAU reliability was substantial, but there were differences for individual FAUs. The position of the ear had almost perfect interobserver reliability (intraclass correlation coefficient (ICC): 0.73-0.97). Validity was tested in five studies with the reported accuracy values ranging from 68.2 to 80.0%. Discussion This systematic review revealed that facial expression scores provide an easy method for learning and reliable test results to identify whether an animal is in pain or distress. Many studies lack a reference standard and a true control group. Further research is warranted to evaluate the test accuracy of facial expression scoring as a live pen-side test.
Affiliation(s)
- Carola Fischer-Tenhagen: German Centre for the Protection of Laboratory Animals (Bf3R), German Federal Institute for Risk Assessment (BfR), Berlin, Germany
- Jennifer Meier: German Centre for the Protection of Laboratory Animals (Bf3R), German Federal Institute for Risk Assessment (BfR), Berlin, Germany
- Alina Pohl: Clinic of Animal Reproduction, Freie Universität Berlin, Berlin, Germany

20. Going Deeper than Tracking: A Survey of Computer-Vision Based Recognition of Animal Pain and Emotions. Int J Comput Vis 2022. DOI: 10.1007/s11263-022-01716-3.
Abstract
Advances in animal motion tracking and pose recognition have been a game changer in the study of animal behavior. Recently, an increasing number of works go 'deeper' than tracking, and address automated recognition of animals' internal states such as emotions and pain with the aim of improving animal welfare, making this a timely moment for a systematization of the field. This paper provides a comprehensive survey of computer vision-based research on recognition of pain and emotional states in animals, addressing both facial and bodily behavior analysis. We summarize the efforts that have been presented so far within this topic, classifying them across different dimensions, highlight challenges and research gaps, and provide best practice recommendations for advancing the field, as well as some future directions for research.
21. Gris VN, Broche N, Kaneko A, Okamoto M, Suzuki J, Mills DS, Miyabe-Nishiwaki T. Investigating subtle changes in facial expression to assess acute pain in Japanese macaques. Sci Rep 2022; 12:19675. PMID: 36385151; PMCID: PMC9669003; DOI: 10.1038/s41598-022-23595-x.
Abstract
Changes in facial expression provide cues for assessing emotional states in mammals and may provide non-verbal signals of pain. This study uses geometric morphometrics (GMM) to explore the facial shape variation in female Japanese macaques who underwent experimental laparotomy. Face image samples were collected from video footage of fourteen macaques before surgery and 1, 3, and 7 days after the procedure. Image samples in the pre-surgical condition were considered pain-free, and facial expressions emerging after surgery were investigated as potential indicators of pain. Landmarks for shape analysis were selected based on the underlying facial musculature and their corresponding facial action units and then annotated in 324 pre-surgical and 750 post-surgical images. The expression of pain is likely to vary between individuals. Tightly closed eyelids or squeezed eyes and lip tension were the most commonly observed facial changes on day 1 after surgery (p < 0.01974). A good overall inter-rater reliability [ICC = 0.99 (95% CI 0.75-1.0)] was observed with the method. The study emphasizes the importance of individualized assessment and provides a better understanding of facial cues to pain for captive macaque care.
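Geometric morphometrics (GMM) typically removes position, scale, and rotation from landmark configurations via Procrustes superimposition before shape variation is analysed. The snippet below illustrates that alignment step with SciPy on synthetic landmark sets, not the macaque data or the authors' GMM workflow.

# Hedged sketch: Procrustes superimposition of two facial landmark configurations,
# the preprocessing step behind geometric morphometric (GMM) shape comparisons.
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(1)
pre_surgery = rng.normal(size=(30, 2))                    # 30 (x, y) landmarks, synthetic
# A "post-surgery" configuration: same shape, but translated, scaled and slightly deformed.
post_surgery = 1.4 * pre_surgery + np.array([5.0, -2.0]) + rng.normal(0, 0.05, (30, 2))

aligned_pre, aligned_post, disparity = procrustes(pre_surgery, post_surgery)
print("Procrustes disparity (residual shape difference):", round(disparity, 4))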
Affiliation(s)
- Vanessa N Gris: Primate Research Institute, Kyoto University, Inuyama, Japan; Center for the Evolutionary Origins of Human Behavior, Kyoto University, 41-2 Kanrin, Inuyama, Aichi, 484-8506, Japan
- Nelson Broche: Primate Research Institute, Kyoto University, Inuyama, Japan; Center for the Evolutionary Origins of Human Behavior, Kyoto University, 41-2 Kanrin, Inuyama, Aichi, 484-8506, Japan
- Akihisa Kaneko: Primate Research Institute, Kyoto University, Inuyama, Japan; Center for the Evolutionary Origins of Human Behavior, Kyoto University, 41-2 Kanrin, Inuyama, Aichi, 484-8506, Japan
- Munehiro Okamoto: Primate Research Institute, Kyoto University, Inuyama, Japan; Center for the Evolutionary Origins of Human Behavior, Kyoto University, 41-2 Kanrin, Inuyama, Aichi, 484-8506, Japan
- Juri Suzuki: Primate Research Institute, Kyoto University, Inuyama, Japan
- Daniel S Mills: Department of Life Sciences, University of Lincoln, Lincoln, UK
- Takako Miyabe-Nishiwaki: Primate Research Institute, Kyoto University, Inuyama, Japan; Center for the Evolutionary Origins of Human Behavior, Kyoto University, 41-2 Kanrin, Inuyama, Aichi, 484-8506, Japan

22. Aulehner K, Leenaars C, Buchecker V, Stirling H, Schönhoff K, King H, Häger C, Koska I, Jirkof P, Bleich A, Bankstahl M, Potschka H. Grimace scale, burrowing, and nest building for the assessment of post-surgical pain in mice and rats-A systematic review. Front Vet Sci 2022; 9:930005. PMID: 36277074; PMCID: PMC9583882; DOI: 10.3389/fvets.2022.930005.
Abstract
Several studies suggested an informative value of behavioral and grimace scale parameters for the detection of pain. However, the robustness and reliability of the parameters as well as the current extent of implementation are still largely unknown. In this study, we aimed to systematically analyze the current evidence-base of grimace scale, burrowing, and nest building for the assessment of post-surgical pain in mice and rats. The following platforms were searched for relevant articles: PubMed, Embase via Ovid, and Web of Science. Only full peer-reviewed studies that describe the grimace scale, burrowing, and/or nest building as pain parameters in the post-surgical phase in mice and/or rats were included. Information about the study design, animal characteristics, intervention characteristics, and outcome measures was extracted from identified publications. In total, 74 papers were included in this review. The majority of studies have been conducted in young adult C57BL/6J mice and Sprague Dawley and Wistar rats. While there is an apparent lack of information about young animals, some studies that analyzed the grimace scale in aged rats were identified. The majority of studies focused on laparotomy-associated pain. Only limited information is available about other types of surgical interventions. While an impact of surgery and an influence of analgesia were rather consistently reported in studies focusing on grimace scales, the number of studies that assessed respective effects was rather low for nest building and burrowing. Moreover, controversial findings were evident for the impact of analgesics on post-surgical nest building activity. Regarding analgesia, a monotherapeutic approach was identified in the vast majority of studies with non-steroidal anti-inflammatory (NSAID) drugs and opioids being most commonly used. In conclusion, most evidence exists for grimace scales, which were more frequently used to assess post-surgical pain in rodents than the other behavioral parameters. However, our findings also point to relevant knowledge gaps concerning the post-surgical application in different strains, age levels, and following different surgical procedures. Future efforts are also necessary to directly compare the sensitivity and robustness of different readout parameters applied for the assessment of nest building and burrowing activities.
Affiliation(s)
- Katharina Aulehner, Verena Buchecker, Helen Stirling, Katharina Schönhoff, Hannah King, Ines Koska, and Heidrun Potschka: Institute of Pharmacology, Toxicology and Pharmacy, Ludwig-Maximilians-University, Munich, Germany
- Cathalijn Leenaars, Christine Häger, André Bleich, and Marion Bankstahl: Institute for Laboratory Animal Science, Hannover Medical School, Hanover, Germany
- Paulin Jirkof: Office for Animal Welfare and 3Rs, University of Zurich, Zurich, Switzerland

23. Feighelstein M, Shimshoni I, Finka LR, Luna SPL, Mills DS, Zamansky A. Automated recognition of pain in cats. Sci Rep 2022; 12:9575. PMID: 35688852; PMCID: PMC9187730; DOI: 10.1038/s41598-022-13348-1.
Abstract
Facial expressions in non-human animals are closely linked to their internal affective states, with the majority of empirical work focusing on facial shape changes associated with pain. However, existing tools for facial expression analysis are prone to human subjectivity and bias, and in many cases also require special expertise and training. This paper presents the first comparative study of two different paths towards automatizing pain recognition in facial images of domestic short haired cats (n = 29), captured during ovariohysterectomy at different time points corresponding to varying intensities of pain. One approach is based on convolutional neural networks (ResNet50), while the other-on machine learning models based on geometric landmarks analysis inspired by species specific Facial Action Coding Systems (i.e. catFACS). Both types of approaches reach comparable accuracy of above 72%, indicating their potential usefulness as a basis for automating cat pain detection from images.
Affiliation(s)
- Ilan Shimshoni: Information Systems Department, University of Haifa, Haifa, Israel
- Lauren R Finka: School of Veterinary Medicine and Science, The University of Nottingham, Nottingham, UK
- Stelio P L Luna: Department of Veterinary Surgery and Animal Reproduction, School of Veterinary Medicine and Animal Science, São Paulo State University (Unesp), Botucatu, São Paulo, Brazil
- Daniel S Mills: School of Life Sciences, Joseph Bank Laboratories, University of Lincoln, Lincoln, UK
- Anna Zamansky: Information Systems Department, University of Haifa, Haifa, Israel

24. Sharing pain: Using pain domain transfer for video recognition of low grade orthopedic pain in horses. PLoS One 2022; 17:e0263854. PMID: 35245288; PMCID: PMC8896717; DOI: 10.1371/journal.pone.0263854.
Abstract
Orthopedic disorders are common among horses, often leading to euthanasia, which often could have been avoided with earlier detection. These conditions often create varying degrees of subtle long-term pain. It is challenging to train a visual pain recognition method with video data depicting such pain, since the resulting pain behavior also is subtle, sparsely appearing, and varying, making it challenging for even an expert human labeller to provide accurate ground-truth for the data. We show that a model trained solely on a dataset of horses with acute experimental pain (where labeling is less ambiguous) can aid recognition of the more subtle displays of orthopedic pain. Moreover, we present a human expert baseline for the problem, as well as an extensive empirical study of various domain transfer methods and of what is detected by the pain recognition method trained on clean experimental pain in the orthopedic dataset. Finally, this is accompanied with a discussion around the challenges posed by real-world animal behavior datasets and how best practices can be established for similar fine-grained action recognition tasks. Our code is available at https://github.com/sofiabroome/painface-recognition.