1. Florkiewicz BN, Lazebnik T. Combinatorics and complexity of chimpanzee (Pan troglodytes) facial signals. Anim Cogn 2025; 28:34. PMID: 40304773; PMCID: PMC12043769; DOI: 10.1007/s10071-025-01955-0.
Abstract
There have been shifts toward more systematic and standardized methods for studying non-human primate facial signals, thanks to advancements like animalFACS. Additionally, there have been calls to better integrate the study of both facial and gestural communication in terms of theory and methodology. However, few studies have taken this important integrative step. By doing so, researchers could gain greater insight into how the physical flexibility of facial signals affects social flexibility. Our study combines both approaches to examine the relationship between the flexibility of physical form and the social function of chimpanzee facial "gestures". We used chimpFACS along with established gestural ethograms that provide insights into four key gesture properties and their associated variables documented in chimpanzee gestures. We specifically investigated how the combinatorics (i.e., the different combinations of facial muscle movements) and complexity (measured by the number of discrete facial muscle movements) of chimpanzee facial signals varied based on: (1) how many gesture variables they exhibit; (2) the presence of a specific goal; and (3) the context in which they were produced. Our findings indicate that facial signals produced with vocalizations exhibit fewer gesture variables, rarely align with specific goals, and exhibit reduced contextual flexibility. Furthermore, facial signals that include additional visual movements (such as those of the head) and other visual signals (like manual gestures) exhibit more gestural variables, are frequently aligned with specific goals, and exhibit greater contextual flexibility. Finally, we discovered that facial signals become more morphologically complex when they exhibit a greater number of gesture variables. Our findings indicate that facial "gesturing" significantly enhanced the facial signaling repertoire of chimpanzees, offering insights into the evolution of complex communication systems like human language.
Affiliation(s)
- Teddy Lazebnik
- Department of Mathematics, Ariel University, Ariel, Israel.
- Department of Cancer Biology, Cancer Institute, University College London, London, UK.
2. Mota-Rojas D, Whittaker AL, Bienboire-Frosini C, Buenhombre J, Mora-Medina P, Domínguez-Oliva A, Martínez-Burnes J, Hernández-Avalos I, Olmos-Hernández A, Verduzco-Mendoza A, Casas-Alvarado A, Lezama-García K, Grandin T. The neurobiological basis of emotions and their connection to facial expressions in non-human mammals: insights in nonverbal communication. Front Vet Sci 2025; 12:1541615. PMID: 40125317; PMCID: PMC11926555; DOI: 10.3389/fvets.2025.1541615.
Abstract
Recognizing that nonhuman animals are sentient beings has increased interest in studying their emotional states. As in humans, research has shown that some nonhuman mammals can modify their facial expressions through the contraction and relaxation of facial muscles according to their affective state. From a neurophysiological perspective, emotions are processed in several brain structures, mainly within the limbic system, such as the hypothalamus, hypophysis, hippocampus, prefrontal cortex, and amygdala. Converging pathways between the amygdala and the motor cortex, together with their projections to the facial nerve, control the movement of the facial, or mimetic, muscles. Thus, facial expression is suggested to reflect the internal emotional state and could serve as an essential mode of nonverbal communication in mammals. In humans, the Facial Action Coding System (FACS) is a method that objectively analyzes facial movements on an anatomical basis. In veterinary medicine, AnimalFACS is an adaptation of this system to eight animal species, including domestic animals (dogs, cats, and horses) and nonhuman primates (chimpanzees, orangutans, gibbons, macaques, and common marmosets). Building on these coded facial movements, current research aims to associate certain facial expressions with animals' emotional states and affective contexts. This review therefore discusses recent findings on the neurobiology of emotions and facial expressions in non-human mammals, using AnimalFACS to understand nonverbal communication. Characterizing each facial expression according to context might help identify whether an animal is expressing a positive or negative emotional response to an event, which can improve nonverbal human-animal communication.
Affiliation(s)
- Daniel Mota-Rojas
- Neurophysiology, Behavior and Animal Welfare Assessment, DPAA, Universidad Autónoma Metropolitana (UAM), Mexico City, Mexico
- Alexandra L. Whittaker
- School of Animal and Veterinary Sciences, University of Adelaide, Roseworthy Campus, Roseworthy, SA, Australia
- Jhon Buenhombre
- Faculty of Agricultural Sciences, Animal Welfare and Ethology Specialization, Fundación Universitaria Agraria de Colombia – UNIAGRARIA, Bogotá, Colombia
- Patricia Mora-Medina
- Facultad de Estudios Superiores Cuautitlán, Universidad Nacional Autónoma de México (UNAM), Cuautitlán, Mexico
- Adriana Domínguez-Oliva
- Neurophysiology, Behavior and Animal Welfare Assessment, DPAA, Universidad Autónoma Metropolitana (UAM), Mexico City, Mexico
- Julio Martínez-Burnes
- Instituto de Ecología Aplicada, Facultad de Medicina Veterinaria y Zootecnia, Universidad Autónoma de Tamaulipas, Victoria, Mexico
- Ismael Hernández-Avalos
- Facultad de Estudios Superiores Cuautitlán, Universidad Nacional Autónoma de México (UNAM), Cuautitlán, Mexico
- Adriana Olmos-Hernández
- Division of Biotechnology-Bioterio and Experimental Surgery, Instituto Nacional de Rehabilitación Luis Guillermo Ibarra Ibarra (INR-LGII), Mexico City, Mexico
- Antonio Verduzco-Mendoza
- Division of Biotechnology-Bioterio and Experimental Surgery, Instituto Nacional de Rehabilitación Luis Guillermo Ibarra Ibarra (INR-LGII), Mexico City, Mexico
- Alejandro Casas-Alvarado
- Neurophysiology, Behavior and Animal Welfare Assessment, DPAA, Universidad Autónoma Metropolitana (UAM), Mexico City, Mexico
- Karina Lezama-García
- Neurophysiology, Behavior and Animal Welfare Assessment, DPAA, Universidad Autónoma Metropolitana (UAM), Mexico City, Mexico
- Temple Grandin
- Department of Animal Sciences, Colorado State University, Fort Collins, CO, United States
3. Correia-Caeiro C, Costa R, Hayashi M, Burrows A, Pater J, Miyabe-Nishiwaki T, Richardson JL, Robbins MM, Waller B, Liebal K. GorillaFACS: The Facial Action Coding System for the Gorilla spp. PLoS One 2025; 20:e0308790. PMID: 39874277; PMCID: PMC11774405; DOI: 10.1371/journal.pone.0308790.
Abstract
The Facial Action Coding System (FACS) is an objective observation tool for measuring human facial behaviour. It avoids subjective attributions of meaning by objectively measuring independent movements linked to facial muscles, called Action Units (AUs). FACS has been adapted to 11 other taxa, including most apes, macaques, and domestic animals, but not yet gorillas. To carry out cross-species studies of facial expressions within and beyond the apes, gorillas need to be included. Hence, we developed GorillaFACS for the Gorilla spp. We followed a methodology similar to previous FACS adaptations: first, we examined the facial muscular plan of the gorilla; second, we analysed gorilla videos in a wide variety of contexts to identify their spontaneous facial movements; third, we classified the individual facial movements according to the appearance changes produced by the corresponding underlying musculature. A diverse repertoire of 42 facial movements was identified in the gorilla, including 28 AUs and 14 Action Descriptors, with several new movements not identified in the HumanFACS. Although some movements in gorillas differ from those in humans, the total number of AUs is comparable to that of the HumanFACS (32 AUs). Importantly, the gorilla's range of facial movements was larger than expected, suggesting a more relevant role in social interactions than previously assumed. GorillaFACS is a scientific tool to measure facial movements and will thus allow us to better understand gorilla expressions and communication. Furthermore, GorillaFACS has the potential to be used as an important tool to evaluate this species' welfare, particularly in settings of close proximity to humans.
Affiliation(s)
- Catia Correia-Caeiro
- Human Biology & Primate Cognition Department, Institute of Biology, Leipzig University, Leipzig, Germany
- Comparative Cultural Psychology, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany
- Raquel Costa
- Research Department, Japan Monkey Center, Inuyama, Japan
- Primate Cognition Research Group, Lisbon, Portugal
- Mulheres pela Primatologia, Florianópolis, Brazil
- Misato Hayashi
- Research Department, Japan Monkey Center, Inuyama, Japan
- Anne Burrows
- Department of Physical Therapy, Duquesne University, Pittsburgh, PA, United States of America
- Department of Anthropology, University of Pittsburgh, Pittsburgh, PA, United States of America
- Jordan Pater
- Department of Physical Therapy, Duquesne University, Pittsburgh, PA, United States of America
- Takako Miyabe-Nishiwaki
- Center for the Evolutionary Origins of Human Behavior (EHuB), Kyoto University, Inuyama, Japan
- Jack L. Richardson
- Center for the Advanced Study of Human Paleobiology, Department of Anthropology, The George Washington University, Washington, DC, United States of America
- Martha M. Robbins
- Department of Primate Behavior and Evolution, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany
- Bridget Waller
- Department of Psychology, Nottingham Trent University, Nottingham, United Kingdom
- Katja Liebal
- Human Biology & Primate Cognition Department, Institute of Biology, Leipzig University, Leipzig, Germany
- Comparative Cultural Psychology, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany
4. Mahmoud A, Scott L, Florkiewicz BN. Examining Mammalian facial behavior using Facial Action Coding Systems (FACS) and combinatorics. PLoS One 2025; 20:e0314896. PMID: 39869619; PMCID: PMC11771922; DOI: 10.1371/journal.pone.0314896.
Abstract
There has been increased interest in standardized approaches to coding facial movement in mammals. Such approaches include Facial Action Coding Systems (FACS), in which individuals are trained to identify discrete facial muscle movements that combine to create a facial configuration. Some studies have utilized FACS to analyze facial signaling, recording the quantity of morphologically distinct facial signals a species can generate. However, it is unclear whether these numbers represent the total number of facial muscle movement combinations (which we refer to as facial configurations) that each species is capable of producing. If unobserved combinations of facial muscle movements are communicative in nature, it is crucial to identify them, as this information is important for testing research hypotheses related to the evolution of complex communication among mammals. Our study aimed to assess how well the existing literature represents the potential range of facial signals in two previously studied species: chimpanzees (Pan troglodytes) and domesticated cats (Felis silvestris catus). We adhered to the coding guidelines outlined in the FACS manuals, which are based on the anatomical constraints and capabilities of each mammal's face, to create a comprehensive list of all potential facial configurations. Using this approach, we found that chimpanzees and domesticated cats may be capable of producing thousands of facial configurations, many of which have not yet been documented in the existing research literature. It is plausible that some of these facial configurations are communicative and could be discovered with further research and video recording. Beyond their significant implications for future research on the communicative complexity of mammals, our findings can also assist researchers in evaluating FACS coding accuracy.
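To illustrate the scale of the combinatorics at play, the sketch below counts all non-empty combinations of up to k co-occurring action units drawn from n. This is only an upper-bound illustration under stated assumptions, not the authors' method: it ignores the anatomical co-occurrence constraints the FACS manuals impose, and the AU count and co-occurrence limit used are placeholders rather than values from the study.

```python
from math import comb

def n_configurations(n_aus: int, max_simultaneous: int) -> int:
    """Count the distinct facial configurations formed by any
    non-empty combination of up to `max_simultaneous` of `n_aus`
    independent action units (no anatomical exclusions applied)."""
    return sum(comb(n_aus, k) for k in range(1, max_simultaneous + 1))

# With a placeholder repertoire of 28 AUs and at most 4 moving at
# once, the count already reaches the tens of thousands:
print(n_configurations(28, 4))  # 24157
```

Even this constrained toy count dwarfs the number of facial signals documented in observational studies, which is the gap the abstract highlights.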
Affiliation(s)
- Aisha Mahmoud
- Department of Computer and Data Science, Lyon College, Batesville, Arkansas, United States of America
- Lauren Scott
- School of Medicine, University of Kansas Medical Center, Kansas City, KS, United States of America
5. Coye C, Caspar KR, Patel-Grosz P. Dance displays in gibbons: biological and linguistic perspectives on structured, intentional, and rhythmic body movement. Primates 2025; 66:61-73. PMID: 39365409; PMCID: PMC11735528; DOI: 10.1007/s10329-024-01154-4.
Abstract
Female crested gibbons (genus Nomascus) perform conspicuous sequences of twitching movements involving the rump and extremities. However, these dances have attracted little scientific attention and their structure and meaning remain largely obscure. Here we analyse close-range video recordings of captive crested gibbons, extracting descriptions of dance in four species (N. annamensis, N. gabriellae, N. leucogenys and N. siki). In addition, we report results from a survey amongst relevant professionals clarifying behavioural contexts of dance in captive and wild crested gibbons. Our results demonstrate that dances in Nomascus represent a common and intentional form of visual communication restricted to sexually mature females. Whilst primarily used as a proceptive signal to solicit copulation, dances occur in a wide range of contexts related to arousal and/or frustration in captivity. A linguistically informed view of this sequential behaviour demonstrates that movement within dances is organized in groups and follows an isochronous rhythm - patterns not described for visual displays in other non-human primates. We argue that applying the concept of dance to gibbons allows us to expand our understanding of communication in non-human primates and to develop hypotheses on the rules and regularities characterising it. We propose that crested gibbon dances likely evolved from less elaborate rhythmic proceptive signals, similar to those found in siamangs. Although dance displays in humans and crested gibbons share a number of key characteristics, they cannot be assumed to be homologous. Nevertheless, gibbon dances represent a striking model behaviour to investigate the use of complex gestural signals in hominoid primates.
Affiliation(s)
- Kai R Caspar
- Institute for Cell Biology, Heinrich Heine University, Düsseldorf, Germany.
- Department of Game Management and Wildlife Biology, Faculty of Forestry and Wood Sciences, Czech University of Life Sciences, Prague, Czech Republic.
6. Martvel G, Scott L, Florkiewicz B, Zamansky A, Shimshoni I, Lazebnik T. Computational investigation of the social function of domestic cat facial signals. Sci Rep 2024; 14:27533. PMID: 39528681; PMCID: PMC11554805; DOI: 10.1038/s41598-024-79216-2.
Abstract
There is growing interest in the facial signals of domestic cats. Domestication may have shifted feline social dynamics towards a greater emphasis on facial signals that promote affiliative bonding. Most studies have focused on cat facial signals during human interactions or in response to pain, and research on intraspecific facial communication in cats has predominantly examined non-affiliative social interactions. A recent study by Scott and Florkiewicz demonstrated significant differences between cats' facial signals during affiliative and non-affiliative intraspecific interactions. This follow-up study applies computational approaches to make two main contributions. First, we develop a machine learning classifier for affiliative/non-affiliative interactions based on manual CatFACS codings and automatically detected facial landmarks, reaching accuracies above 77% with CatFACS codings and 68% with landmarks once a temporal dimension is integrated. Second, we introduce novel measures of rapid facial mimicry based on CatFACS coding. Our analysis suggests that domestic cats exhibit more rapid facial mimicry in affiliative contexts than in non-affiliative ones, consistent with the proposed function of mimicry. Moreover, we found that ear movements (such as EAD103 and EAD104) are highly prone to rapid facial mimicry. Our research introduces new possibilities for analyzing cat facial signals and exploring shared moods with innovative AI-based approaches.
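Rapid facial mimicry is commonly operationalized as a receiver reproducing a sender's facial movement within a short time window. The sketch below is a minimal illustration of that general idea, not the measure introduced in the study: the event representation (timestamp, AU code) and the one-second threshold are assumptions for the example.

```python
def rapid_mimicry_events(sender, receiver, window=1.0):
    """Return receiver events that repeat a sender's AU code within
    `window` seconds of the sender producing it.

    sender, receiver: lists of (timestamp_in_seconds, au_code) tuples,
    e.g. (0.2, "EAD103"). The 1.0 s default window is an assumption,
    not a value taken from the study.
    """
    events = []
    for t_r, au_r in receiver:
        # A mimicry event: same AU code, produced strictly after the
        # sender's movement but within the time window.
        if any(au_s == au_r and 0 < t_r - t_s <= window
               for t_s, au_s in sender):
            events.append((t_r, au_r))
    return events

# Example: the receiver echoes EAD103 after 0.7 s (counted) but
# echoes AU12 only after 2 s (not counted).
sender = [(0.2, "EAD103"), (5.0, "AU12")]
receiver = [(0.9, "EAD103"), (7.0, "AU12")]
print(rapid_mimicry_events(sender, receiver))  # [(0.9, 'EAD103')]
```

A per-context rate of such events (affiliative vs. non-affiliative) is the kind of comparison the abstract describes.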
Affiliation(s)
- George Martvel
- Information Systems Department, University of Haifa, Haifa, Israel
- Lauren Scott
- School of Medicine, University of Kansas Medical Center, Kansas, USA
- Anna Zamansky
- Information Systems Department, University of Haifa, Haifa, Israel
- Ilan Shimshoni
- Information Systems Department, University of Haifa, Haifa, Israel
- Teddy Lazebnik
- Department of Mathematics, Ariel University, Ariel, Israel
- Department of Cancer Biology, Cancer Institute, University College London, London, UK
7. Richard JT, Pellegrini I, Levine R. Belugas (Delphinapterus leucas) create facial displays during social interactions by changing the shape of their melons. Anim Cogn 2024; 27:7. PMID: 38429515; PMCID: PMC10907495; DOI: 10.1007/s10071-024-01843-z.
Abstract
Beluga whales are considered unique among odontocetes in their ability to visibly alter the appearance of their head by changing the shape of the melon, but only anecdotal observations are available to evaluate the use or potential function of these melon shapes. This study of belugas in professionally managed care aimed to establish an ethogram of categorizable melon shapes and then to evaluate their potential function as intentional communication signals by determining whether they were produced and elaborated during social interactions of varying behavioral contexts while in the line of sight of a recipient. Five different melon shapes were reliably identified in video observations of the primary study population (n = 4) and externally validated in a second aquarium population (n = 51). Among the 2570 melon shapes observed from the primary study subjects, melon shapes occurred 34 times more frequently during social interactions (1.72 per minute) than outside of social interactions (0.05 per minute). Melon shapes occurring during social interactions were performed within the line of sight of a recipient 93.6% of the time. The frequency of occurrence of the different melon shapes varied across behavioral contexts. Elaboration of melon shapes through extended duration and the occurrence of concurrent open-mouth displays varied by shape type and across behavioral contexts. Melon shapes appear to function as visual displays with some characteristics of intentional communication. This ability could yield adaptive benefits to belugas, given their complex social structure and a hypothesized mating system that emphasizes pre-copulatory female mate choice.
Affiliation(s)
- Justin T Richard
- Department of Fisheries, Animal and Veterinary Science, University of Rhode Island, Kingston, RI, 02881, USA.
- Isabelle Pellegrini
- Department of Fisheries, Animal and Veterinary Science, University of Rhode Island, Kingston, RI, 02881, USA
- Rachael Levine
- Department of Fisheries, Animal and Veterinary Science, University of Rhode Island, Kingston, RI, 02881, USA
8. Scott L, Florkiewicz BN. Feline faces: Unraveling the social function of domestic cat facial signals. Behav Processes 2023; 213:104959. PMID: 37858844; DOI: 10.1016/j.beproc.2023.104959.
Abstract
Lately, there has been a growing interest in studying domestic cat facial signals, but most of this research has centered on signals produced during human-cat interactions or in response to pain. The available research on intraspecific facial signaling in domesticated cats has largely focused on non-affiliative social interactions. However, the transition to intraspecific sociality through domestication could have resulted in a greater reliance on affiliative facial signals that aid social bonding. Our study aimed to document the various facial signals that cats produce during affiliative and non-affiliative intraspecific interactions. Given the close relationship between the physical form and social function of mammalian facial signals, we predicted that affiliative and non-affiliative facial signals would differ noticeably in their physical morphology. We observed the behavior of 53 adult domestic shorthair cats at CatCafé Lounge in Los Angeles, CA. Using Facial Action Coding Systems designed for cats, we compared the complexity and compositionality of facial signals produced in affiliative and non-affiliative contexts. To measure complexity and compositionality, we examined the number and types of facial muscle movements (AUs) observed in each signal. We found that compositionality, rather than complexity, was significantly associated with the social function of intraspecific facial signals. Our findings indicate that domestication likely had a significant impact on the development of intraspecific facial signaling repertoires in cats.
Affiliation(s)
- Lauren Scott
- School of Medicine, University of Kansas Medical Center, KS, USA
9. Cerrito P, DeCasien AR. The expression of care: Alloparental care frequency predicts neural control of facial muscles in primates. Evolution 2021; 75:1727-1737. PMID: 34019303; DOI: 10.1111/evo.14275.
Abstract
The adaptive value of facial expressions has been debated in evolutionary biology ever since Darwin's seminal work. Among mammals, primates, including humans, exhibit the most intricate facial displays. Although previous work has focused on the role of sociality in the evolution of primate facial expressions, this relationship has not been verified in a wide sample of species. Here, we examine the relationship between allomaternal care (paternal or alloparental) and the morphology of three orofacial brainstem nuclei (facial; trigeminal motor; hypoglossal) across primates to test the hypothesis that allomaternal care explains variation in the complexity of facial expressions, proxied by relative facial nucleus size and neuropil fraction. The latter represents the proportion of synaptically dense tissue and may, therefore, correlate with dexterity. We find that alloparental care frequency predicts relative neuropil fraction of the facial nucleus, even after controlling for social system organization, whereas allomaternal care is not associated with the trigeminal motor or hypoglossal nuclei. Overall, this work suggests that alloparenting requires increased facial dexterity to facilitate nonverbal communication between infants and their nonparent caregivers and/or between caregivers. Accordingly, alloparenting and complex facial expressions are likely to have coevolved in primates.
Affiliation(s)
- Paola Cerrito
- Department of Anthropology, New York University, New York, New York, 10003
- New York Consortium in Evolutionary Primatology, New York, New York, 10024
- Department of Molecular Pathobiology, New York University College of Dentistry, New York, New York, 10010
- Alex R DeCasien
- Department of Anthropology, New York University, New York, New York, 10003
- New York Consortium in Evolutionary Primatology, New York, New York, 10024
10. Roth TS, Samara I, Tan J, Prochazkova E, Kret ME. A comparative framework of inter-individual coordination and pair-bonding. Curr Opin Behav Sci 2021. DOI: 10.1016/j.cobeha.2021.03.005.
11. Measuring the evolution of facial ‘expression’ using multi-species FACS. Neurosci Biobehav Rev 2020; 113:1-11. DOI: 10.1016/j.neubiorev.2020.02.031.