1. Pu L, Coppieters MW, Smalbrugge M, Jones C, Byrnes J, Todorovic M, Moyle W. Authors' response to Hughes et al. (2024). J Adv Nurs 2025; 81:2844-2847. [PMID: 38738907; DOI: 10.1111/jan.16226]
Affiliation(s)
- Lihui Pu
  - Department of Internal Medicine, Section Nursing Science, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
  - School of Nursing and Midwifery, Griffith University, Brisbane, Australia
- Michel W Coppieters
  - School of Health Sciences and Social Work, Griffith University, Brisbane, Australia
  - Amsterdam Movement Sciences - Program Musculoskeletal Health, Faculty of Behavioral and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Martin Smalbrugge
  - Department of Medicine for Older People, Amsterdam UMC, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
  - Aging and Later Life, Amsterdam Public Health Research Institute, Amsterdam, The Netherlands
- Cindy Jones
  - Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Queensland, Australia
- Joshua Byrnes
  - Center for Applied Health Economics, School of Medicine and Dentistry, Griffith University, Brisbane, Queensland, Australia
- Michael Todorovic
  - School of Nursing and Midwifery, Griffith University, Brisbane, Australia
  - Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Queensland, Australia
- Wendy Moyle
  - School of Nursing and Midwifery, Griffith University, Brisbane, Australia
2. Tan CW, Du T, Teo JC, Chan DXH, Kong WM, Sng BL. Automated pain detection using facial expression in adult patients with a customized spatial temporal attention long short-term memory (STA-LSTM) network. Sci Rep 2025; 15:13429. [PMID: 40251301; PMCID: PMC12008390; DOI: 10.1038/s41598-025-97885-5]
Abstract
Self-reported pain scores are commonly used for pain assessment but require effective communication, while observer-based assessments are resource-intensive and require training. We developed an automated system to assess pain intensity in adult patients based on changes in facial expression. We recruited adult patients undergoing surgery or interventional pain procedures in two public healthcare institutions in Singapore. The patients' facial expressions were videotaped from a frontal view, with varying body poses, using a customized mobile application. The collected videos were trimmed into multiple 1 s clips and categorized into three levels of pain: no pain, mild pain, or significant pain. A total of 468 facial key points were extracted from each video frame. A customized spatial temporal attention long short-term memory (STA-LSTM) deep learning network was trained and validated on the extracted key points to detect pain levels by analyzing facial expressions in both the spatial and temporal domains. Model performance was evaluated using accuracy, sensitivity, recall, and F1-score. Two hundred patients were recruited, with 2008 videos collected and trimmed into 10,274 1 s clips. Videos from 160 patients (7599 clips) were used for STA-LSTM training, while the remaining 40 patients' videos (2675 clips) were set aside for validation. In differentiating the polytomous levels of pain (no pain versus mild pain versus significant pain requiring clinical intervention), the STA-LSTM model achieved optimal performance, with accuracy, sensitivity, recall, and F1-score all at 0.8660. Our proposed solution has the potential to facilitate objective pain assessment in clinical settings, enabling healthcare professionals and caregivers to perform pain assessments effectively in both inpatient and outpatient settings.
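The core of the approach above is attention over both the spatial layout of the 468 key points and their evolution across the frames of a 1 s clip. The sketch below illustrates only the temporal half of that idea, as a plain softmax attention pool over synthetic per-frame key-point vectors; the array shapes follow the abstract, but the random scoring vector and the omission of the LSTM itself are simplifying assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes from the abstract: a 1 s clip is a sequence of frames,
# each described by 468 facial key points with (x, y) coordinates.
n_frames, n_keypoints = 30, 468
clip = rng.normal(size=(n_frames, n_keypoints * 2))  # synthetic stand-in

def temporal_attention_pool(frames: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Collapse a (T, D) frame sequence into a single D-vector using
    softmax attention weights over time -- the 'temporal attention'
    component of an STA-LSTM, shown here without the LSTM."""
    scores = frames @ w                      # (T,) relevance score per frame
    scores -= scores.max()                   # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()
    return alpha @ frames                    # attention-weighted average

w = rng.normal(size=clip.shape[1])           # hypothetical scoring vector
pooled = temporal_attention_pool(clip, w)
print(pooled.shape)  # (936,)
```

The pooled vector would then feed a classifier over the three pain levels; because the attention weights are a convex combination, the pooled features stay within the range of the per-frame features.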
Affiliation(s)
- Chin Wen Tan
  - Department of Women's Anesthesia, KK Women's and Children's Hospital, 100 Bukit Timah Road, Singapore, 229899, Singapore
  - Anesthesiology and Perioperative Sciences Academic Clinical Program, Duke-NUS Medical School, 8 College Road, Singapore, 169857, Singapore
- Tiehua Du
  - Biomedical Engineering and Materials Group, Nanyang Polytechnic, 180 Ang Mo Kio Avenue 8, Singapore, 569830, Singapore
- Jing Chun Teo
  - Digital Integration Medical Innovation and Care Transformation, KK Women's and Children's Hospital, 100 Bukit Timah Road, Singapore, 229899, Singapore
- Diana Xin Hui Chan
  - Anesthesiology and Perioperative Sciences Academic Clinical Program, Duke-NUS Medical School, 8 College Road, Singapore, 169857, Singapore
  - Department of Anesthesiology, Singapore General Hospital, SingHealth Tower, 10 Hospital Boulevard #19-01, Singapore, 168582, Singapore
- Wai Ming Kong
  - Biomedical Engineering and Materials Group, Nanyang Polytechnic, 180 Ang Mo Kio Avenue 8, Singapore, 569830, Singapore
- Ban Leong Sng
  - Department of Women's Anesthesia, KK Women's and Children's Hospital, 100 Bukit Timah Road, Singapore, 229899, Singapore
  - Anesthesiology and Perioperative Sciences Academic Clinical Program, Duke-NUS Medical School, 8 College Road, Singapore, 169857, Singapore
3. Ma C, Wang C, Zhu D, Chen M, Zhang M, He J. The investigation of the relationship between individual pain perception, brain electrical activity, and facial expression based on combined EEG and facial EMG analysis. J Pain Res 2025; 18:21-32. [PMID: 39776765; PMCID: PMC11705972; DOI: 10.2147/jpr.s477658]
Abstract
Purpose: Pain is a multidimensional, unpleasant emotional and sensory experience, and accurately assessing its intensity is crucial for effective management. However, individuals with cognitive impairments or language deficits may struggle to accurately report their pain. EEG provides insight into the neurological aspects of pain, while facial EMG captures the sensory and peripheral muscle responses. Our objective is to explore the relationship between individual pain perception, brain activity, and facial expressions through a combined analysis of EEG and facial EMG, aiming to provide an objective and multidimensional approach to pain assessment.
Methods: We investigated pain perception in response to electrical stimulation of the middle finger in 26 healthy subjects. The 32-channel EEG and 3-channel facial EMG signals were simultaneously recorded during a pain rating task. Group difference and correlation analyses were employed to investigate the relationship between individual pain perception, EEG, and facial EMG. A general linear model (GLM) was used for multidimensional pain assessment.
Results: The EEG analysis revealed that painful stimuli induced N2-P2 complex waveforms and gamma oscillations, with substantial variability in response to different stimuli. The facial EMG signals also demonstrated significant differences and variability correlated with subjective pain ratings. A combined analysis of EEG and facial EMG data using the GLM indicated that both the N2-P2 complex waveforms and the zygomatic muscle responses contributed significantly to pain assessment.
Conclusion: Facial EMG signals provide pain descriptions that are not sufficiently captured by EEG signals, and integrating both signals offers a more comprehensive understanding of pain perception. Our study underscores the potential of multimodal neurophysiological measurements in pain perception, offering a more comprehensive framework for evaluating pain.
Affiliation(s)
- Chaozong Ma
  - Department of Rehabilitation Medicine, First Affiliated Hospital of Xi’an Jiaotong University, Xi’an, People’s Republic of China
  - Military Medical Psychology School, Fourth Military Medical University, Xi’an, People’s Republic of China
- Chenxi Wang
  - Center for Brain Imaging, School of Life Science and Technology, Xidian University, Xi’an, People’s Republic of China
- Dan Zhu
  - Department of Rehabilitation Medicine, First Affiliated Hospital of Xi’an Jiaotong University, Xi’an, People’s Republic of China
- Mingfang Chen
  - Department of Rehabilitation Medicine, First Affiliated Hospital of Xi’an Jiaotong University, Xi’an, People’s Republic of China
- Ming Zhang
  - Department of Rehabilitation Medicine, First Affiliated Hospital of Xi’an Jiaotong University, Xi’an, People’s Republic of China
  - Department of Medical Imaging, First Affiliated Hospital of Xi’an Jiaotong University, Xi’an, People’s Republic of China
- Juan He
  - Department of Rehabilitation Medicine, First Affiliated Hospital of Xi’an Jiaotong University, Xi’an, People’s Republic of China
4. Pu L, Coppieters MW, Smalbrugge M, Jones C, Byrnes J, Todorovic M, Moyle W. Associations between facial expressions and observational pain in residents with dementia and chronic pain. J Adv Nurs 2024; 80:3846-3855. [PMID: 38334268; DOI: 10.1111/jan.16063]
Abstract
Aim: To identify specific facial expressions associated with pain behaviors using the PainChek application in residents with dementia.
Design: This is a secondary analysis of a study exploring the feasibility of PainChek to evaluate the effectiveness of a social robot (PARO) intervention on pain for residents with dementia from June to November 2021.
Methods: Participants experienced PARO individually five days per week for 15 min (once or twice) per day for three consecutive weeks. The PainChek app assessed each resident's pain levels before and after each session. The association between nine facial expressions and the adjusted PainChek scores was analyzed using a linear mixed model.
Results: A total of 1820 assessments were completed with 46 residents. Six facial expressions were significantly associated with a higher adjusted PainChek score. Horizontal mouth stretch showed the strongest association with the score, followed by brow lowering, parting of the lips, wrinkling of the nose, raising of the upper lip, and closing of the eyes. However, the presence of cheek raising, tightening of the eyelids, and pulling at the corner of the lip were not significantly associated with the score. Limitations of using the PainChek app were identified.
Conclusion: Six specific facial expressions were associated with observational pain scores in residents with dementia. The results indicate that automated real-time facial analysis is a promising approach to assessing pain in people with dementia. However, it requires further validation by human observers before it can be used for decision-making in clinical practice.
Impact: Pain is common in people with dementia, yet assessing pain is challenging in this group. This study generated new evidence on facial expressions of pain in residents with dementia. The results will inform the development of valid artificial intelligence-based algorithms to support healthcare professionals in identifying pain in people with dementia in clinical situations.
Reporting Method: The study adheres to the CONSORT reporting guidelines.
Patient or Public Contribution: One resident with dementia and two family members of people with dementia were consulted and involved in the study design, where they provided advice on the protocol, information sheets and consent forms, and offered valuable insights to ensure research quality and relevance.
Trial Registration: Australian and New Zealand Clinical Trials Registry (ACTRN12621000837820).
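The study's analysis is a linear mixed model with repeated PainChek scores per resident. The sketch below approximates that with a fixed-effects shortcut: centering scores and binary facial-expression indicators within each resident before a least-squares fit, on simulated data. The expression names echo the abstract, but the effect sizes, the per-resident design, and the centering approximation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy repeated-measures data: 46 residents, 40 observations each, with a
# random intercept per resident and two binary facial-expression predictors.
n_residents, n_obs = 46, 40
resident = np.repeat(np.arange(n_residents), n_obs)
mouth_stretch = rng.integers(0, 2, resident.size)   # AU present / absent
brow_lower = rng.integers(0, 2, resident.size)
intercepts = rng.normal(0.0, 1.0, n_residents)[resident]
score = (2.0 + intercepts + 1.5 * mouth_stretch + 0.8 * brow_lower
         + rng.normal(0.0, 0.5, resident.size))

def center_within(x, group):
    """Subtract each group's mean -- absorbs per-resident intercepts."""
    out = x.astype(float).copy()
    for g in np.unique(group):
        out[group == g] -= out[group == g].mean()
    return out

Xc = np.column_stack([center_within(mouth_stretch, resident),
                      center_within(brow_lower, resident)])
yc = center_within(score, resident)
beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
print(np.round(beta, 1))
```

Within-resident centering removes the resident-level intercepts, so the slopes estimate the association between each expression and the score, which is what the mixed model's fixed effects report.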
Affiliation(s)
- Lihui Pu
  - Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia
  - School of Nursing and Midwifery, Griffith University, Brisbane, Queensland, Australia
- Michel W Coppieters
  - Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia
  - School of Health Sciences and Social Work, Griffith University, Brisbane, Queensland, Australia
  - Amsterdam Movement Sciences - Program Musculoskeletal Health, Faculty of Behavioral and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Martin Smalbrugge
  - Department of Medicine for Older People, Amsterdam UMC, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
  - Amsterdam Public Health Research Institute, Aging & Later Life, Amsterdam, The Netherlands
- Cindy Jones
  - Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia
  - Faculty of Health Sciences & Medicine, Bond University, Gold Coast, Queensland, Australia
- Joshua Byrnes
  - Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia
  - Centre for Applied Health Economics, School of Medicine and Dentistry, Griffith University, Brisbane, Queensland, Australia
- Michael Todorovic
  - Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia
  - School of Nursing and Midwifery, Griffith University, Brisbane, Queensland, Australia
  - Faculty of Health Sciences & Medicine, Bond University, Gold Coast, Queensland, Australia
- Wendy Moyle
  - Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia
  - School of Nursing and Midwifery, Griffith University, Brisbane, Queensland, Australia
5. Lloyd EP, Summers KM, Gunderson CA, Weesner RE, Ten Brinke L, Hugenberg K, McConnell AR. Denver Pain Authenticity Stimulus Set (D-PASS). Behav Res Methods 2024; 56:2992-3008. [PMID: 37993672; PMCID: PMC11109019; DOI: 10.3758/s13428-023-02283-2]
Abstract
We introduce the Denver Pain Authenticity Stimulus Set (D-PASS), a free resource containing 315 videos of 105 unique individuals expressing authentic and posed pain. Each expresser was recorded displaying one authentic expression of pain (105 videos; pain was elicited via a pressure algometer) and two posed expressions (210 videos; one recorded before [posed-unrehearsed] and one recorded after [posed-rehearsed] the authentic pain expression). In addition to the authentic and posed pain videos, the database includes an accompanying codebook with metrics assessed at the expresser and video levels (e.g., Facial Action Coding System metrics for each video, controlling for neutral images of the expresser), expressers' pain threshold and pain tolerance values, averaged pain detection performance by naïve perceivers who viewed the videos (e.g., accuracy, response bias), neutral images of each expresser, and face characteristic rating data for the neutral images (e.g., attractiveness, trustworthiness). The stimuli and accompanying codebook can be accessed for academic research purposes from https://digitalcommons.du.edu/lsdl_dpass/1/. The relatively large number of stimuli allows for consideration of expresser-level variability in analyses and enables more advanced statistical approaches (e.g., signal detection analyses). Furthermore, the large number of Black (n = 41) and White (n = 56) expressers permits investigations into the role of race in pain expression, perception, and authenticity detection. Finally, the accompanying codebook may provide pilot data for novel investigations in the intergroup or pain sciences.
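The perceiver metrics in the codebook (accuracy, response bias) lend themselves to the signal detection analyses the abstract mentions. A minimal sketch of the standard sensitivity (d') and criterion computation for pain-authenticity judgments; the response counts are invented for illustration, not D-PASS norming values.

```python
from statistics import NormalDist

# Hit rate = P("authentic" | authentic clip); false-alarm rate =
# P("authentic" | posed clip). Counts below are made up.
hits, misses = 70, 35                 # responses to authentic-pain videos
false_alarms, correct_rej = 40, 65    # responses to posed-pain videos

z = NormalDist().inv_cdf              # inverse standard-normal CDF
hit_rate = hits / (hits + misses)
fa_rate = false_alarms / (false_alarms + correct_rej)

d_prime = z(hit_rate) - z(fa_rate)              # sensitivity
criterion = -0.5 * (z(hit_rate) + z(fa_rate))   # response bias
print(round(d_prime, 2), round(criterion, 2))
```

Separating sensitivity from bias is the point of this analysis: two perceivers with the same accuracy can differ in how liberally they call an expression "authentic".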
Affiliation(s)
- E Paige Lloyd
  - Department of Psychology, University of Denver, 2155 South Race Street, Denver, CO, 80210, USA
- Kevin M Summers
  - Department of Psychology, University of Denver, 2155 South Race Street, Denver, CO, 80210, USA
- Christopher A Gunderson
  - Department of Psychology, University of Denver, 2155 South Race Street, Denver, CO, 80210, USA
- Rachael E Weesner
  - Psychiatry Residency Program, University of Colorado, Denver, CO, USA
- Leanne Ten Brinke
  - Department of Psychology, University of British Columbia - Okanagan, Kelowna, Canada
- Kurt Hugenberg
  - Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, USA
6. Saumure C, Plouffe-Demers MP, Fiset D, Cormier S, Zhang Y, Sun D, Feng M, Luo F, Kunz M, Blais C. Differences between East Asians and Westerners in the mental representations and visual information extraction involved in the decoding of pain facial expression intensity. Affect Sci 2023; 4:332-349. [PMID: 37293682; PMCID: PMC10153781; DOI: 10.1007/s42761-023-00186-1]
Abstract
Effectively communicating pain is crucial for human beings. Facial expressions are one of the most specific forms of behavior associated with pain, but the way culture shapes expectations about the intensity with which pain is typically facially conveyed, and the visual strategies deployed to decode pain intensity in facial expressions, is poorly understood. The present study used a data-driven approach to compare two cultures, namely East Asians and Westerners, with respect to their mental representations of pain facial expressions (experiment 1, N=60; experiment 2, N=74) and their visual information utilization during the discrimination of facial expressions of pain of different intensities (experiment 3, N=60). Results reveal that compared to Westerners, East Asians expect more intense pain expressions (experiments 1 and 2), need more signal, and do not rely as much as Westerners on core facial features of pain expressions to discriminate between pain intensities (experiment 3). Together, these findings suggest that cultural norms regarding socially accepted pain behaviors shape the expectations about pain facial expressions and decoding visual strategies. Furthermore, they highlight the complexity of emotional facial expressions and the importance of studying pain communication in multicultural settings.
Affiliation(s)
- Camille Saumure
  - Département de Psychoéducation et de Psychologie, Université du Québec en Outaouais, CP 1250 succ. Hull, Gatineau, J8X 3X7, Canada
- Marie-Pier Plouffe-Demers
  - Département de Psychoéducation et de Psychologie, Université du Québec en Outaouais, CP 1250 succ. Hull, Gatineau, J8X 3X7, Canada
  - Département de Psychologie, Université du Québec à Montréal, CP 8888 succ. Centre-ville, Montréal, Québec, H3C 3P8, Canada
- Daniel Fiset
  - Département de Psychoéducation et de Psychologie, Université du Québec en Outaouais, CP 1250 succ. Hull, Gatineau, J8X 3X7, Canada
- Stéphanie Cormier
  - Département de Psychoéducation et de Psychologie, Université du Québec en Outaouais, CP 1250 succ. Hull, Gatineau, J8X 3X7, Canada
- Ye Zhang
  - Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China
- Dan Sun
  - Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China
  - Department of Psychology, Utrecht University, Utrecht, The Netherlands
- Manni Feng
  - Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China
- Feifan Luo
  - Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China
- Miriam Kunz
  - Department of Medical Psychology & Sociology, University of Augsburg, Augsburg, Germany
- Caroline Blais
  - Département de Psychoéducation et de Psychologie, Université du Québec en Outaouais, CP 1250 succ. Hull, Gatineau, J8X 3X7, Canada
7. Dildine TC, Amir CM, Parsons J, Atlas LY. How pain-related facial expressions are evaluated in relation to gender, race, and emotion. Affect Sci 2023; 4:350-369. [PMID: 37293681; PMCID: PMC9982800; DOI: 10.1007/s42761-023-00181-6]
Abstract
Inequities in pain assessment are well-documented; however, the psychological mechanisms underlying such biases are poorly understood. We investigated potential perceptual biases in judgments of faces displaying pain-related movements. Across five online studies, 956 adult participants viewed images of computer-generated faces ("targets") that varied in features related to race (Black and White) and gender (women and men). Target identity was manipulated across participants, and each target had equivalent facial movements that displayed varying intensities of movement in facial action units related to pain (Studies 1-4) or pain and emotion (Study 5). On each trial, participants provided categorical judgments as to whether a target was in pain (Studies 1-4) or which expression the target displayed (Study 5), and then rated the perceived intensity of the expression. Meta-analyses of Studies 1-4 revealed that movement intensity was positively associated with both categorizing a trial as painful and perceived pain intensity. Target race and gender did not consistently affect pain-related judgments, contrary to well-documented clinical inequities. In Study 5, in which pain was equally likely relative to other emotions, pain was the least frequently selected emotion (5%). Our results suggest that perceivers can utilize facial movements to evaluate pain in other individuals, but perceiving pain may depend on contextual factors. Furthermore, assessments of computer-generated, pain-related facial movements online do not replicate the sociocultural biases observed in the clinic. These findings provide a foundation for future studies comparing computer-generated and real images of pain, and emphasize the need for further work on the relationship between pain and emotion.
Affiliation(s)
- Troy C. Dildine
  - National Center for Complementary and Integrative Health, National Institutes of Health, 10 Center Drive, Bethesda, MD 20892, USA
  - Department of Clinical Neuroscience, Karolinska Institute, 171 77 Solna, Sweden
- Carolyn M. Amir
  - National Center for Complementary and Integrative Health, National Institutes of Health, 10 Center Drive, Bethesda, MD 20892, USA
- Julie Parsons
  - National Center for Complementary and Integrative Health, National Institutes of Health, 10 Center Drive, Bethesda, MD 20892, USA
- Lauren Y. Atlas
  - National Center for Complementary and Integrative Health, National Institutes of Health, 10 Center Drive, Bethesda, MD 20892, USA
  - National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA
  - National Institute on Drug Abuse, National Institutes of Health, Baltimore, MD 21224, USA
8. Determination of “Neutral”–“Pain”, “Neutral”–“Pleasure”, and “Pleasure”–“Pain” affective state distances by using AI image analysis of facial expressions. Technologies 2022. [DOI: 10.3390/technologies10040075]
Abstract
(1) Background: In addition to verbalizations, facial expressions advertise one’s affective state. There is an ongoing debate concerning the communicative value of the facial expressions of pain and of pleasure, and to what extent humans can distinguish between them. We introduce a novel method of analysis by replacing human ratings with outputs from image analysis software.
(2) Methods: We used image analysis software to extract feature vectors of the facial expressions neutral, pain, and pleasure displayed by 20 actresses. We dimension-reduced these feature vectors, used singular value decomposition to eliminate noise, and then used hierarchical agglomerative clustering to detect patterns.
(3) Results: The vector norms for pain–pleasure were rarely less than the distances pain–neutral and pleasure–neutral. The pain–pleasure distances were Weibull-distributed, and noise contributed 10% to the signal. The noise-free distances clustered in four clusters and two isolates.
(4) Conclusions: AI methods of image recognition are superior to human abilities in distinguishing between facial expressions of pain and pleasure. Statistical methods and hierarchical clustering offer possible explanations as to why humans fail. The reliability of commercial software that attempts to identify facial expressions of affective states can be improved by using the results of our analyses.
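The methods pipeline above (feature vectors → SVD denoising → hierarchical agglomerative clustering) can be sketched end to end. The data are synthetic, the retained rank is arbitrary, and the naive average-linkage implementation stands in for whatever clustering routine the authors used; all are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in: 20 expression feature vectors with two planted groups.
X = rng.normal(size=(20, 50))
X[:10] += 2.0

# Denoise by keeping only the top-k singular components.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 5
X_denoised = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

def agglomerate(X, n_clusters):
    """Naive average-linkage agglomerative clustering on Euclidean distance."""
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = np.mean([np.linalg.norm(X[i] - X[j])
                             for i in clusters[a] for j in clusters[b]])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)   # merge the closest pair
    return clusters

clusters = agglomerate(X_denoised, 2)
print(sorted(sorted(c) for c in clusters))
```

With well-separated groups, the dendrogram cut at two clusters recovers the planted structure; on real expression vectors, the cut level (four clusters and two isolates in the paper) is a data-driven choice.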
9. Simulating dynamic facial expressions of pain from visuo-haptic interactions with a robotic patient. Sci Rep 2022; 12:4200. [PMID: 35273296; PMCID: PMC8913843; DOI: 10.1038/s41598-022-08115-1]
Abstract
Medical training simulators can provide a safe and controlled environment for medical students to practice their physical examination skills. An important source of information for physicians is the visual feedback of involuntary pain facial expressions in response to physical palpation on an affected area of a patient. However, most existing robotic medical training simulators that can capture physical examination behaviours in real-time cannot display facial expressions and comprise a limited range of patient identities in terms of ethnicity and gender. Together, these limitations restrict the utility of medical training simulators because they do not provide medical students with a representative sample of pain facial expressions and face identities, which could result in biased practices. Further, these limitations restrict the utility of such medical simulators to detect and correct early signs of bias in medical training. Here, for the first time, we present a robotic system that can simulate facial expressions of pain in response to palpations, displayed on a range of patient face identities. We use the unique approach of modelling dynamic pain facial expressions using a data-driven perception-based psychophysical method combined with the visuo-haptic inputs of users performing palpations on a robot medical simulator. Specifically, participants performed palpation actions on the abdomen phantom of a simulated patient, which triggered the real-time display of six pain-related facial Action Units (AUs) on a robotic face (MorphFace), each controlled by two pseudo-randomly generated transient parameters: rate of change (β) and activation delay (τ). Participants then rated the appropriateness of the facial expression displayed in response to their palpations on a 4-point scale from “strongly disagree” to “strongly agree”. Each participant (n = 16; 4 Asian females, 4 Asian males, 4 White females and 4 White males) performed 200 palpation trials on 4 patient identities (Black female, Black male, White female and White male) simulated using MorphFace. Results showed that the facial expressions rated as most appropriate by all participants comprised a higher rate of change and shorter delay from upper-face AUs (around the eyes) to those in the lower face (around the mouth). In contrast, we found that the transient parameter values of the most appropriate-rated pain facial expressions, palpation forces, and delays between palpation actions varied across participant-simulated patient pairs according to gender and ethnicity. These findings suggest that gender and ethnicity biases affect palpation strategies and the perception of pain facial expressions displayed on MorphFace. We anticipate that our approach will be used to generate physical examination models with diverse patient demographics to reduce erroneous judgments in medical students, and provide focused training to address these errors.
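Each AU's time course is governed by a rate of change β and an activation delay τ. The abstract does not give the functional form, so the sketch below assumes a simple clipped linear ramp to show how the two parameters produce the reported upper-face-first, lower-face-later pattern; the parameter values are invented.

```python
import numpy as np

def au_activation(t: np.ndarray, beta: float, tau: float) -> np.ndarray:
    """AU intensity in [0, 1]: zero until t = tau, then rising at slope beta.
    A clipped linear ramp is one simple assumption for the transient."""
    return np.clip(beta * (t - tau), 0.0, 1.0)

t = np.linspace(0.0, 2.0, 201)                     # seconds after palpation
upper_face = au_activation(t, beta=2.0, tau=0.1)   # fast, early (eye region)
lower_face = au_activation(t, beta=1.0, tau=0.4)   # slower, later (mouth)
print(round(float(upper_face[100]), 2),
      round(float(lower_face[100]), 2))            # intensities at t = 1.0 s
```

At one second after palpation onset the upper-face AU has saturated while the lower-face AU is still rising, matching the "higher rate of change and shorter delay" in the upper face that raters judged most appropriate.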
10. Emerson AJ, Chandler LE, Oxendine RH, Huff CM, Harris GM, Baxter GD, Wonsetler Jones EC. Systematic review of clinical decision-makers’ attitudes, beliefs, and biases that contribute to a marginalized process of care in persistent musculoskeletal pain. Part II: case vignettes. Phys Ther Rev 2021. [DOI: 10.1080/10833196.2021.2000289]
Affiliation(s)
- Alicia J. Emerson
  - Department of Physical Therapy, Congdon School of Health Sciences, High Point University, High Point, NC, USA
  - Centre for Health, Activity, and Rehabilitation Research, University of Otago, Dunedin, New Zealand
- Lauren E. Chandler
  - Department of Physical Therapy, Congdon School of Health Sciences, High Point University, High Point, NC, USA
- Riley H. Oxendine
  - Department of Physical Therapy, Congdon School of Health Sciences, High Point University, High Point, NC, USA
- Corey M. Huff
  - Department of Physical Therapy, Congdon School of Health Sciences, High Point University, High Point, NC, USA
- Gabrielle M. Harris
  - Department of Physical Therapy, Congdon School of Health Sciences, High Point University, High Point, NC, USA
- G. David Baxter
  - Centre for Health, Activity, and Rehabilitation Research, University of Otago, Dunedin, New Zealand
- Elizabeth C. Wonsetler Jones
  - Department of Physical Therapy, Congdon School of Health Sciences, High Point University, High Point, NC, USA
  - Department of Public Health and Community Medicine, Tufts University, Boston, MA, USA
11. Observing pain in individuals with cognitive impairment: a pilot comparison attempt across countries and across different types of cognitive impairment. Brain Sci 2021; 11:1455. [PMID: 34827454; PMCID: PMC8615509; DOI: 10.3390/brainsci11111455]
Abstract
Facial expression is a key aspect in observational scales developed to improve pain assessment in individuals with cognitive impairments. Although these scales are used internationally in individuals with different types of cognitive impairments, it is not known whether observing facial expressions of pain might differ between regions or between different types of cognitive impairments. In a pilot study, facial responses to standardized experimental pressure pain were assessed among individuals with different types of cognitive impairments (dementia, mild cognitive impairment, Huntington’s disease, and intellectual disability) from different countries (Denmark, Germany, Italy, Israel, and Spain) and were analyzed using facial descriptors from the PAIC scale (Pain Assessment in Impaired Cognition). We found high inter-rater reliability between observers from different countries. Moreover, facial responses to pain did not differ between individuals with dementia from different countries (Denmark, Germany, and Spain). However, the type of cognitive impairment had a significant impact: individuals with intellectual disability (all from Israel) showed the strongest facial responses. Our pilot data suggest that the country of origin does not strongly affect how pain is facially expressed or how facial responses are scored. However, the type of cognitive impairment showed a clear effect in our pilot study, with elevated facial responses in individuals with intellectual disability.
|
12
|
The Delaware Pain Database: a set of painful expressions and corresponding norming data. Pain Rep 2020; 5:e853. [PMID: 33134750 PMCID: PMC7587421 DOI: 10.1097/pr9.0000000000000853] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/23/2020] [Revised: 08/04/2020] [Accepted: 08/05/2020] [Indexed: 11/26/2022] Open
Abstract
Introduction: Facial expressions of pain serve an essential social function by communicating suffering and soliciting aid. Accurate visual perception of painful expressions is critical because the misperception of pain signals can have serious clinical and social consequences. Therefore, it is essential that researchers have access to high-quality, diverse databases of painful expressions to better understand accuracy and bias in pain perception. Objectives: This article describes the development of a large-scale face stimulus database focusing on expressions of pain. Methods: We collected and normed a database of images of models posing painful facial expressions. We also characterized these stimuli in terms of the presence of a series of pain-relevant facial action units. In addition to our primary database of posed expressions, we provide a separate database of computer-rendered expressions of pain that may be applied to any neutral face photograph. Results: The resulting database comprises 229 unique (and now publicly available) painful expressions. To the best of our knowledge, there are no existing databases of this size, quality, or diversity in terms of race, gender, and expression intensity. We provide evidence for the reliability of expressions and evaluations of pain within these stimuli, as well as a full characterization of this set along dimensions relevant to pain such as perceived status, strength, and dominance. Moreover, our second database complements the primary set in terms of experimental control and precision. Conclusion: These stimuli will facilitate reproducible research in both experimental and clinical domains into the mechanisms supporting accuracy and bias in pain perception and care.
|
13
|
Deska JC, Kunstman JW, Bernstein MJ, Ogungbadero T, Hugenberg K. Black racial phenotypicality shapes social pain and support judgments. Journal of Experimental Social Psychology 2020. [DOI: 10.1016/j.jesp.2020.103998] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/22/2023]
|
14
|
|