1. Scala M, Sánchez-Reolid D, Sánchez-Reolid R, Fernández-Sotos P, Romero-Ferreiro V, Alvarez-Mon MÁ, Lahera G, Fanelli G, Serretti A, Fabbri C, Fernández-Caballero A, Rodriguez-Jimenez R. Differences in emotion recognition between nonimmersive versus immersive virtual reality: preliminary findings in schizophrenia and bipolar disorder. Int Clin Psychopharmacol 2024:00004850-990000000-00153. PMID: 39641922. DOI: 10.1097/yic.0000000000000576.
Abstract
Deficits in social cognition may impair emotional processing and facial emotional recognition (FER) in patients with bipolar disorder (BD) and schizophrenia. FER is generally explored using photographs or images of static faces that do not fully capture the complexity of real-life facial stimuli. To overcome this limitation, we developed a set of dynamic virtual faces depicting six basic emotions (i.e. happiness, sadness, anger, fear, disgust, and surprise) and a neutral expression, suitable for presentation in immersive and nonimmersive virtual reality. This study presents preliminary findings on the differences in FER accuracy from a frontal view between immersive and nonimmersive virtual reality among patients experiencing a relapse of schizophrenia (n = 10), a manic phase of BD (n = 10), and a group of healthy controls (HCs) (n = 10). As a secondary objective, we compare FER accuracy across these three groups. Patients with schizophrenia and BD showed similar accuracy in recognizing emotions in immersive and nonimmersive virtual reality settings. However, patients with schizophrenia exhibited lower FER accuracy than HCs in both settings. Individuals with BD showed intermediate accuracy between those with schizophrenia and HCs, although these differences were not statistically significant. Notably, recognition of negative emotions was significantly impaired in both groups of patients.
Affiliation(s)
- Mauro Scala
- Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Department of Psychiatry, Health Research Institute Hospital 12 de Octubre (imas12)
- Department of Legal Medicine, Psychiatry and Pathology, Complutense University of Madrid (UCM)
- Department of Psychology, Faculty of Biomedical and Health Sciences, European University of Madrid (UEM), Madrid
- Daniel Sánchez-Reolid
- Instituto de Investigación en Informática de Albacete, Universidad de Castilla-La Mancha (UCLM), Albacete
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health), Madrid
- Roberto Sánchez-Reolid
- Instituto de Investigación en Informática de Albacete, Universidad de Castilla-La Mancha (UCLM), Albacete
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health), Madrid
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha (UCLM), Albacete
- Patricia Fernández-Sotos
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health), Madrid
- Department of Psychiatry and Mental Health of Cartagena, Cartagena
- Verónica Romero-Ferreiro
- Department of Psychiatry, Health Research Institute Hospital 12 de Octubre (imas12)
- Department of Psychology, Faculty of Biomedical and Health Sciences, European University of Madrid (UEM), Madrid
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health), Madrid
- Miguel Ángel Alvarez-Mon
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health), Madrid
- Department of Medicine and Medical Specialities, University of Alcala, Alcala de Henares
- Department of Psychiatry and Mental Health, Infanta Leonor University Hospital
- Ramón y Cajal Institute of Sanitary Research (IRYCIS), Madrid, Spain
- Guillermo Lahera
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health), Madrid
- Department of Medicine and Medical Specialities, University of Alcala, Alcala de Henares
- Ramón y Cajal Institute of Sanitary Research (IRYCIS), Madrid, Spain
- Giuseppe Fanelli
- Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Department of Human Genetics, Radboud University Medical Center, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Alessandro Serretti
- Department of Medicine and Surgery, Kore University of Enna, Enna, Italy
- Oasi Research Institute-IRCCS, Troina, Italy
- Chiara Fabbri
- Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, Bologna, Italy
- Antonio Fernández-Caballero
- Instituto de Investigación en Informática de Albacete, Universidad de Castilla-La Mancha (UCLM), Albacete
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health), Madrid
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha (UCLM), Albacete
- Roberto Rodriguez-Jimenez
- Department of Psychiatry, Health Research Institute Hospital 12 de Octubre (imas12)
- Department of Legal Medicine, Psychiatry and Pathology, Complutense University of Madrid (UCM)
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health), Madrid

2. Tian Z, Albakry NS, Du Y. Illumination Intelligent Adaptation and Analysis Framework: A comprehensive solution for enhancing nighttime driving fatigue monitoring. PLoS One 2024; 19:e0308201. PMID: 39141655. PMCID: PMC11324097. DOI: 10.1371/journal.pone.0308201.
Abstract
Nighttime driving presents a critical challenge to road safety due to insufficient lighting and an increased risk of driver fatigue. Existing methods for monitoring driver fatigue, which focus mainly on behavioral analysis and biometric monitoring, face significant challenges under low-light conditions. Their effectiveness, especially in dynamic lighting environments, is limited by their dependence on specific environmental conditions and active driver participation, leading to reduced accuracy and practicality in real-world scenarios. This study introduces a novel Illumination Intelligent Adaptation and Analysis Framework (IIAAF), aimed at addressing these limitations and enhancing the accuracy and practicality of driver fatigue monitoring under nighttime low-light conditions. The IIAAF framework integrates several technologies: comprehensive body-posture analysis and facial fatigue-feature detection, per-pixel dynamic illumination adjustment, and a light-variation feature learning system based on convolutional neural networks (CNNs) and time-series analysis. Through this integrated approach, the framework can accurately capture subtle fatigue signals in nighttime driving environments and adapt in real time to rapid changes in lighting conditions. Experimental results on two independent datasets indicate that the IIAAF framework significantly improves the accuracy of fatigue detection under nighttime low-light conditions. This not only enhances the effectiveness of driving-assistance systems but also provides reliable scientific support for reducing the risk of accidents caused by fatigued driving. These findings have significant theoretical and practical implications for advancing intelligent driving-assistance technology and improving nighttime road safety.
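The abstract names per-pixel dynamic illumination adjustment but gives no implementation detail. As a minimal, hypothetical sketch of what such a preprocessing stage can look like (not the IIAAF authors' method), locally adaptive gamma correction brightens dark cabin regions while leaving well-lit pixels largely untouched:

```python
import cv2
import numpy as np

def adaptive_illumination(frame_bgr: np.ndarray, target_mean: float = 0.5,
                          eps: float = 1e-6) -> np.ndarray:
    """Per-pixel illumination adjustment via locally adaptive gamma correction.

    Illustrative only: estimates local illumination with a wide Gaussian
    blur, then applies a stronger gamma where the neighbourhood is darker.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
    illum = cv2.GaussianBlur(gray, (0, 0), sigmaX=31)        # local brightness map
    gamma = np.log(target_mean + eps) / np.log(illum + eps)  # per-pixel exponent
    gamma = np.clip(gamma, 0.4, 2.5)                         # avoid extreme correction
    out = (frame_bgr.astype(np.float32) / 255.0) ** gamma[..., None]
    return (np.clip(out, 0.0, 1.0) * 255.0).astype(np.uint8)
```

A downstream fatigue detector (such as the CNN and time-series stage the abstract describes) would then consume the normalized frames instead of the raw low-light video.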
Affiliation(s)
- Zenghui Tian
- Faculty of Art, Sustainability & Creative Industry, Sultan Idris Education University, Tanjung Malim, Perak, Malaysia
- Nur Safinas Albakry
- Faculty of Art, Sustainability & Creative Industry, Sultan Idris Education University, Tanjung Malim, Perak, Malaysia
- Yinghui Du
- Faculty of Art, Sustainability & Creative Industry, Sultan Idris Education University, Tanjung Malim, Perak, Malaysia

3. González-Gualda LM, Vicente-Querol MA, García AS, Molina JP, Latorre JM, Fernández-Sotos P, Fernández-Caballero A. An exploratory study of the effect of age and gender on face scanning during affect recognition in immersive virtual reality. Sci Rep 2024; 14:5553. PMID: 38448515. PMCID: PMC10918108. DOI: 10.1038/s41598-024-55774-3.
Abstract
A person with impaired emotion recognition is not able to correctly identify the facial expressions of other individuals. The aim of the present study is to assess eye gaze and facial emotion recognition in a healthy population using dynamic avatars in immersive virtual reality (IVR). For the first time, the viewing of each area of interest (AOI) of the face in IVR is studied by gender and age. This work in healthy people was conducted to assess the future usefulness of IVR in patients with deficits in the recognition of facial expressions. Seventy-four healthy volunteers participated in the study. The materials used were a laptop computer, a game controller, and a head-mounted display. Dynamic virtual faces randomly representing the six basic emotions plus the neutral expression were used as stimuli. After the virtual human represented an emotion, a response panel was displayed with the seven possible options. Besides storing the hits and misses, the software internally divided the faces into different AOIs and recorded how long participants looked at each AOI. Regarding the overall accuracy of the participants' responses, hits decreased from the youngest group to the middle-aged and older adults. All three groups spent the highest percentage of time looking at the eyes, with younger adults showing the highest percentage. It is also noteworthy that attention to the face, compared to the background, decreased with age. Moreover, hit rates for women and men were remarkably similar; in fact, there were no statistically significant differences between them. In general, men paid more attention to the eyes than women, whereas women paid more attention to the forehead and mouth. In contrast to previous work, our study indicates that there are no differences between men and women in facial emotion recognition. In line with previous work, the percentage of face-viewing time was higher for younger adults than for older adults. However, contrary to earlier studies, older adults looked more at the eyes than at the mouth. Consistent with other studies, the eyes were the AOI with the highest percentage of viewing time. For men, the most-viewed AOI was the eyes for all emotions, in both hits and misses. Women looked more at the eyes for all emotions except joy, fear, and anger on hits; on misses, they looked more at the eyes for almost all emotions except surprise and fear.
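The abstract describes software that divides each virtual face into AOIs and records how long participants look at each one. A minimal sketch of that bookkeeping, assuming hypothetical AOI labels and evenly spaced gaze samples (the study's actual logging format is not given):

```python
from collections import defaultdict

# Hypothetical per-frame gaze samples: (timestamp in seconds, AOI label).
samples = [(0.00, "eyes"), (0.02, "eyes"), (0.04, "mouth"),
           (0.06, "forehead"), (0.08, "eyes"), (0.10, "background")]

def aoi_dwell_percentages(samples):
    """Percentage of viewing time per AOI, assuming evenly spaced samples
    so that each sample counts as one tick of dwell time."""
    ticks = defaultdict(int)
    for _timestamp, aoi in samples:
        ticks[aoi] += 1
    total = sum(ticks.values())
    return {aoi: 100.0 * n / total for aoi, n in ticks.items()}

print(aoi_dwell_percentages(samples))
# e.g. {'eyes': 50.0, 'mouth': 16.7, 'forehead': 16.7, 'background': 16.7}
```

A real eye tracker would weight each sample by the inter-sample interval rather than counting ticks, but the aggregation logic is the same.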
Affiliation(s)
- Luz M González-Gualda
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- Miguel A Vicente-Querol
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Arturo S García
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- José P Molina
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- José M Latorre
- Departamento de Psicología, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- Patricia Fernández-Sotos
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
- Antonio Fernández-Caballero
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain

4. Vicente-Querol MA, Fernández-Caballero A, González P, González-Gualda LM, Fernández-Sotos P, Molina JP, García AS. Effect of Action Units, Viewpoint and Immersion on Emotion Recognition Using Dynamic Virtual Faces. Int J Neural Syst 2023; 33:2350053. PMID: 37746831. DOI: 10.1142/s0129065723500533.
Abstract
Facial affect recognition is a critical skill in human interactions that is often impaired in psychiatric disorders. To address this challenge, tests have been developed to measure and train this skill. Recently, virtual human (VH) and virtual reality (VR) technologies have emerged as novel tools for this purpose. This study investigates the contributions of different factors to the communication and perception of emotions conveyed by VHs. Specifically, it examines the effects of the use of action units (AUs) in virtual faces, the positioning of the VH (frontal or mid-profile), and the level of immersion in the VR environment (desktop screen versus immersive VR). Thirty-six healthy subjects participated in each condition. Dynamic virtual faces (DVFs), VHs with facial animations, were used to represent the six basic emotions and the neutral expression. The results highlight the importance of accurately implementing AUs in virtual faces for emotion recognition. Furthermore, frontal views outperformed mid-profile views in both test conditions, while immersive VR showed a slight improvement in emotion recognition. This study provides novel insights into the influence of these factors on emotion perception and advances the understanding and application of these technologies for effective facial emotion recognition training.
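The abstract stresses that accurate AU implementation matters but does not list the AU set used. As an illustration only, the commonly cited FACS prototypes for the six basic emotions (after Ekman and Friesen) can drive per-AU blendshape weights on a virtual face; the mapping and helper below are assumptions, not the paper's implementation:

```python
# Commonly cited FACS action-unit prototypes for the basic emotions;
# the exact AU set used in the study is not specified in the abstract.
AU_PROTOTYPES = {
    "happiness": [6, 12],
    "sadness":   [1, 4, 15],
    "surprise":  [1, 2, 5, 26],
    "fear":      [1, 2, 4, 5, 7, 20, 26],
    "anger":     [4, 5, 7, 23],
    "disgust":   [9, 15, 16],
    "neutral":   [],
}

def blendshape_weights(emotion: str, intensity: float = 1.0) -> dict:
    """Map an emotion label to per-AU blendshape weights in [0, 1].

    Hypothetical helper: a dynamic virtual face rig would expose one
    blendshape per AU and ramp these weights over time to animate.
    """
    return {f"AU{au}": intensity for au in AU_PROTOTYPES[emotion]}

print(blendshape_weights("happiness", 0.8))  # {'AU6': 0.8, 'AU12': 0.8}
```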
Affiliation(s)
- Miguel A Vicente-Querol
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Antonio Fernández-Caballero
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III, Madrid 28029, Spain
- Pascual González
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III, Madrid 28029, Spain
- Luz M González-Gualda
- Servicio de Salud Mental, Complejo Hospitalario Universitario de Albacete, Albacete 02004, Spain
- Patricia Fernández-Sotos
- Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III, Madrid 28029, Spain
- Servicio de Salud Mental, Complejo Hospitalario Universitario de Albacete, Albacete 02004, Spain
- José P Molina
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Arturo S García
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete 02071, Spain

5. Monferrer M, García AS, Ricarte JJ, Montes MJ, Fernández-Caballero A, Fernández-Sotos P. Facial emotion recognition in patients with depression compared to healthy controls when using human avatars. Sci Rep 2023; 13:6007. PMID: 37045889. PMCID: PMC10097677. DOI: 10.1038/s41598-023-31277-5.
Abstract
The negative, mood-congruent cognitive bias described in depression, as well as excessive rumination, have been found to interfere with emotional processing. This study assesses the facial recognition of emotions in patients with depression using a new set of dynamic virtual faces (DVFs). The sample consisted of 54 stable patients and 54 healthy controls. The experiment consisted of an emotion recognition task using non-immersive virtual reality (VR) with DVFs of the six basic emotions and the neutral expression. Patients with depression showed worse facial affect recognition than healthy controls. Age of onset was negatively correlated with emotion recognition, and no correlation was observed for duration of illness or number of lifetime hospitalizations. In the depression group, emotion recognition did not correlate with degree of psychopathology, excessive rumination, functioning, or quality of life. Hence, it is important to improve and validate VR tools for emotion recognition to achieve greater methodological homogeneity across studies and to establish more conclusive results.
Affiliation(s)
- Marta Monferrer
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- Arturo S García
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Jorge J Ricarte
- Departamento de Psicología, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- María J Montes
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- Antonio Fernández-Caballero
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
- Patricia Fernández-Sotos
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain

6. Del Aguila J, González-Gualda LM, Játiva MA, Fernández-Sotos P, Fernández-Caballero A, García AS. How Interpersonal Distance Between Avatar and Human Influences Facial Affect Recognition in Immersive Virtual Reality. Front Psychol 2021; 12:675515. PMID: 34335388. PMCID: PMC8319634. DOI: 10.3389/fpsyg.2021.675515.
Abstract
Purpose: The purpose of this study was to determine the optimal interpersonal distance (IPD) between humans and affective avatars for facial affect recognition in immersive virtual reality (IVR). The ideal IPD is the one at which participants achieve the highest number of hits and the shortest reaction times in recognizing the emotions displayed by avatars. The results should help design future therapies to remedy facial affect recognition deficits.
Methods: Thirty-nine healthy volunteers were shown 65 dynamic faces in IVR and had to identify the six basic emotions plus the neutral expression presented by the avatars. The experiment was limited to five distances: D1 (35 cm), D2 (55 cm), D3 (75 cm), D4 (95 cm), and D5 (115 cm), all within the intimate and personal interpersonal spaces. Of the 65 faces, 13 were presented at each distance. The views were shown at different angles: 50% in frontal view, 25% from the right profile, and 25% from the left profile. The order of appearance of the faces was randomized for each participant.
Results: The overall success rate in facial emotion identification was 90.33%, with D3 the IPD yielding the best overall recognition, although no statistically significant differences were found between the IPDs. Consistent with previous studies, identification rates for negative emotions were higher at larger IPDs, whereas recognition of positive emotions improved at closer IPDs. In addition, the study revealed irregular behavior in the recognition of surprise.
Conclusions: IVR allows facial emotion recognition to be assessed reliably using dynamic avatars, as all the IPDs tested proved effective. However, no statistically significant differences in facial emotion recognition were found among the different IPDs.
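The Methods fully specify the stimulus arithmetic: 5 distances × 13 faces = 65 trials, split roughly 50/25/25 across frontal, right-profile, and left-profile views, in random order. A small sketch reconstructing that schedule; since 13 trials cannot split exactly 50/25/25, the 7/3/3 per-distance view split below is an assumption:

```python
import random

DISTANCES_CM = [35, 55, 75, 95, 115]                                 # D1-D5
VIEWS_PER_DISTANCE = ["frontal"] * 7 + ["right"] * 3 + ["left"] * 3  # ~50/25/25 (assumed split)

def build_trials(seed=None):
    """Return the 65 (distance_cm, view) trials in randomized order."""
    trials = [(d, v) for d in DISTANCES_CM for v in VIEWS_PER_DISTANCE]
    random.Random(seed).shuffle(trials)          # per-participant randomization
    return trials

trials = build_trials(seed=42)
assert len(trials) == 65                         # 5 distances x 13 faces
```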
Affiliation(s)
- Juan Del Aguila
- Complejo Hospitalario Universitario de Albacete (CHUA), Servicio de Salud de Castilla-La Mancha, Albacete, Spain
- Luz M González-Gualda
- Complejo Hospitalario Universitario de Albacete (CHUA), Servicio de Salud de Castilla-La Mancha, Albacete, Spain
- María Angeles Játiva
- Instituto de Investigación en Informática de Albacete, Universidad de Castilla-La Mancha, Albacete, Spain
- Patricia Fernández-Sotos
- Complejo Hospitalario Universitario de Albacete (CHUA), Servicio de Salud de Castilla-La Mancha, Albacete, Spain
- CIBERSAM (Biomedical Research Networking Centre in Mental Health), Madrid, Spain
- Antonio Fernández-Caballero
- Instituto de Investigación en Informática de Albacete, Universidad de Castilla-La Mancha, Albacete, Spain
- CIBERSAM (Biomedical Research Networking Centre in Mental Health), Madrid, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete, Spain
- Arturo S García
- Instituto de Investigación en Informática de Albacete, Universidad de Castilla-La Mancha, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete, Spain

7. Facial Affect Recognition by Patients with Schizophrenia Using Human Avatars. J Clin Med 2021; 10:1904. PMID: 33924939. PMCID: PMC8124197. DOI: 10.3390/jcm10091904.
Abstract
People with schizophrenia have difficulty recognizing the emotions in the facial expressions of others, which affects their social interaction and functioning in the community. Static stimuli such as photographs have traditionally been used to examine deficits in emotion recognition in patients with schizophrenia, an approach criticized by some authors for lacking the dynamism of real facial stimuli. With the aim of overcoming these drawbacks, virtual humans have been created and validated in recent years. This work presents the results of a study that evaluated facial emotion recognition in patients diagnosed with schizophrenia, using a new set of dynamic virtual humans previously designed by the research team. The study included 56 stable patients, compared with 56 healthy controls. Our results showed that patients with schizophrenia present a deficit in facial affect recognition compared to healthy controls (average hit rate 71.6% for patients vs. 90.0% for controls). Facial expressions with greater dynamism (compared to less dynamic ones), as well as those presented from a frontal view (compared to a profile view), were better recognized in both groups. Regarding clinical and sociodemographic variables, the number of lifetime hospitalizations did not correlate with recognition rates, and there was no correlation between functioning or quality of life and recognition. A trend toward lower emotion recognition with higher Positive and Negative Syndrome Scale (PANSS) scores was observed, reaching statistical significance for the negative PANSS subscale. Patients showed a learning effect over the course of the task, slightly greater than that of the control group. This finding is relevant when designing training interventions for people with schizophrenia: maintaining patients' attention and getting them to improve in the proposed tasks is a challenge for today's psychiatry.