1
Bevilacqua R, Stara V, Amabili G, Margaritini A, Benadduci M, Barbarossa F, Maranesi E, Rigaud AS, Dacunha S, Palmier C, Moller J, Browne R, Ogawa T, Wieching R. e-VITA study protocol: EU-Japan virtual coach for smart aging. Front Public Health 2024; 12:1256734. [PMID: 38544729 PMCID: PMC10968892 DOI: 10.3389/fpubh.2024.1256734] [Received: 07/11/2023] [Accepted: 02/16/2024]
Abstract
Aim: To report a trial protocol for assessing whether a virtual coach can improve older adults' well-being, promote active and healthy aging, and reduce the risk of social exclusion.
Background: Increased longevity often brings reduced autonomy and independence, so preventive measures that promote active and healthy aging are needed. Technological advances have produced new tools, including virtual coaches, which can help people lead a healthy lifestyle by identifying individual needs and goals and providing personalized recommendations and advice. It is important, however, that these coaches take into account each person's inter-individual and cross-cultural differences.
Design: A randomized controlled trial is proposed.
Methods: This study will recruit 240 healthy subjects aged 65 years and older. Participants will be assigned either to an experimental group that will receive the e-VITA system or to a control group that will receive an information booklet only. The primary outcome measure is quality of life (QoL). Data will be collected at baseline, 3 months into the trial, and at the end of the trial after 6 months.
Discussion: This study will evaluate the effectiveness of the e-VITA system, consisting of a virtual coach, several monitoring sensors, a smartphone for home use, and a booklet, in improving older people's quality of life. Any gain in perceived well-being is also expected to be reflected in other areas of the person's life: psychological and cognitive status, sociality, nutrition, and eHealth literacy.
Affiliation(s)
- Vera Stara
- Scientific Direction, IRCCS INRCA, Ancona, Italy
- Anne-Sophie Rigaud
- Université de Paris, Maladie d’Alzheimer, Paris, France
- Services de Gériatrie 1 & 2, AP-HP, Hôpital Broca, Paris, France
- Sébastien Dacunha
- Université de Paris, Maladie d’Alzheimer, Paris, France
- Services de Gériatrie 1 & 2, AP-HP, Hôpital Broca, Paris, France
- Cecilia Palmier
- Université de Paris, Maladie d’Alzheimer, Paris, France
- Services de Gériatrie 1 & 2, AP-HP, Hôpital Broca, Paris, France
- Johanna Moller
- Diocesan Caritas Association of the Archdiocese of Cologne e.V., Cologne, Germany
- Ryan Browne
- Smart-Aging Research Center, Tohoku University, Sendai, Japan
- Toshimi Ogawa
- Smart-Aging Research Center, Tohoku University, Sendai, Japan
- Rainer Wieching
- Institute for New Media & Information Systems, University Siegen, Siegen, Germany
2
Naccarelli R, D’Agresti F, Roelen SD, Jokinen K, Casaccia S, Revel GM, Maggio M, Azimi Z, Alam MM, Saleem Q, Mohammed AH, Napolitano G, Szczepaniak F, Hariz M, Chollet G, Lohr C, Boudy J, Wieching R, Ogawa T. Empowering Smart Aging: Insights into the Technical Architecture of the e-VITA Virtual Coaching System for Older Adults. Sensors (Basel) 2024; 24:638. [PMID: 38276330 PMCID: PMC10818560 DOI: 10.3390/s24020638] [Received: 11/09/2023] [Revised: 01/11/2024] [Accepted: 01/17/2024]
Abstract
With the substantial rise in life expectancy over the last century, society faces the imperative of finding inventive approaches to foster active aging and provide adequate aging care. The e-VITA initiative, jointly funded by the European Union and Japan, centers on an advanced virtual coaching methodology targeting essential aspects of active and healthy aging. This paper describes the technical framework underlying the e-VITA virtual coaching platform and presents preliminary feedback on its use. At its core is the e-VITA Manager, the component responsible for orchestrating the integration of various specialized devices and modules. These modules include the Dialogue Manager, Data Fusion, and Emotional Detection, each contributing distinct functionality to the platform. The platform incorporates numerous devices and software components from Europe and Japan, built upon diverse technologies and standards. It facilitates communication and integration among smart devices such as sensors and robots while managing their data to provide comprehensive coaching functionality.
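The hub-and-modules design described in this abstract can be illustrated with a minimal sketch: a central manager that routes incoming device events to whichever modules have registered for that kind of event. All class, topic, and threshold names below are illustrative stand-ins, not the actual e-VITA API.

```python
# Hypothetical sketch of a coaching "manager" dispatching events to modules.
class Module:
    def handle(self, event):
        raise NotImplementedError

class EmotionDetection(Module):
    def handle(self, event):
        # Toy rule: label the event from a reported arousal value (assumed field).
        return "excited" if event.get("arousal", 0) > 0.5 else "calm"

class Manager:
    """Dispatches incoming events to the modules registered for their topic."""
    def __init__(self):
        self.routes = {}

    def register(self, topic, module):
        self.routes.setdefault(topic, []).append(module)

    def dispatch(self, topic, event):
        # Collect one result per registered module; unknown topics yield nothing.
        return [m.handle(event) for m in self.routes.get(topic, [])]

manager = Manager()
manager.register("sensor", EmotionDetection())
print(manager.dispatch("sensor", {"arousal": 0.8}))  # ['excited']
```

The design choice sketched here, a registry keyed by topic, is one common way such a hub can integrate heterogeneous sensors and robots without the modules knowing about each other.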
Affiliation(s)
- Riccardo Naccarelli
- Department of Industrial Engineering and Mathematical Sciences, Polytechnic University of Marche, 60131 Ancona, Italy
- Sonja Dana Roelen
- Institut für Experimentelle Psychophysiologie GmbH, 40215 Düsseldorf, Germany
- Kristiina Jokinen
- Artificial Intelligence Research Center, National Institute of Advanced Industrial Science and Technology (AIRC/AIST), Tokyo 135-0064, Japan
- Sara Casaccia
- Department of Industrial Engineering and Mathematical Sciences, Polytechnic University of Marche, 60131 Ancona, Italy
- Gian Marco Revel
- Department of Industrial Engineering and Mathematical Sciences, Polytechnic University of Marche, 60131 Ancona, Italy
- Martino Maggio
- Engineering Ingegneria Informatica SpA, 00144 Roma, Italy
- Zohre Azimi
- Institut für Experimentelle Psychophysiologie GmbH, 40215 Düsseldorf, Germany
- Mirza Mohtashim Alam
- Leibniz Institute for Information Infrastructure, FIZ Karlsruhe, 76344 Eggenstein-Leopoldshafen, Germany
- Qasid Saleem
- Institute for Applied Informatics (InfAI), 04109 Leipzig, Germany
- Abrar Hyder Mohammed
- Institute for Applied Informatics (InfAI), 04109 Leipzig, Germany
- Giulio Napolitano
- Institute for Applied Informatics (InfAI), 04109 Leipzig, Germany
- Florian Szczepaniak
- Institut Mines-Télécom (IMT), 91120 Palaiseau, France
- Mossaab Hariz
- Institut Mines-Télécom (IMT), 91120 Palaiseau, France
- Gérard Chollet
- Institut Mines-Télécom (IMT), 91120 Palaiseau, France
- Christophe Lohr
- Institut Mines-Télécom (IMT), 91120 Palaiseau, France
- Jérôme Boudy
- Institut Mines-Télécom (IMT), 91120 Palaiseau, France
- Rainer Wieching
- Institute for Business Informatics & New Media, University Siegen, Kohlbettstr. 15, 57072 Siegen, Germany
- Toshimi Ogawa
- Smart-Aging Research Center, Tohoku University, Sendai 980-8575, Japan
3
Russo S, Lorusso L, D’Onofrio G, Ciccone F, Tritto M, Nocco S, Cardone D, Perpetuini D, Lombardo M, Lombardo D, Sancarlo D, Greco A, Merla A, Giuliani F. Assessing Feasibility of Cognitive Impairment Testing Using Social Robotic Technology Augmented with Affective Computing and Emotional State Detection Systems. Biomimetics (Basel) 2023; 8:475. [PMID: 37887606 PMCID: PMC10604561 DOI: 10.3390/biomimetics8060475] [Received: 09/08/2023] [Revised: 09/26/2023] [Accepted: 09/27/2023]
Abstract
Social robots offer a promising way to support the diagnosis, treatment, care, and support of older people with dementia. The aim of this study is to validate the Mini-Mental State Examination (MMSE) administered by the Pepper robot, equipped with systems that detect the psychophysical and emotional states of older patients. Our main result is that the Pepper robot is capable of administering the MMSE and that cognitive status is not a determinant of the effective use of a social robot: people with mild cognitive impairment appreciate the robot and interact with it. Acceptability is not strictly tied to user experience, but willingness to interact with the robot is an important variable for engagement. We demonstrate the feasibility of a novel approach that could, in the future, lead to more natural human-machine interaction when delivering cognitive tests with the aid of a social robot and a Computational Psychophysiology Module (CPM).
Affiliation(s)
- Sergio Russo
- Research & Innovation Unit, Foundation IRCCS Casa Sollievo della Sofferenza, 71013 San Giovanni Rotondo, Italy
- Letizia Lorusso
- Research & Innovation Unit, Foundation IRCCS Casa Sollievo della Sofferenza, 71013 San Giovanni Rotondo, Italy
- Interdisciplinary Department of Medicine, School of Medical Statistics and Biometry, University of Bari Aldo Moro, 70124 Bari, Italy
- Grazia D’Onofrio
- Clinical Psychology Service, Health Department, IRCCS Casa Sollievo della Sofferenza, 71013 San Giovanni Rotondo, Italy
- Filomena Ciccone
- Clinical Psychology Service, Health Department, IRCCS Casa Sollievo della Sofferenza, 71013 San Giovanni Rotondo, Italy
- Michele Tritto
- Next2U Srl, Via dei Peligni 137, 65127 Pescara, Italy
- Sergio Nocco
- Next2U Srl, Via dei Peligni 137, 65127 Pescara, Italy
- Daniela Cardone
- Department of Engineering and Geology, University G. D’Annunzio of Chieti-Pescara, 65127 Pescara, Italy
- David Perpetuini
- Department of Engineering and Geology, University G. D’Annunzio of Chieti-Pescara, 65127 Pescara, Italy
- Marco Lombardo
- Behaviour Labs S.r.l.s., Piazza Gen. di Brigata Luigi Sapienza 22, 95030 Sant’Agata Li Battiati, Italy
- Daniele Lombardo
- Behaviour Labs S.r.l.s., Piazza Gen. di Brigata Luigi Sapienza 22, 95030 Sant’Agata Li Battiati, Italy
- Daniele Sancarlo
- Geriatrics Unit, Foundation IRCCS Casa Sollievo della Sofferenza, 71013 San Giovanni Rotondo, Italy
- Antonio Greco
- Geriatrics Unit, Foundation IRCCS Casa Sollievo della Sofferenza, 71013 San Giovanni Rotondo, Italy
- Arcangelo Merla
- Department of Engineering and Geology, University G. D’Annunzio of Chieti-Pescara, 65127 Pescara, Italy
- Francesco Giuliani
- Research & Innovation Unit, Foundation IRCCS Casa Sollievo della Sofferenza, 71013 San Giovanni Rotondo, Italy
4
Cîrneanu AL, Popescu D, Iordache D. New Trends in Emotion Recognition Using Image Analysis by Neural Networks, A Systematic Review. Sensors (Basel) 2023; 23:7092. [PMID: 37631629 PMCID: PMC10458371 DOI: 10.3390/s23167092] [Received: 07/04/2023] [Revised: 07/29/2023] [Accepted: 08/02/2023]
Abstract
Facial emotion recognition (FER) is a computer vision task aimed at detecting and classifying human emotional expressions. FER systems are currently used in a wide range of applications in areas such as education, healthcare, and public safety; detection and recognition accuracy is therefore very important. Like any computer vision task based on image analysis, FER lends itself to integration with artificial intelligence in the form of different neural network varieties, especially deep neural networks, which have shown great potential in recent years thanks to their feature extraction capabilities and computational efficiency over large datasets. In this context, this paper reviews the latest developments in FER, with a focus on recent neural network models that implement specific facial image analysis algorithms to detect and recognize facial emotions. Its scope is to present, from historical and conceptual perspectives, the evolution of the neural network architectures that have achieved significant results in FER. The paper favors convolutional neural network (CNN)-based architectures over alternatives such as recurrent neural networks and generative adversarial networks, highlighting the key elements and performance of each architecture and the advantages and limitations of the models proposed in the analyzed papers. It also presents the datasets currently available for emotion recognition from facial expressions and micro-expressions, and highlights the use of FER systems in domains such as healthcare, education, security, and the social IoT. Finally, open issues and possible future developments in FER are identified.
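The core operation behind the CNN feature extractors this review surveys is the 2D convolution, which can be shown with a dependency-free sketch: a vertical-edge kernel responds strongly where pixel intensity changes left-to-right, the kind of low-level feature early CNN layers learn. The toy image and kernel below are illustrative, not taken from any surveyed model.

```python
# Minimal pure-Python 2D convolution (valid padding, no stride, no flipping).
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            # Dot product of the kernel with the image patch at (i, j).
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A 4-wide "image" with a dark-to-bright vertical edge in the middle.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1]]  # responds only where intensity increases to the right
print(conv2d(image, kernel))  # [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
```

Real FER networks stack many such learned kernels with nonlinearities and pooling, but each layer's primitive is exactly this sliding dot product.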
Affiliation(s)
- Andrada-Livia Cîrneanu
- Faculty of Automatic Control and Computers, University Politehnica of Bucharest, 060042 Bucharest, Romania
- Dan Popescu
- Faculty of Automatic Control and Computers, University Politehnica of Bucharest, 060042 Bucharest, Romania
- Dragoș Iordache
- The National Institute for Research & Development in Informatics-ICI Bucharest, 011455 Bucharest, Romania
5
Mahzoon H, Ueda A, Yoshikawa Y, Ishiguro H. Effect of robot’s vertical body movement on its perceived emotion: A preliminary study on vertical oscillation and transition. PLoS One 2022; 17:e0271789. [PMID: 35947582 PMCID: PMC9365172 DOI: 10.1371/journal.pone.0271789] [Received: 05/21/2021] [Accepted: 07/07/2022]
Abstract
The emotion expressions of social robots are some of the most important developments in recent studies on human–robot interactions (HRIs). Several research studies have been conducted to assess effective factors to improve the quality of emotion expression of the robots. In this study, we examined the effects of a robot’s vertical oscillation and transition on the quality of its emotion expression, where the former indicates the periodic up/down movement of the body of the robot, while the latter indicates a one-time up or down movement. Short-term and long-term emotion expressions of the robot were studied independently for the four basic emotions described in the circumplex model of emotions: joy, anger, sadness, and relief. We designed an experiment with an adequate statistical power and minimum sample size of human subjects based on a priori power analysis. Human subjects were asked to evaluate the robot’s emotion expressions by watching its video with/without vertical movement. The results of the experiment showed that for the long-term emotions, the speed of vertical oscillation corresponded to the degree of arousal of the emotion expression as noted in the circumplex model; this indicated that fast oscillations improved the emotion expression with a higher degree of arousal, such as joy and anger, while slow or no oscillations were more suited to emotions with a lower degree of arousal, such as sadness and relief. For the short-term emotions, the direction of the vertical transition corresponded to the degree of valence for most of the expressed emotions, while the speed of vertical oscillation reflected the degree of arousal. The findings of this study can be adopted in the development of conversational robots to enhance their emotion expression.
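The reported correspondence (oscillation speed tracking arousal for long-term emotions, transition direction tracking valence for short-term ones) can be captured in a toy lookup. The valence/arousal coordinates below are illustrative placements of the four emotions on the circumplex model, not values from the study, and the speed/direction labels are simplifications of its findings.

```python
# Illustrative circumplex coordinates: emotion -> (valence, arousal) in [-1, 1].
CIRCUMPLEX = {
    "joy":     ( 0.8,  0.7),
    "anger":   (-0.7,  0.8),
    "sadness": (-0.8, -0.6),
    "relief":  ( 0.6, -0.5),
}

def oscillation_speed(emotion):
    """Long-term expression: high-arousal emotions get fast oscillation, low-arousal slow/none."""
    _, arousal = CIRCUMPLEX[emotion]
    return "fast" if arousal > 0 else "slow"

def transition_direction(emotion):
    """Short-term expression: transition direction follows valence (up = positive)."""
    valence, _ = CIRCUMPLEX[emotion]
    return "up" if valence > 0 else "down"

print(oscillation_speed("joy"), transition_direction("sadness"))  # fast down
```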
6
Valagkouti IA, Troussas C, Krouska A, Feidakis M, Sgouropoulou C. Emotion Recognition in Human–Robot Interaction Using the NAO Robot. Computers 2022; 11:72. [DOI: 10.3390/computers11050072]
Abstract
Affective computing can be implemented across many fields in order to provide a unique experience by tailoring services and products according to each person’s needs and interests. More specifically, digital learning and robotics in education can benefit from affective computing with a redesign of the curriculum’s contents based on students’ emotions during teaching. This key feature is observed during traditional learning methods, and robot tutors are adapting to it gradually. Following this trend, this work focused on creating a game that aims to raise environmental awareness by using the social robot NAO as a conversation agent. This quiz-like game supports emotion recognition with DeepFace, allowing users to review their answers if a negative emotion is detected. A version of this game was tested during real-life circumstances and produced favorable results, both for emotion analysis and overall user enjoyment.
7
Filippini C, Di Crosta A, Palumbo R, Perpetuini D, Cardone D, Ceccato I, Di Domenico A, Merla A. Automated Affective Computing Based on Bio-Signals Analysis and Deep Learning Approach. Sensors (Basel) 2022; 22:1789. [PMID: 35270936 PMCID: PMC8914721 DOI: 10.3390/s22051789] [Received: 01/26/2022] [Revised: 02/21/2022] [Accepted: 02/22/2022]
Abstract
The breadth of possible applications has made emotion recognition a compelling yet challenging problem in computer science, human-machine interaction, and affective computing, fields that increasingly require real-time operation in everyday-life scenarios. However, while highly desirable, accurate and automated emotion classification remains a challenging issue. To this end, this study presents an automated emotion recognition model based on easily accessible physiological signals and deep learning (DL). As the DL algorithm, a feedforward neural network was employed, and its outcome was compared with canonical machine learning algorithms such as random forest (RF). The developed DL model relied on the combined use of wearables and contactless technologies, such as thermal infrared imaging. The model classifies the emotional state into four classes, derived from the combination of valence and arousal (the four-quadrant structure of the circumplex model of affect), with an overall accuracy of 70%, outperforming the 66% accuracy reached by the RF model. Considering the ecological and agile nature of the techniques used, the proposed model could lead to innovative applications in the affective computing field.
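The four target classes here are simply the quadrants of the circumplex model of affect, i.e. the sign combinations of valence and arousal. A minimal labeler makes the class definition concrete; the quadrant names and example emotions are illustrative, not the paper's label set, and this sketch shows only the label mapping, not the neural network.

```python
# Map a (valence, arousal) pair to its circumplex quadrant.
def circumplex_quadrant(valence, arousal):
    if valence >= 0 and arousal >= 0:
        return "high-arousal positive"   # e.g. joy
    if valence < 0 and arousal >= 0:
        return "high-arousal negative"   # e.g. anger
    if valence < 0:
        return "low-arousal negative"    # e.g. sadness
    return "low-arousal positive"        # e.g. relief

print(circumplex_quadrant(0.5, 0.5))  # high-arousal positive
```

In the paper's pipeline, a classifier predicts one of these four labels directly from physiological features rather than from explicit valence/arousal estimates.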
Affiliation(s)
- Chiara Filippini
- Department of Neurosciences, Imaging and Clinical Sciences, University G. D’Annunzio of Chieti-Pescara, 9, 66100 Chieti, Italy
- Adolfo Di Crosta
- Department of Psychological, Health and Territorial Sciences, University G. D’Annunzio of Chieti-Pescara, 9, 66100 Chieti, Italy
- Rocco Palumbo
- Department of Psychological, Health and Territorial Sciences, University G. D’Annunzio of Chieti-Pescara, 9, 66100 Chieti, Italy
- David Perpetuini
- Department of Neurosciences, Imaging and Clinical Sciences, University G. D’Annunzio of Chieti-Pescara, 9, 66100 Chieti, Italy
- Daniela Cardone
- Department of Neurosciences, Imaging and Clinical Sciences, University G. D’Annunzio of Chieti-Pescara, 9, 66100 Chieti, Italy
- Irene Ceccato
- Department of Neurosciences, Imaging and Clinical Sciences, University G. D’Annunzio of Chieti-Pescara, 9, 66100 Chieti, Italy
- Alberto Di Domenico
- Department of Psychological, Health and Territorial Sciences, University G. D’Annunzio of Chieti-Pescara, 9, 66100 Chieti, Italy
- Arcangelo Merla
- Department of Neurosciences, Imaging and Clinical Sciences, University G. D’Annunzio of Chieti-Pescara, 9, 66100 Chieti, Italy
- Correspondence: Tel.: +39-0871-3556-954