1. Chen T, Helminen TM, Linnunsalo S, Hietanen JK. Autonomic and facial electromyographic responses to watching eyes. Iperception 2024; 15:20416695231226059. PMID: 38268784; PMCID: PMC10807318; DOI: 10.1177/20416695231226059.
Abstract
We measured participants' psychophysiological responses and gaze behavior while they viewed a stimulus person's direct and averted gaze in three conditions manipulating the participants' experience of being watched. The results showed that skin conductance responses and heart rate deceleration responses were greater to direct than to averted gaze only in the condition in which the participants had the experience of being watched by the other individual. In contrast, gaze direction had no effect on these responses when the participants were led to believe that the other individual could not watch them or when the stimulus person was presented in a pre-recorded video. Importantly, the eye tracking measures showed no differences in participants' looking behavior between these stimulus presentation conditions. The facial electromyography results suggested that direct gaze elicited greater zygomatic and periocular responses than averted gaze did, independent of the presentation condition. It was concluded that the autonomic responses indexing affective arousal and attention orienting to eye contact are driven by the experience of being watched. In contrast, the facial responses seem to reflect automatized affiliative responses that can be elicited even in conditions in which seeing another's direct gaze does not signal that the self is being watched.
Affiliation(s)
- Tingji Chen
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
- Terhi M Helminen
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
- Samuli Linnunsalo
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
- Jari K Hietanen
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
2. Mok S, Park S, Whang M. Examining the Impact of Digital Human Gaze Expressions on Engagement Induction. Biomimetics (Basel) 2023; 8:610. PMID: 38132549; PMCID: PMC10742036; DOI: 10.3390/biomimetics8080610.
Abstract
With advancements in technology, digital humans are becoming increasingly sophisticated, with their application scope widening to include interactions with real people. However, research on expressions that facilitate natural engagement in interactions between real people and digital humans is scarce. With this study, we aimed to examine the differences in user engagement, as measured by subjective evaluations, eye tracking, and electroencephalogram (EEG) responses, relative to different gaze expressions in various conversational contexts. Conversational situations were categorized as face-to-face, face-to-video, and digital human interactions, with gaze expressions segmented into eye contact and gaze avoidance. Story stimuli comprising twelve sentences validated to elicit positive and negative emotional responses were employed in the experiments. A total of 45 participants (31 females and 14 males) underwent stimulation through positive and negative stories while exhibiting eye contact or gaze avoidance under each of the three conversational conditions. Engagement was assessed using subjective evaluation metrics in conjunction with measures of the subjects' gaze and brainwave activity. The findings revealed engagement disparities between the face-to-face and digital-human conversation conditions. Notably, only positive stimuli elicited variations in engagement based on gaze expression across different conversation conditions. Gaze analysis corroborated the engagement differences, aligning with prior research on social sensitivity, but only in response to positive stimuli. This research departs from traditional studies of unnatural interactions with digital humans, focusing instead on interactions with digital humans designed to mimic the appearance of real humans. This study demonstrates the potential for gaze expression to induce engagement, regardless of the human or digital nature of the conversational dyads.
Affiliation(s)
- Subin Mok
- Department of Emotion Engineering, Sangmyung University, Seoul 03016, Republic of Korea
- Sung Park
- Department of Emotion Engineering, Sangmyung University, Seoul 03016, Republic of Korea
- Mincheol Whang
- Department of Human-Centered Artificial Intelligence, Sangmyung University, Seoul 03016, Republic of Korea
3. Linnunsalo S, Küster D, Yrttiaho S, Peltola MJ, Hietanen JK. Psychophysiological responses to eye contact with a humanoid robot: Impact of perceived intentionality. Neuropsychologia 2023; 189:108668. PMID: 37619935; DOI: 10.1016/j.neuropsychologia.2023.108668.
Abstract
Eye contact with a social robot has been shown to elicit similar psychophysiological responses to eye contact with another human. However, it is becoming increasingly clear that the attention- and affect-related psychophysiological responses differentiate between direct (toward the observer) and averted gaze mainly when viewing embodied faces that are capable of social interaction, whereas pictorial or pre-recorded stimuli have no such capability. It has been suggested that genuine eye contact, as indicated by the differential psychophysiological responses to direct and averted gaze, requires a feeling of being watched by another mind. Therefore, we measured event-related potentials (N170 and frontal P300) with EEG, facial electromyography, skin conductance, and heart rate deceleration responses to seeing a humanoid robot's direct versus averted gaze, while manipulating the impression of the robot's intentionality. The results showed that the N170 and the facial zygomatic responses were greater to direct than to averted gaze of the robot, and independent of the robot's intentionality, whereas the frontal P300 responses were more positive to direct than to averted gaze only when the robot appeared intentional. The study provides further evidence that the gaze behavior of a social robot elicits attentional and affective responses and adds that the robot's seemingly autonomous social behavior plays an important role in eliciting higher-level socio-cognitive processing.
Affiliation(s)
- Samuli Linnunsalo
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
- Dennis Küster
- Cognitive Systems Lab, Department of Computer Science, University of Bremen, Bremen, Germany
- Santeri Yrttiaho
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
- Mikko J Peltola
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland; Tampere Institute for Advanced Study, Tampere University, Tampere, Finland
- Jari K Hietanen
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
4. Sun W, Chen T, Hietanen JK. Skin conductance, facial EMG, and heart rate responses in multi-person gaze interactions. Biol Psychol 2023; 176:108465. PMID: 36442581; DOI: 10.1016/j.biopsycho.2022.108465.
Abstract
Previous literature has reported enhanced affective and attentional responses to faces with a direct vs. averted gaze. Typically, in these studies, only single faces were presented. However, daily social encounters often involve interaction with more than just one person. By employing an experimental set-up in which the participants believed they were interacting with two other persons, the present study, for the first time, investigated participants' skin conductance, facial electromyographic (EMG), and heart rate deceleration responses in multi-person eye contact situations. Responses were measured in two different social contexts; i) when the participants observed eye contact between two other persons ('vicarious eye contact effect'), and ii) when the participants themselves received direct gaze either from one or two persons. The results showed that the skin conductance, facial EMG, and heart rate deceleration responses elicited by observing two other persons making eye contact did not differ from those elicited by observing one person looking at the other while the other person was not reciprocating with their gaze. As a novel finding, the results showed that receiving direct gaze from two persons elicited greater affective arousal and zygomatic EMG, but smaller heart rate deceleration responses in participants than receiving direct gaze from one person only. The findings are thoroughly discussed and it is concluded that physiological responses in multi-person interaction contexts are influenced by many social effects between the interactors and can be markedly different from those observed in two-person interactions.
Affiliation(s)
- Wenting Sun
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu 215123, China
- Tingji Chen
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu 215123, China
- Jari K Hietanen
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
5. Enoki M, Inaishi T, Noguchi H. Extraction and Evaluation of Greeting Speech-Timing and Characteristic Upper Body Motion for Robots to Gain Attention of Older Adults. Journal of Robotics and Mechatronics 2022. DOI: 10.20965/jrm.2022.p1338.
Abstract
Greeting is important for socially assistive robots to smoothly initiate conversations with older adults. Because of decreased cognitive function, older adults may occasionally be unaware of the presence of a robot. The purpose of this study was to investigate and evaluate the characteristic motion and utterance timing for greeting older adults in comparison with those for greeting non-older adults. The motion and utterance for greeting a seated target imitating an older adult and for greeting a non-older adult were measured. The utterance times of the greeting and motion parameters, such as the maximum joint angles, were calculated from the measured data, and the parameters were compared using statistical methods. According to the results, the hip bending angle for greeting older adults was 36.6° greater than that for non-older adults, and the utterance lag was 0.7 s longer. The impressions of a robot that greeted participants based on the extracted motion parameters were then compared to verify these parameter differences. Although the greeting styles did not differ significantly, it was verified that the physical robot's greeting was more impressive than that of a computer-graphics robot.
6. Being watched by a humanoid robot and a human: Effects on affect-related psychophysiological responses. Biol Psychol 2022; 175:108451. DOI: 10.1016/j.biopsycho.2022.108451.
7. Jung M, Kim J, Han K, Kim K. Social Telecommunication Experience with Full-Body Ownership Humanoid Robot. Int J Soc Robot 2022. DOI: 10.1007/s12369-022-00922-w.
8. Morillo-Mendez L, Schrooten MGS, Loutfi A, Mozos OM. Age-Related Differences in the Perception of Robotic Referential Gaze in Human-Robot Interaction. Int J Soc Robot 2022:1-13. PMID: 36185773; PMCID: PMC9510350; DOI: 10.1007/s12369-022-00926-6.
Abstract
There is increased interest in using social robots to assist older adults during their daily life activities. As social robots are designed to interact with older users, it becomes relevant to study these interactions under the lens of social cognition. Gaze following, the social ability to infer where other people are looking, deteriorates with older age. Therefore, the referential gaze from robots might not be an effective social cue for indicating spatial locations to older users. In this study, we explored the performance of older adults, middle-aged adults, and younger controls in a task assisted by the referential gaze of a Pepper robot. We examined age-related differences in task performance and in self-reported social perception of the robot. Our main findings show that referential gaze from a robot benefited task performance, although the magnitude of this facilitation was lower for older participants. Moreover, perceived anthropomorphism of the robot varied less as a result of its referential gaze in older adults. This research supports the view that social robots, even if limited in their gazing capabilities, can be effectively perceived as social entities. Additionally, it suggests that robotic social cues, usually validated with young participants, might be less effective signals for older adults. Supplementary information: the online version contains supplementary material available at 10.1007/s12369-022-00926-6.
Affiliation(s)
- Lucas Morillo-Mendez
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
- Amy Loutfi
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
- Oscar Martinez Mozos
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
9. A study on the influence of service robots' level of anthropomorphism on the willingness of users to follow their recommendations. Sci Rep 2022; 12:15266. PMID: 36088470; PMCID: PMC9463504; DOI: 10.1038/s41598-022-19501-0.
Abstract
Service robots are increasingly deployed in various industries, including tourism. In spite of extensive research on the user's experience in interaction with these robots, there are still unanswered questions about the factors that influence user compliance. Through three online studies, we investigate the effect of robot anthropomorphism and language style on customers' willingness to follow a robot's recommendations, as well as the mediating role of perceived mind and persuasiveness in this relationship. Study 1 (n = 89) shows that a service robot with a higher level of anthropomorphic features positively influences the willingness of users to follow its recommendations, while language style does not affect compliance. Study 2a (n = 168) further confirms this finding when participants were presented with a tablet vs. a service robot with an anthropomorphic appearance; again, communication style did not affect compliance. Finally, Study 2b (n = 122) supports an indirect effect of anthropomorphism level on the willingness to follow recommendations through perceived mind followed by persuasiveness. The findings provide valuable insight for enhancing human-robot interaction in service settings.
10. Eye contact avoidance in crowds: A large wearable eye-tracking study. Atten Percept Psychophys 2022; 84:2623-2640. PMID: 35996058; PMCID: PMC9630249; DOI: 10.3758/s13414-022-02541-z.
Abstract
Eye contact is essential for human interactions. We investigated whether humans are able to avoid eye contact while navigating crowds. At a science festival, we fitted 62 participants with a wearable eye tracker and instructed them to walk a route. Half of the participants were further instructed to avoid eye contact. We report that humans can flexibly allocate their gaze while navigating crowds and avoid eye contact primarily by orienting their head and eyes towards the floor. We discuss implications for crowd navigation and gaze behavior. In addition, we address a number of issues encountered in such field studies with regard to data quality, control of the environment, and participant adherence to instructions. We stress that methodological innovation and scientific progress are strongly interrelated.
11. Wever MCM, van Houtum LAEM, Janssen LHC, Wentholt WGM, Spruit IM, Tollenaar MS, Will GJ, Elzinga BM. Neural and Affective Responses to Prolonged Eye Contact with One's Own Adolescent Child and Unfamiliar Others. Neuroimage 2022; 260:119463. PMID: 35830902; DOI: 10.1016/j.neuroimage.2022.119463.
Abstract
Eye contact is crucial for the formation and maintenance of social relationships, and plays a key role in facilitating a strong parent-child bond. However, the precise neural and affective mechanisms through which eye contact impacts on parent-child relationships remain elusive. We introduce a task to assess parents' neural and affective responses to prolonged direct and averted gaze coming from their own child, and an unfamiliar child and adult. While in the scanner, 79 parents (n = 44 mothers and n = 35 fathers) were presented with prolonged (16-38 s) videos of their own child, an unfamiliar child, an unfamiliar adult, and themselves (i.e., targets), facing the camera with a direct or an averted gaze. We measured BOLD-responses, tracked parents' eye movements during the videos, and asked them to report on their mood and feelings of connectedness with the targets after each video. Parents reported improved mood and increased feelings of connectedness after prolonged exposure to direct versus averted gaze and these effects were amplified for unfamiliar targets compared to their own child, due to high affect and connectedness ratings after videos of their own child. Neuroimaging results showed that the sight of one's own child was associated with increased activity in middle occipital gyrus, fusiform gyrus and inferior frontal gyrus relative to seeing an unfamiliar child or adult. While we found no robust evidence of specific neural correlates of eye contact (i.e., contrast direct > averted gaze), an exploratory parametric analysis showed that dorsomedial prefrontal cortex (dmPFC) activity increased linearly with duration of eye contact (collapsed across all "other" targets). Eye contact-related dmPFC activity correlated positively with increases in feelings of connectedness, suggesting that this region may drive feelings of connectedness during prolonged eye contact with others. 
These results underline the importance of prolonged eye contact for affiliative processes and provide first insights into its neural correlates. This may pave the way for new research in individuals or pairs in whom affiliative processes are disrupted.
Affiliation(s)
- Mirjam C M Wever
- Department of Clinical Psychology, Leiden University, Leiden, the Netherlands; Leiden Institute for Brain and Cognition, Leiden, the Netherlands
- Lisanne A E M van Houtum
- Department of Clinical Psychology, Leiden University, Leiden, the Netherlands; Leiden Institute for Brain and Cognition, Leiden, the Netherlands
- Loes H C Janssen
- Department of Clinical Psychology, Leiden University, Leiden, the Netherlands; Leiden Institute for Brain and Cognition, Leiden, the Netherlands
- Wilma G M Wentholt
- Department of Clinical Psychology, Leiden University, Leiden, the Netherlands; Leiden Institute for Brain and Cognition, Leiden, the Netherlands
- Iris M Spruit
- Department of Clinical Psychology, Leiden University, Leiden, the Netherlands
- Marieke S Tollenaar
- Department of Clinical Psychology, Leiden University, Leiden, the Netherlands; Leiden Institute for Brain and Cognition, Leiden, the Netherlands
- Geert-Jan Will
- Department of Clinical Psychology, Utrecht University, Utrecht, the Netherlands
- Bernet M Elzinga
- Department of Clinical Psychology, Leiden University, Leiden, the Netherlands; Leiden Institute for Brain and Cognition, Leiden, the Netherlands
12. Ishikawa M, Itakura S. Pupil dilation predicts modulation of direct gaze on action value calculations. Biol Psychol 2022; 171:108340. PMID: 35460818; DOI: 10.1016/j.biopsycho.2022.108340.
Abstract
Perceiving direct gaze facilitates social cognition and behaviour. We hypothesized that direct gaze modulates decision-making, particularly the calculation of action values. To test this hypothesis, we used a reinforcement learning paradigm in situations with or without direct gaze. Forty adults participated in pupil size measurements and a two-armed bandit task, conducted with 70% and 30% reward probabilities for the two options. During the task, a female face with direct gaze (DG condition) or closed eyes (CE condition) was presented from the start of each trial. The results showed that the behavioural bias toward choices with 70% reward probability increased more in the DG condition than in the CE condition and more than the expected reward value predicted. This bias toward the 70%-reward choices in the DG condition was predicted by pupil dilation to direct gaze. These results suggest that participants over-evaluated the expected reward value in the DG condition, and this effect may be related to subjective expectations of rewarding events indexed by pupil dilation. Perceiving direct gaze thus appears to drive reward expectations that modulate action value calculations, thereby facilitating cognitive processing and behaviour.
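The two-armed bandit paradigm described in this abstract can be sketched with a minimal Q-learning simulation. This is a generic illustration, not the authors' analysis code: the learning rate, exploration rate, and trial count are assumptions, and the gaze manipulation itself is not modeled.

```python
import random

def run_bandit(trials=2000, probs=(0.7, 0.3), alpha=0.1, epsilon=0.1, seed=0):
    """Simulate an epsilon-greedy Q-learner on a two-armed bandit.

    probs gives the reward probability of each arm (70% vs. 30%,
    as in the task described above).
    """
    rng = random.Random(seed)
    q = [0.0, 0.0]          # running action-value estimates per arm
    choices = []
    for _ in range(trials):
        # explore with probability epsilon, otherwise exploit the better arm
        if rng.random() < epsilon:
            arm = rng.randrange(2)
        else:
            arm = 0 if q[0] >= q[1] else 1
        reward = 1.0 if rng.random() < probs[arm] else 0.0
        # prediction-error update: nudge the estimate toward the outcome
        q[arm] += alpha * (reward - q[arm])
        choices.append(arm)
    return q, choices

q, choices = run_bandit()
better_arm_rate = choices.count(0) / len(choices)
```

A gaze effect of the kind the abstract reports could then be modeled, for instance, as an additive bonus on the expected value of the options in the DG condition, amplifying the learned bias toward the 70% arm; that extension is hypothetical.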
Affiliation(s)
- Mitsuhiko Ishikawa
- Centre for Baby Science, Doshisha University, 4-1-1 Kizugawadai, Kizugawa, Kyoto 619-0295, Japan; Japan Society for the Promotion of Science
- Shoji Itakura
- Centre for Baby Science, Doshisha University, 4-1-1 Kizugawadai, Kizugawa, Kyoto 619-0295, Japan
13. Human face and gaze perception is highly context specific and involves bottom-up and top-down neural processing. Neurosci Biobehav Rev 2021; 132:304-323. PMID: 34861296; DOI: 10.1016/j.neubiorev.2021.11.042.
Abstract
This review summarizes human perception and processing of face and gaze signals. Face and gaze signals are important means of non-verbal social communication. The review highlights that: (1) some evidence suggests that the perception and processing of facial information starts in the prenatal period; (2) the perception and processing of face identity, expression, and gaze direction are highly context specific, the effects of race and culture being a case in point. Culture affects, through experiential shaping and social categorization, the way in which information on face and gaze is collected and perceived; (3) face and gaze processing occurs in the so-called 'social brain'. Accumulating evidence suggests that the processing of facial identity, facial emotional expression, and gaze involves two parallel and interacting pathways: a fast and crude subcortical route and a slower cortical pathway. The flow of information is bi-directional and includes bottom-up and top-down processing. The cortical networks particularly include the fusiform gyrus, superior temporal sulcus (STS), intraparietal sulcus, temporoparietal junction, and medial prefrontal cortex.
14. Zhu L, Zhang J. Does a Sense of Social Presence During Conversation Affect Student's Shared Memory? Evidence From SS-RIF Paradigm. Front Public Health 2021; 9:728762. PMID: 34513793; PMCID: PMC8429795; DOI: 10.3389/fpubh.2021.728762.
Abstract
People constantly talk to one another about the past, and in so doing, they recount certain details while remaining silent about others. Collaborative or conversational remembering plays an important role in establishing shared representations of the past (e.g., the 9/11 attacks, Covid-19). According to the socially shared retrieval-induced forgetting (SS-RIF) effect, a listener will forget relevant but unpracticed information during communication, due to intentional or unintentional selective retrieval by the speaker. The SS-RIF paradigm has been applied to explain how collective memory is shaped within the context of conversation and discourse. This study sought to determine whether SS-RIF occurs only during face-to-face communication, or whether shared memories can be developed through other types of conversation common in modern society. We also investigated whether social interaction in the real-world presence of others is a necessary condition for inducing SS-RIF, and whether listeners experience different degrees of SS-RIF due to different levels of perceived social presence. We observed the SS-RIF phenomenon in listeners in both the real-life and the video condition; the degree of forgetting was the same for the two conditions. These results indicate that social presence may not be associated with SS-RIF. Public silence affects the formation of collective memory regardless of the face-to-face presence of others, and thus physical presence is not necessary to induce SS-RIF.
Affiliation(s)
- Lin Zhu
- School of Psychology, Fujian Normal University, Fuzhou, China
- Jinkun Zhang
- School of Psychology, Fujian Normal University, Fuzhou, China
15. Hietanen JK, Peltola MJ. The eye contact smile: The effects of sending and receiving a direct gaze. Visual Cognition 2021. DOI: 10.1080/13506285.2021.1915904.
Affiliation(s)
- Jari K. Hietanen
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
- Mikko J. Peltola
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
16. Schellen E, Bossi F, Wykowska A. Robot Gaze Behavior Affects Honesty in Human-Robot Interaction. Front Artif Intell 2021; 4:663190. PMID: 34046585; PMCID: PMC8144295; DOI: 10.3389/frai.2021.663190.
Abstract
As the use of humanoid robots proliferates, an increasing number of people may find themselves face-to-“face” with a robot in everyday life. Although there is a plethora of information available on facial social cues and how we interpret them in the field of human-human social interaction, we cannot assume that these findings flawlessly transfer to human-robot interaction. Therefore, more research on facial cues in human-robot interaction is required. This study investigated deception in a human-robot interaction context, focusing on the effect that eye contact with a robot has on honesty toward it. In an iterative task, participants could assist a humanoid robot by providing it with correct information, or potentially secure a reward for themselves by providing it with incorrect information. Results show that participants are increasingly honest after the robot establishes eye contact with them, but only if this is in response to deceptive behavior. Behavior is not influenced by the establishment of eye contact if the participant is actively engaging in honest behavior. These findings support the notion that humanoid robots can be perceived as, and treated like, social agents, since the effect described herein mirrors one present in human-human social interaction.
Affiliation(s)
- Elef Schellen
- Social Cognition in Human-Robot Interaction, Istituto Italiano di Tecnologia, Genoa, Italy
- Francesco Bossi
- Social Cognition in Human-Robot Interaction, Istituto Italiano di Tecnologia, Genoa, Italy; MoMiLab Research Unit, IMT School for Advanced Studies Lucca, Lucca, Italy
- Agnieszka Wykowska
- Social Cognition in Human-Robot Interaction, Istituto Italiano di Tecnologia, Genoa, Italy
17. Tanioka T, Yokotani T, Tanioka R, Betriana F, Matsumoto K, Locsin R, Zhao Y, Osaka K, Miyagawa M, Schoenhofer S. Development Issues of Healthcare Robots: Compassionate Communication for Older Adults with Dementia. Int J Environ Res Public Health 2021; 18:4538. PMID: 33923353; PMCID: PMC8123161; DOI: 10.3390/ijerph18094538.
Abstract
Although progress is being made in affective computing, issues remain in enabling the effective expression of compassionate communication by healthcare robots. Identifying, describing and reconciling these concerns are important in order to provide quality contemporary healthcare for older adults with dementia. The purpose of this case study was to explore the development issues of healthcare robots in expressing compassionate communication for older adults with dementia. An exploratory descriptive case study was conducted with the Pepper robot and older adults with dementia using high-tech digital cameras to document significant communication proceedings that occurred during the activities. Data were collected in December 2020. The application program for an intentional conversation using Pepper was jointly developed by Tanioka’s team and the Xing Company, allowing Pepper’s words and head movements to be remotely controlled. The analysis of the results revealed four development issues, namely, (1) accurate sensing behavior for “listening” to voices appropriately and accurately interacting with subjects; (2) inefficiency in “listening” and “gaze” activities; (3) fidelity of behavioral responses; and (4) deficiency in natural language processing AI development, i.e., the ability to respond actively to situations that were not pre-programmed by the developer. Conversational engagements between the Pepper robot and patients with dementia illustrated a practical usage of technologies with artificial intelligence and natural language processing. The development issues found in this study require reconciliation in order to enhance the potential for healthcare robot engagement in compassionate communication in the care of older adults with dementia.
Affiliation(s)
- Tetsuya Tanioka
- Institute of Biomedical Sciences, Tokushima University, Tokushima 770-8509, Japan
- Anne Boykin Institute, Florida Atlantic University, Boca Raton, FL 33431, USA
- Correspondence: Tel.: +81-88-633-9021
- Tomoya Yokotani
- Graduate School of Health Sciences, Tokushima University, Tokushima 770-8509, Japan
- Ryuichi Tanioka
- Graduate School of Health Sciences, Tokushima University, Tokushima 770-8509, Japan
- Feni Betriana
- Graduate School of Health Sciences, Tokushima University, Tokushima 770-8509, Japan
- Kazuyuki Matsumoto
- Graduate School of Engineering, Tokushima University, Tokushima 770-8506, Japan
- Rozzano Locsin
- Institute of Biomedical Sciences, Tokushima University, Tokushima 770-8509, Japan
- Christine E. Lynn College of Nursing, Florida Atlantic University, Boca Raton, FL 33431, USA
- Yueren Zhao
- Department of Psychiatry, Fujita Health University, Aichi 470-1192, Japan
- Kyoko Osaka
- Department of Nursing, Nursing Course of Kochi Medical School, Kochi University, Kochi 783-8505, Japan
- Misao Miyagawa
- Department of Nursing, Faculty of Health and Welfare, Tokushima Bunri University, Tokushima 770-8514, Japan
- Savina Schoenhofer
- Anne Boykin Institute, Florida Atlantic University, Boca Raton, FL 33431, USA