1
Schiller IS, Breuer C, Aspöck L, Ehret J, Bönsch A, Kuhlen TW, Fels J, Schlittmeier SJ. A lecturer's voice quality and its effect on memory, listening effort, and perception in a VR environment. Sci Rep 2024; 14:12407. PMID: 38811832; PMCID: PMC11137055; DOI: 10.1038/s41598-024-63097-6.
Abstract
Many lecturers develop voice problems, such as hoarseness. Nevertheless, research on how voice quality influences listeners' perception, comprehension, and retention of spoken language is limited to a small number of audio-only experiments. We aimed to address this gap by using audio-visual virtual reality (VR) to investigate the impact of a lecturer's hoarseness on university students' heard text recall, listening effort, and listening impression. Fifty participants were immersed in a virtual seminar room, where they engaged in a dual-task paradigm. They listened to narratives presented by a virtual female professor, who spoke in either a typical or a hoarse voice, while simultaneously performing a secondary task. Results revealed significantly prolonged secondary-task response times with the hoarse voice compared to the typical voice, indicating increased listening effort. Subjectively, participants rated the hoarse voice as more annoying, more effortful to listen to, and more impeding to their cognitive performance. No effect of voice quality was found on heard text recall, suggesting that, while hoarseness may compromise certain aspects of spoken language processing, this does not necessarily result in reduced information retention. In summary, our findings underscore the importance of promoting vocal health among lecturers, which may contribute to enhanced listening conditions in learning spaces.
Affiliation(s)
- Isabel S Schiller: Work and Engineering Psychology, RWTH Aachen University, Aachen, 52066, Germany
- Carolin Breuer: Institute for Hearing Technology and Acoustics, RWTH Aachen University, Aachen, 52074, Germany
- Lukas Aspöck: Institute for Hearing Technology and Acoustics, RWTH Aachen University, Aachen, 52074, Germany
- Jonathan Ehret: Visual Computing Institute, RWTH Aachen University, Aachen, 52074, Germany
- Andrea Bönsch: Visual Computing Institute, RWTH Aachen University, Aachen, 52074, Germany
- Torsten W Kuhlen: Visual Computing Institute, RWTH Aachen University, Aachen, 52074, Germany
- Janina Fels: Institute for Hearing Technology and Acoustics, RWTH Aachen University, Aachen, 52074, Germany
2
Mok S, Park S, Whang M. Examining the Impact of Digital Human Gaze Expressions on Engagement Induction. Biomimetics (Basel) 2023; 8:610. PMID: 38132549; PMCID: PMC10742036; DOI: 10.3390/biomimetics8080610.
Abstract
With advancements in technology, digital humans are becoming increasingly sophisticated, with their application scope widening to include interactions with real people. However, research on expressions that facilitate natural engagement in interactions between real people and digital humans is scarce. With this study, we aimed to examine the differences in user engagement, as measured by subjective evaluations, eye tracking, and electroencephalogram (EEG) responses, relative to different gaze expressions in various conversational contexts. Conversational situations were categorized as face-to-face, face-to-video, and digital human interactions, with gaze expressions segmented into eye contact and gaze avoidance. Story stimuli comprising twelve sentences validated to elicit positive and negative emotional responses were employed in the experiments. A total of 45 participants (31 females and 14 males) were presented with positive and negative stories under eye contact or gaze avoidance in each of the three conversational conditions. Engagement was assessed using subjective evaluation metrics in conjunction with measures of the participants' gaze and brainwave activity. The findings revealed engagement disparities between the face-to-face and digital-human conversation conditions. Notably, only positive stimuli elicited variations in engagement based on gaze expression across different conversation conditions. Gaze analysis corroborated the engagement differences, aligning with prior research on social sensitivity, but only in response to positive stimuli. This research departs from traditional studies of unnatural interactions with digital humans, focusing instead on interactions with digital humans designed to mimic the appearance of real humans. This study demonstrates the potential for gaze expression to induce engagement, regardless of the human or digital nature of the conversational dyads.
Affiliation(s)
- Subin Mok: Department of Emotion Engineering, Sangmyung University, Seoul 03016, Republic of Korea
- Sung Park: Department of Emotion Engineering, Sangmyung University, Seoul 03016, Republic of Korea
- Mincheol Whang: Department of Human-Centered Artificial Intelligence, Sangmyung University, Seoul 03016, Republic of Korea
3
Hadley LV, Culling JF. Timing of head turns to upcoming talkers in triadic conversation: Evidence for prediction of turn ends and interruptions. Front Psychol 2022; 13:1061582. PMID: 36605274; PMCID: PMC9807761; DOI: 10.3389/fpsyg.2022.1061582.
Abstract
In conversation, people are able to listen to an utterance and respond within only a few hundred milliseconds. It takes substantially longer to prepare even a simple utterance, suggesting that interlocutors may make use of predictions about when the talker is about to end. But it is not only the upcoming talker that needs to anticipate the prior talker's turn ending; listeners who are simply following the conversation could also benefit from predicting the turn end in order to shift attention appropriately with the turn switch. In this paper, we examined whether people predict upcoming turn ends when watching conversational turns switch between others, by analysing natural conversations. These conversations were between triads of older adults in different levels and types of noise. The analysis focused on the observer during turn switches between the other two parties, using head orientation (i.e., turns from one talker to the next) to identify when their focus moved from one talker to the next. For non-overlapping utterances, observers started to turn to the upcoming talker before the prior talker had finished speaking in 17% of turn switches (rising to 26% when accounting for motor-planning time). For overlapping utterances, observers started to turn towards the interrupter before they interrupted in 18% of turn switches (rising to 33% when accounting for motor-planning time). The timing of head turns was more precise at lower than higher noise levels, and was not affected by noise type. These findings demonstrate that listeners in natural group conversation often exhibit head movements that anticipate the end of one conversational turn and the beginning of another. Furthermore, this work demonstrates the value of analysing head movement as a cue to social attention, which could be relevant for advancing communication technology such as hearing devices.
Affiliation(s)
- Lauren V. Hadley: Hearing Sciences – Scottish Section, School of Medicine, University of Nottingham, Glasgow, United Kingdom
- John F. Culling: School of Psychology, Cardiff University, Cardiff, United Kingdom
4
Jhan XD, Wong SK, Ebrahimi E, Lai Y, Huang WC, Babu SV. Effects of Small Talk With a Crowd of Virtual Humans on Users' Emotional and Behavioral Responses. IEEE Trans Vis Comput Graph 2022; 28:3767-3777. PMID: 36049003; DOI: 10.1109/tvcg.2022.3203107.
Abstract
In this contribution, we empirically investigated the effect of small talk on users' non-verbal behaviors and emotions when they interacted with a crowd of virtual humans (VHs) with positive behavioral dispositions. Users were tasked with collecting items in a virtual marketplace via natural speech-based dialogue with a crowd of virtual pedestrians and vendors. The users were able to engage in natural speech-based conversation drawing on a predefined corpus of small talk covering commonplace topics, such as the weather, general concerns, and entertainment, modeled on similar real-life situations. For instance, the VHs with the small talk ability would ask the users simple questions to make small talk or remind the users of their belongings. We conducted a between-subjects empirical evaluation to investigate whether user behaviors and emotions differed between a small talk condition and a non-small talk condition, and examined participant gender effects. We collected objective and subjective measures to analyze users' emotions and social interaction behaviors when in conversation with VHs that either possessed small-talk capability or not, in addition to their task- and goal-oriented dialogue capabilities. Our results revealed that the VHs with small talk capability could alter the emotions and non-verbal behaviors of the users. Furthermore, the non-verbal behaviors of female and male participants differed greatly in the presence or absence of small talk.
5
6
Van Pinxteren MM, Pluymaekers M, Lemmink JG. Human-like communication in conversational agents: a literature review and research agenda. Journal of Service Management 2020. DOI: 10.1108/josm-06-2019-0175.
Abstract
Purpose: Conversational agents (chatbots, avatars and robots) are increasingly substituting human employees in service encounters. Their presence offers many potential benefits, but customers are reluctant to engage with them. A possible explanation is that conversational agents do not make optimal use of communicative behaviors that enhance relational outcomes. The purpose of this paper is to identify which human-like communicative behaviors used by conversational agents have positive effects on relational outcomes and which additional behaviors could be investigated in future research.
Design/methodology/approach: This paper presents a systematic review of 61 articles that investigated the effects of communicative behaviors used by conversational agents on relational outcomes. A taxonomy is created of all behaviors investigated in these studies, and a research agenda is constructed on the basis of an analysis of their effects and a comparison with the literature on human-to-human service encounters.
Findings: The communicative behaviors can be classified along two dimensions: modality (verbal, nonverbal, appearance) and footing (similarity, responsiveness). Regarding the research agenda, it is noteworthy that some categories of behaviors show mixed results and some behaviors that are effective in human-to-human interactions have not yet been investigated in conversational agents.
Practical implications: By identifying potentially effective communicative behaviors in conversational agents, this study assists managers in optimizing encounters between conversational agents and customers.
Originality/value: This is the first study that develops a taxonomy of communicative behaviors in conversational agents and uses it to identify avenues for future research.
7
Arai H, Kimoto M, Iio T, Shimohara K, Matsumura R, Shiomi M. How Can Robot's Gaze Ratio and Body Direction Show an Awareness of Priority to the People With Whom it is Interacting? IEEE Robot Autom Lett 2019. DOI: 10.1109/lra.2019.2929992.
8
Baker AL, Phillips EK, Ullman D, Keebler JR. Toward an Understanding of Trust Repair in Human-Robot Interaction. ACM Trans Interact Intell Syst 2018. DOI: 10.1145/3181671.
Abstract
Gone are the days of robots solely operating in isolation, without direct interaction with people. Rather, robots are increasingly being deployed in environments and roles that require complex social interaction with humans. The implementation of human-robot teams continues to increase as technology develops in tandem with the state of human-robot interaction (HRI) research. Trust, a major component of human interaction, is an important facet of HRI. However, the ideas of trust repair and trust violations are understudied in the HRI literature. Trust repair is the activity of rebuilding trust after one party breaks the trust of another. These trust breaks are referred to as trust violations. Just as with humans, trust violations with robots are inevitable; as a result, a clear understanding of the process of HRI trust repair must be developed in order to ensure that a human-robot team can continue to perform well after a trust violation. Previous research on human-automation trust and human-human trust can serve as starting places for exploring trust repair in HRI. Although existing models of human-automation and human-human trust are helpful, they do not account for some of the complexities of building and maintaining trust in unique relationships between humans and robots. The purpose of this article is to provide a foundation for exploring human-robot trust repair by drawing upon prior work in the human-robot, human-automation, and human-human trust literature, concluding with recommendations for advancing this body of work.
9
Robb A, Kleinsmith A, Cordar A, White C, Lampotang S, Wendling A, Lok B. Do Variations in Agency Indirectly Affect Behavior with Others? An Analysis of Gaze Behavior. IEEE Trans Vis Comput Graph 2016; 22:1336-1345. PMID: 26780812; DOI: 10.1109/tvcg.2016.2518405.
Abstract
In a group setting, it is possible for attributes of one group member to indirectly affect how other group members are perceived. In this paper, we explore whether one group member's agency (i.e., whether they are real or virtual) can indirectly affect behavior with other group members. We also consider whether variations in the agency of a group member directly affect behavior with that group member. To do so, we examined gaze behavior during a team training exercise, in which sixty-nine nurses worked with a surgeon and an anesthesiologist to prepare a simulated patient for surgery. The agency of the surgeon and the anesthesiologist was varied between conditions. Nurses' gaze behavior was coded using videos of their interactions. Agency was observed to directly affect behavior, such that participants spent more time gazing at virtual teammates than at human teammates. However, participants continued to obey polite gaze norms with virtual teammates. In contrast, agency was not observed to indirectly affect gaze behavior: the presence of a second human did not affect participants' gaze behavior with virtual teammates.