1
Flanagan T, Georgiou NC, Scassellati B, Kushnir T. School-age children are more skeptical of inaccurate robots than adults. Cognition 2024; 249:105814. PMID: 38763071. DOI: 10.1016/j.cognition.2024.105814.
Abstract
We expect children to learn new words, skills, and ideas from various technologies. When learning from humans, children prefer people who are reliable and trustworthy, yet children also forgive people's occasional mistakes. Are the dynamics of children learning from technologies, which can also be unreliable, similar to learning from humans? We tackle this question by focusing on early childhood, an age at which children are expected to master foundational academic skills. In this project, 168 4-7-year-old children (Study 1) and 168 adults (Study 2) played a word-guessing game with either a human or robot. The partner first gave a sequence of correct answers, but then followed this with a sequence of wrong answers, with a reaction following each one. Reactions varied by condition, either expressing an accident, an accident marked with an apology, or an unhelpful intention. We found that older children were less trusting than both younger children and adults and were even more skeptical after errors. Trust decreased most rapidly when errors were intentional, but only children (and especially older children) outright rejected help from intentionally unhelpful partners. As an exception to this general trend, older children maintained their trust for longer when a robot (but not a human) apologized for its mistake. Our work suggests that educational technology design cannot be one size fits all but rather must account for developmental changes in children's learning goals.
Affiliation(s)
- Teresa Flanagan
- Department of Psychology & Neuroscience, Duke University, 417 Chapel Drive, Durham, NC 27701, United States of America
- Nicholas C Georgiou
- Department of Computer Science, Yale University, 51 Prospect Street, New Haven, CT 06511, United States of America
- Brian Scassellati
- Department of Computer Science, Yale University, 51 Prospect Street, New Haven, CT 06511, United States of America
- Tamar Kushnir
- Department of Psychology & Neuroscience, Duke University, 417 Chapel Drive, Durham, NC 27701, United States of America
2
Vagnetti R, Camp N, Story M, Ait-Belaid K, Mitra S, Zecca M, Di Nuovo A, Magistro D. Instruments for Measuring Psychological Dimensions in Human-Robot Interaction: Systematic Review of Psychometric Properties. J Med Internet Res 2024; 26:e55597. PMID: 38682783. DOI: 10.2196/55597.
Abstract
BACKGROUND Numerous user-related psychological dimensions can significantly influence the dynamics between humans and robots. For developers and researchers, it is crucial to have a comprehensive understanding of the psychometric properties of the instruments used to assess these dimensions, as these properties indicate the reliability and validity of the assessment. OBJECTIVE This study aims to provide a systematic review of the instruments available for assessing the psychological aspects of the relationship between people and social and domestic robots, offering a summary of their psychometric properties and the quality of the evidence. METHODS A systematic review was conducted following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines across different databases: Scopus, PubMed, and IEEE Xplore. The search strategy encompassed studies meeting the following inclusion criteria: (1) the instrument could assess psychological dimensions related to social and domestic robots, including attitudes, beliefs, opinions, feelings, and perceptions; (2) the study focused on validating the instrument; (3) the study evaluated the psychometric properties of the instrument; (4) the study underwent peer review; and (5) the study was in English. Studies focusing on industrial robots, rescue robots, or robotic arms, or those primarily concerned with technology validation or measuring anthropomorphism, were excluded. Independent reviewers extracted instrument properties and the methodological quality of their evidence following the Consensus-Based Standards for the Selection of Health Measurement Instruments guidelines. RESULTS From 3828 identified records, the search strategy yielded 34 (0.89%) articles that validated and examined the psychometric properties of 27 instruments designed to assess individuals' psychological dimensions in relation to social and domestic robots. These instruments encompass a broad spectrum of psychological dimensions. While most studies focused on structural validity (24/27, 89%) and internal consistency (26/27, 96%), consideration of other psychometric properties was frequently inconsistent or absent. No instrument evaluated measurement error or responsiveness, despite their significance in the clinical context. Most of the instruments (17/27, 63%) were targeted at both adults and older adults (aged ≥18 years). There was a limited number of instruments specifically designed for children, older adults, and health care contexts. CONCLUSIONS Given the strong interest in assessing psychological dimensions in the human-robot relationship, there is a need to develop new instruments using more rigorous methodologies and to consider a broader range of psychometric properties. This is essential to ensure the creation of reliable and valid measures for assessing people's psychological dimensions regarding social and domestic robots. Among its limitations, this review included instruments applicable to both social and domestic robots while excluding those for other specific types of robots (eg, industrial robots).
Affiliation(s)
- Roberto Vagnetti
- Department of Sport Science, School of Science and Technology, Nottingham Trent University, Nottingham, United Kingdom
- Nicola Camp
- Department of Sport Science, School of Science and Technology, Nottingham Trent University, Nottingham, United Kingdom
- Matthew Story
- Department of Computing & Advanced Wellbeing Research Centre, Sheffield Hallam University, Sheffield, United Kingdom
- Khaoula Ait-Belaid
- Wolfson School of Mechanical, Electrical and Manufacturing Engineering, Loughborough University, Loughborough, United Kingdom
- Suvobrata Mitra
- Department of Psychology, Nottingham Trent University, Nottingham, United Kingdom
- Massimiliano Zecca
- Wolfson School of Mechanical, Electrical and Manufacturing Engineering, Loughborough University, Loughborough, United Kingdom
- Alessandro Di Nuovo
- Department of Computing & Advanced Wellbeing Research Centre, Sheffield Hallam University, Sheffield, United Kingdom
- Daniele Magistro
- Department of Sport Science, School of Science and Technology, Nottingham Trent University, Nottingham, United Kingdom
3
Oudah M, Makovi K, Gray K, Battu B, Rahwan T. Perception of experience influences altruism and perception of agency influences trust in human-machine interactions. Sci Rep 2024; 14:12410. PMID: 38811749. PMCID: PMC11136977. DOI: 10.1038/s41598-024-63360-w.
Abstract
As robots become increasingly integrated into social and economic interactions, it becomes crucial to understand how people perceive a robot's mind. It has been argued that minds are perceived along two dimensions: experience, i.e., the ability to feel, and agency, i.e., the ability to act and take responsibility for one's actions. However, the influence of these perceived dimensions on human-machine interactions, particularly those involving altruism and trust, remains unknown. We hypothesize that the perception of experience influences altruism, while the perception of agency influences trust. To test these hypotheses, we pair participants with bot partners in a dictator game (to measure altruism) and a trust game (to measure trust) while varying the bots' perceived experience and agency, either by manipulating the degree to which the bot resembles humans, or by manipulating the description of the bots' ability to feel and exercise self-control. The results demonstrate that the money transferred in the dictator game is influenced by the perceived experience, while the money transferred in the trust game is influenced by the perceived agency, thereby confirming our hypotheses. More broadly, our findings support the specificity of the mind hypothesis: Perceptions of different dimensions of the mind lead to different kinds of social behavior.
Affiliation(s)
- Mayada Oudah
- Social Science Division, New York University Abu Dhabi, Abu Dhabi, UAE
- Kinga Makovi
- Social Science Division, New York University Abu Dhabi, Abu Dhabi, UAE
- Kurt Gray
- Department of Psychology and Neuroscience, University of North Carolina, Chapel Hill, USA
- Balaraju Battu
- Computer Science, Science Division, New York University Abu Dhabi, Abu Dhabi, UAE
- Talal Rahwan
- Computer Science, Science Division, New York University Abu Dhabi, Abu Dhabi, UAE
4
Marchesi S, De Tommaso D, Kompatsiari K, Wu Y, Wykowska A. Tools and methods to study and replicate experiments addressing human social cognition in interactive scenarios. Behav Res Methods 2024. PMID: 38782872. DOI: 10.3758/s13428-024-02434-z.
Abstract
In the last decade, scientists investigating human social cognition have started bringing traditional laboratory paradigms more "into the wild" to examine how socio-cognitive mechanisms of the human brain work in real-life settings. As this implies transferring 2D observational paradigms to 3D interactive environments, there is a risk of compromising experimental control. In this context, we propose a methodological approach which uses humanoid robots as proxies of social interaction partners and embeds them in experimental protocols that adapt classical paradigms of cognitive psychology to interactive scenarios. This allows for a relatively high degree of "naturalness" of interaction and excellent experimental control at the same time. Here, we present two case studies where our methods and tools were applied and replicated across two different laboratories, namely the Italian Institute of Technology in Genova (Italy) and the Agency for Science, Technology and Research in Singapore. In the first case study, we present a replication of an interactive version of a gaze-cueing paradigm reported in Kompatsiari et al. (J Exp Psychol Gen 151(1):121-136, 2022). The second case study presents a replication of a "shared experience" paradigm reported in Marchesi et al. (Technol Mind Behav 3(3):11, 2022). As both studies replicate results across labs and different cultures, we argue that our methods allow for reliable and replicable setups, even though the protocols are complex and involve social interaction. We conclude that our approach can be of benefit to the research field of social cognition and grant higher replicability, for example, in cross-cultural comparisons of social cognition mechanisms.
Affiliation(s)
- Serena Marchesi
- Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genova, Italy
- Robotics and Autonomous Systems Department, A*STAR Institute for Infocomm Research, Singapore, Singapore
- Davide De Tommaso
- Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genova, Italy
- Kyveli Kompatsiari
- Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genova, Italy
- Yan Wu
- Robotics and Autonomous Systems Department, A*STAR Institute for Infocomm Research, Singapore, Singapore
- Agnieszka Wykowska
- Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genova, Italy
5
Li X, Yow WQ. Younger, not older, children trust an inaccurate human informant more than an inaccurate robot informant. Child Dev 2024; 95:988-1000. PMID: 38041211. DOI: 10.1111/cdev.14048.
Abstract
This study examined preschoolers' trust toward accurate and inaccurate robot informants versus human informants. Singaporean children aged 3-5 years (N = 120, 57 girls, mostly Asian; data collected from 2017 to 2018) watched either a robot or a human adult labeling familiar objects either accurately or inaccurately. Children's trust was assessed by examining their subsequent willingness to accept novel object labels provided by the same informant. Regardless of age, children trusted accurate robots to a similar extent as accurate humans. However, while older children (dis)trusted inaccurate robots and humans comparably, younger children trusted inaccurate robots less than inaccurate humans. The results indicate a developmental change in children's reliance on informants' characteristics to decide whom to trust.
Affiliation(s)
- Xiaoqian Li
- Humanities, Arts and Social Sciences, Singapore University of Technology and Design, Singapore
- W Quin Yow
- Humanities, Arts and Social Sciences, Singapore University of Technology and Design, Singapore
6
Goldman EJ, Poulin-Dubois D. Children's anthropomorphism of inanimate agents. Wiley Interdiscip Rev Cogn Sci 2024:e1676. PMID: 38659105. DOI: 10.1002/wcs.1676.
Abstract
This review article examines the extant literature on animism and anthropomorphism in infants and young children. A substantial body of work indicates that both infants and young children have a broad concept of what constitutes a sentient agent and react to inanimate objects as they do to people in the same context. The literature has also revealed a developmental pattern in which anthropomorphism decreases with age, but social robots appear to be an exception to this pattern. Additionally, the review shows that children attribute psychological properties to social robots less so than people but still anthropomorphize them. Importantly, some research suggests that anthropomorphism of social robots is dependent upon their morphology and human-like behaviors. The extent to which children anthropomorphize robots is dependent on their exposure to them and the presence of human-like features. Based on the existing literature, we conclude that in infancy, a large range of inanimate objects (e.g., boxes, geometric figures) that display animate motion patterns trigger the same behaviors observed in child-adult interactions, suggesting some implicit form of anthropomorphism. The review concludes that additional research is needed to understand what infants and children judge as social agents and how the perception of inanimate agents changes over the lifespan. As exposure to robots and virtual assistants increases, future research must focus on better understanding the full impact that regular interactions with such partners will have on children's anthropomorphizing. This article is categorized under: Psychology > Learning; Cognitive Biology > Cognitive Development; Computer Science and Robotics > Robotics.
7
Li M, Guo F, Li Z, Ma H, Duffy VG. Interactive effects of users' openness and robot reliability on trust: evidence from psychological intentions, task performance, visual behaviours, and cerebral activations. Ergonomics 2024:1-21. PMID: 38635303. DOI: 10.1080/00140139.2024.2343954.
Abstract
Although trust plays a vital role in human-robot interaction, there is currently a dearth of literature examining the effect of users' openness personality on trust in actual interaction. This study aims to investigate the interaction effects of users' openness and robot reliability on trust. We designed a voice-based walking task and collected subjective trust ratings, task metrics, eye-tracking data, and fNIRS signals from users with different openness to unravel the psychological intentions, task performance, visual behaviours, and cerebral activations underlying trust. The results showed significant interaction effects. Users with low openness exhibited lower subjective trust, more fixations, and higher activation of rTPJ in the highly reliable condition than those with high openness. The results suggested that users with low openness might be more cautious and suspicious about the highly reliable robot and allocate more visual attention and neural processing to monitor and infer robot status than users with high openness.
Affiliation(s)
- Mingming Li
- Department of Industrial Engineering, College of Management Science and Engineering, Anhui University of Technology, Maanshan, China
- Department of Industrial Engineering, School of Business Administration, Northeastern University, Shenyang, China
- Fu Guo
- Department of Industrial Engineering, School of Business Administration, Northeastern University, Shenyang, China
- Zhixing Li
- Department of Industrial Engineering, School of Business Administration, Northeastern University, Shenyang, China
- Haiyang Ma
- Department of Industrial Engineering, School of Business Administration, Northeastern University, Shenyang, China
- Vincent G Duffy
- School of Industrial Engineering, Purdue University, West Lafayette, IN, USA
8
Kühne K, Herbold E, Bendel O, Zhou Y, Fischer MH. "Ick bin een Berlina": dialect proficiency impacts a robot's trustworthiness and competence evaluation. Front Robot AI 2024; 10:1241519. PMID: 38348348. PMCID: PMC10859411. DOI: 10.3389/frobt.2023.1241519.
Abstract
Background: Robots are increasingly used as interaction partners with humans. Social robots are designed to follow expected behavioral norms when engaging with humans and are available with different voices and even accents. Some studies suggest that people prefer robots to speak in the user's dialect, while others indicate a preference for different dialects. Methods: Our study examined the impact of the Berlin dialect on the perceived trustworthiness and competence of a robot. One hundred and twenty German native speakers (M age = 32 years, SD = 12 years) watched an online video featuring a NAO robot speaking either in the Berlin dialect or standard German and assessed its trustworthiness and competence. Results: We found a positive relationship between participants' self-reported Berlin dialect proficiency and their trust in the dialect-speaking robot. Only when controlling for demographic factors was there a positive association between participants' dialect proficiency and dialect performance and their assessment of the competence of the standard German-speaking robot. Participants' age, gender, length of residency in Berlin, and the device used to respond also influenced assessments. Finally, the robot's competence positively predicted its trustworthiness. Discussion: Our results inform the design of social robots and emphasize the importance of device control in online experiments.
Affiliation(s)
- Katharina Kühne
- Division of Cognitive Sciences, University of Potsdam, Potsdam, Germany
- Erika Herbold
- Division of Cognitive Sciences, University of Potsdam, Potsdam, Germany
- Oliver Bendel
- School of Business FHNW, Brugg-Windisch, Brugg, Switzerland
- Yuefang Zhou
- Division of Cognitive Sciences, University of Potsdam, Potsdam, Germany
- Martin H. Fischer
- Division of Cognitive Sciences, University of Potsdam, Potsdam, Germany
9
Tanner A, Urech A, Schulze H, Manser T. Older Adults' Engagement and Mood During Robot-Assisted Group Activities in Nursing Homes: Development and Observational Pilot Study. JMIR Rehabil Assist Technol 2023; 10:e48031. PMID: 38145484. PMCID: PMC10775040. DOI: 10.2196/48031.
Abstract
BACKGROUND Promoting the well-being of older adults in an aging society requires new solutions. One resource might be the use of social robots for group activities that promote physical and cognitive stimulation. Engaging in a robot-assisted group activity may help slow physical and cognitive decline in older adults. Currently, our knowledge is limited on whether older adults engage in group activities with humanlike social robots and whether they experience a positive affect while doing so. Both are necessary preconditions for achieving the intended effects of a group activity. OBJECTIVE Our pilot study had 2 aims. First, we aimed to develop and pilot an observational coding scheme for robot-assisted group activities, because self-report data on the engagement and mood of nursing home residents are often difficult to obtain and the existing observation instruments have limitations. Second, we aimed to investigate older adults' engagement and mood during robot-assisted group activities in 4 different nursing care homes in the German-speaking part of Switzerland. METHODS We developed an observation system, inspired by existing tools, for a structured observation of the engagement and mood of older adults during a robot-assisted group activity. In this study, 85 older adult residents from 4 different care homes in Switzerland participated in 5 robot-assisted group activity sessions and were observed using our developed system. The data were collected in the form of video clips that were assessed by 2 raters regarding engagement (direction of gaze, posture as well as body expression, and activity) and mood (positive and negative affects). Both variables were rated on a 5-point rating scale. RESULTS Our pilot study findings show that the engagement and mood of older adults can be assessed reliably using the proposed observational coding scheme. Most participants actively engaged in robot-assisted group activities (mean 4.19, SD 0.47; median 4.0). The variables used to measure engagement were direction of gaze (mean 4.65, SD 0.49; median 5.0), posture and body expression (mean 4.03, SD 0.71; median 4.0), and activity (mean 3.90, SD 0.65; median 4.0). Further, we observed mainly positive affects in this group. Almost no negative affect was observed (mean 1.13, SD 0.20; median 1.0), while the positive affect (mean 3.22, SD 0.55; median 3.2) was high. CONCLUSIONS The developed observational coding system can be used and further developed in future studies on robot-assisted group activities in the nursing home context and potentially in other settings. Additionally, our pilot study indicates that cognitive and physical stimulation of older adults can be promoted by social robots in a group setting. This finding encourages future technological development and improvement of social robots and points to the potential of observational research to systematically evaluate such developments.
Affiliation(s)
- Alexandra Tanner
- School of Applied Psychology, University of Applied Sciences and Arts Northwestern Switzerland, Olten, Switzerland
- City of Bern (Digital Stadt Bern), Bern, Switzerland
- Andreas Urech
- School of Applied Psychology, University of Applied Sciences and Arts Northwestern Switzerland, Olten, Switzerland
- Hartmut Schulze
- School of Applied Psychology, University of Applied Sciences and Arts Northwestern Switzerland, Olten, Switzerland
- Tanja Manser
- School of Applied Psychology, University of Applied Sciences and Arts Northwestern Switzerland, Olten, Switzerland
10
Dosso JA, Riminchan A, Robillard JM. Social robotics for children: an investigation of manufacturers' claims. Front Robot AI 2023; 10:1080157. PMID: 38187475. PMCID: PMC10770258. DOI: 10.3389/frobt.2023.1080157.
Abstract
As the market for commercial children's social robots grows, manufacturers' claims around the functionality and outcomes of their products have the potential to impact consumer purchasing decisions. In this work, we qualitatively and quantitatively assess the content and scientific support for claims about social robots for children made on manufacturers' websites. A sample of 21 robot websites was obtained using location-independent keyword searches on Google, Yahoo, and Bing from April to July 2021. All claims made on manufacturers' websites about robot functionality and outcomes (n = 653 statements) were subjected to content analysis, and the quality of evidence for these claims was evaluated using a validated quality evaluation tool. Social robot manufacturers made clear claims about the impact of their products in the areas of interaction, education, emotion, and adaptivity. Claims tended to focus on the child rather than the parent or other users. Robots were primarily described in the context of interactive, educational, and emotional uses, rather than being for health, safety, or security. The quality of the information used to support these claims was highly variable and at times potentially misleading. Many websites used language implying that robots had interior thoughts and experiences; for example, that they would love the child. This study provides insight into the content and quality of parent-facing manufacturer claims regarding commercial social robots for children.
Affiliation(s)
- Jill A. Dosso
- Neuroscience, Engagement, and Smart Tech (NEST) Laboratory, Department of Medicine, Division of Neurology, The University of British Columbia, Vancouver, BC, Canada
- Neuroscience, Engagement, and Smart Tech (NEST) Laboratory, British Columbia Children's and Women's Hospital, Vancouver, BC, Canada
- Anna Riminchan
- Neuroscience, Engagement, and Smart Tech (NEST) Laboratory, Department of Medicine, Division of Neurology, The University of British Columbia, Vancouver, BC, Canada
- Neuroscience, Engagement, and Smart Tech (NEST) Laboratory, British Columbia Children's and Women's Hospital, Vancouver, BC, Canada
- Julie M. Robillard
- Neuroscience, Engagement, and Smart Tech (NEST) Laboratory, Department of Medicine, Division of Neurology, The University of British Columbia, Vancouver, BC, Canada
- Neuroscience, Engagement, and Smart Tech (NEST) Laboratory, British Columbia Children's and Women's Hospital, Vancouver, BC, Canada
11
Meyer L, Rachman L, Araiza-Illan G, Gaudrain E, Başkent D. Use of a humanoid robot for auditory psychophysical testing. PLoS One 2023; 18:e0294328. PMID: 38091272. PMCID: PMC10718414. DOI: 10.1371/journal.pone.0294328.
Abstract
Tasks in psychophysical tests can at times be repetitive and cause individuals to lose engagement during the test. To facilitate engagement, we propose the use of a humanoid NAO robot, named Sam, as an alternative interface for conducting psychophysical tests. Specifically, we aim to evaluate the performance of Sam as an auditory testing interface, given its potential limitations and technical differences, in comparison to the current laptop interface. We examine the results and durations of two voice perception tests, voice cue sensitivity and voice gender categorisation, obtained from both the conventionally used laptop interface and Sam. Both tests investigate the perception and use of two speaker-specific voice cues, fundamental frequency (F0) and vocal tract length (VTL), important for characterising voice gender. Responses are logged on the laptop using a connected mouse, and on Sam using the tactile sensors. Comparison of test results from both interfaces shows functional similarity between the interfaces and replicates findings from previous studies with similar tests. Comparison of test durations shows longer testing times with Sam, primarily due to longer processing times in comparison to the laptop, as well as other design limitations due to the implementation of the test on the robot. Despite the inherent constraints of the NAO robot, such as in sound quality, relatively long processing and testing times, and different methods of response logging, the NAO interface appears to facilitate collecting similar data to the current laptop interface, confirming its potential as an alternative psychophysical test interface for auditory perception tests.
Affiliation(s)
- Luke Meyer
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- W.J. Kolff Institute for Biomedical Engineering and Materials Science, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Laura Rachman
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- W.J. Kolff Institute for Biomedical Engineering and Materials Science, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Gloria Araiza-Illan
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- W.J. Kolff Institute for Biomedical Engineering and Materials Science, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Etienne Gaudrain
- Lyon Neuroscience Research Center, CNRS UMR 5292, INSERM UMRS 1028, Université Claude Bernard Lyon 1, Université de Lyon, Lyon, France
- Deniz Başkent
- Department of Otorhinolaryngology, Head and Neck Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- W.J. Kolff Institute for Biomedical Engineering and Materials Science, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
12
Zhang T, Ding Y, Hu C, Zhang M, Zhu W, Bowen CR, Han Y, Yang Y. Self-Powered Stretchable Sensor Arrays Exhibiting Magnetoelasticity for Real-Time Human-Machine Interaction. ADVANCED MATERIALS (DEERFIELD BEACH, FLA.) 2023; 35:e2203786. [PMID: 35701188 DOI: 10.1002/adma.202203786] [Citation(s) in RCA: 7] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/27/2022] [Revised: 06/12/2022] [Indexed: 06/15/2023]
Abstract
Stretchable strain sensors are highly desirable for human motion monitoring and can be used to build new forms of bionic robots. However, the current use of flexible strain gauges is hindered by the need for an external power supply and the demands of long-term operation. Here, a new flexible self-powered strain sensor system based on an electromagnetic generator is presented; it possesses high stretchability in excess of 150%, a short response time of 30 ms, and excellent linearity (R² > 0.98). Based on this new form of sensor, a human-machine interaction system is designed that achieves remote control of a robot hand and a vehicle using a human hand, providing a new scheme for real-time gesture interaction.
Affiliation(s)
- Tongtong Zhang
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, P. R. China
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, P. R. China
- Yi Ding
- State Key Laboratory of Advanced Power Transmission Technology (State Grid Smart Grid Research Institute Co. Ltd.), Beijing, 102209, P. R. China
- Chaosheng Hu
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, P. R. China
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, P. R. China
- Maoyi Zhang
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, P. R. China
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, P. R. China
- Wenxuan Zhu
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, P. R. China
- School of Chemistry and Chemical Engineering, Center on Nanoenergy Research, Guangxi University, Nanning, Guangxi, 530004, P. R. China
- Chris R Bowen
- Department of Mechanical Engineering, University of Bath, Bath, BA2 7AK, UK
- Yu Han
- State Key Laboratory of Advanced Power Transmission Technology (State Grid Smart Grid Research Institute Co. Ltd.), Beijing, 102209, P. R. China
- Ya Yang
- CAS Center for Excellence in Nanoscience, Beijing Key Laboratory of Micro-Nano Energy and Sensor, Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing, 101400, P. R. China
- School of Nanoscience and Technology, University of Chinese Academy of Sciences, Beijing, 100049, P. R. China
- School of Chemistry and Chemical Engineering, Center on Nanoenergy Research, Guangxi University, Nanning, Guangxi, 530004, P. R. China
13
Wang J, Chen Y, Huo S, Mai L, Jia F. Research Hotspots and Trends of Social Robot Interaction Design: A Bibliometric Analysis. SENSORS (BASEL, SWITZERLAND) 2023; 23:9369. [PMID: 38067743 PMCID: PMC10708843 DOI: 10.3390/s23239369] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 09/06/2023] [Revised: 11/06/2023] [Accepted: 11/08/2023] [Indexed: 12/18/2023]
Abstract
(1) Background: Social robot interaction design is crucial for determining user acceptance and experience. However, few studies have systematically discussed the current focus and future research directions of social robot interaction design from a bibliometric perspective. Therefore, we conducted this study in order to identify the latest research progress and evolution trajectory of research hotspots in social robot interaction design over the last decade. (2) Methods: We conducted a comprehensive review based on 2416 papers related to social robot interaction design obtained from the Web of Science (WOS) database. Our review utilized bibliometric techniques and integrated VOSviewer and CiteSpace to construct a knowledge map. (3) Conclusions: The current research hotspots of social robot interaction design mainly focus on #1 the study of human-robot relationships in social robots, #2 research on the emotional design of social robots, #3 research on social robots for children's psychotherapy, #4 research on companion robots for elderly rehabilitation, and #5 research on educational social robots. The reference co-citation analysis identifies the classic literature that forms the basis of the current research, which provides theoretical guidance and methods for the current research. Finally, we discuss several future research directions and challenges in this field.
Affiliation(s)
- Jianmin Wang
- College of Arts and Media, Tongji University, Shanghai 201804, China
- Shenzhen Research Institute, Sun Yat-Sen University, Shenzhen 518057, China
- Yongkang Chen
- College of Design and Innovation, Tongji University, Shanghai 200092, China
- Siguang Huo
- College of Design and Innovation, Tongji University, Shanghai 200092, China
- Liya Mai
- College of Design and Innovation, Tongji University, Shanghai 200092, China
- Fusheng Jia
- College of Design and Innovation, Tongji University, Shanghai 200092, China
14
Krpan D, Booth JE, Damien A. The positive-negative-competence (PNC) model of psychological responses to representations of robots. Nat Hum Behav 2023; 7:1933-1954. [PMID: 37783891 PMCID: PMC10663151 DOI: 10.1038/s41562-023-01705-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/26/2022] [Accepted: 08/25/2023] [Indexed: 10/04/2023]
Abstract
Robots are becoming an increasingly prominent part of society. Despite their growing importance, there exists no overarching model that synthesizes people's psychological reactions to robots and identifies what factors shape them. To address this, we created a taxonomy of affective, cognitive and behavioural processes in response to a comprehensive stimulus sample depicting robots from 28 domains of human activity (for example, education, hospitality and industry) and examined its individual difference predictors. Across seven studies that tested 9,274 UK and US participants recruited via online panels, we used a data-driven approach combining qualitative and quantitative techniques to develop the positive-negative-competence model, which categorizes all psychological processes in response to the stimulus sample into three dimensions: positive, negative and competence-related. We also established the main individual difference predictors of these dimensions and examined the mechanisms for each predictor. Overall, this research provides an in-depth understanding of psychological functioning regarding representations of robots.
Affiliation(s)
- Dario Krpan
- Department of Psychological and Behavioural Science, London School of Economics and Political Science, London, UK
- Jonathan E Booth
- Department of Management, London School of Economics and Political Science, London, UK
- Andreea Damien
- Department of Psychological and Behavioural Science, London School of Economics and Political Science, London, UK
15
Groß A, Singh A, Banh NC, Richter B, Scharlau I, Rohlfing KJ, Wrede B. Scaffolding the human partner by contrastive guidance in an explanatory human-robot dialogue. Front Robot AI 2023; 10:1236184. [PMID: 37965633 PMCID: PMC10642948 DOI: 10.3389/frobt.2023.1236184] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2023] [Accepted: 10/11/2023] [Indexed: 11/16/2023] Open
Abstract
Explanation has been identified as an important capability for AI-based systems, but research on systematic strategies for achieving understanding in interaction with such systems is still sparse. Negation is a linguistic strategy that is often used in explanations. It creates a contrast space between the affirmed and the negated item that enriches explaining processes with additional contextual information. While negation in human speech has been shown to lead to higher processing costs and worse task performance in terms of recall or action execution when used in isolation, it can decrease processing costs when used in context. So far, it has not been considered as a guiding strategy for explanations in human-robot interaction. We conducted an empirical study to investigate the use of negation as a guiding strategy in explanatory human-robot dialogue, in which a virtual robot explains tasks and possible actions to a human explainee, who solves them by performing gestures on a touchscreen. Our results show that negation, compared with affirmation, 1) increases processing costs, measured as reaction time, and 2) improves several aspects of task performance. While there was no significant effect of negation on the number of initially correctly executed gestures, we found a significantly lower number of attempts (measured as breaks in the finger-movement data before the correct gesture was carried out) when participants were instructed through a negation. We further found that the gestures resembled the presented prototype gesture significantly more closely following an instruction with a negation as opposed to an affirmation. The participants also rated the benefit of contrastive explanations significantly higher than that of affirmative explanations. Repeating the instructions decreased the effects of negation, yielding similar processing costs and task performance measures for negation and affirmation after several iterations.
We discuss our results with respect to possible effects of negation on the linguistic processing of explanations and the limitations of our study.
Affiliation(s)
- André Groß
- Medical Assistance Systems, Medical School OWL, Bielefeld University, Bielefeld, Germany
- Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Amit Singh
- Psycholinguistics, Faculty of Arts and Humanities, Paderborn University, Paderborn, Germany
- Ngoc Chi Banh
- Cognitive Psychology, Faculty of Arts and Humanities, Paderborn University, Paderborn, Germany
- Birte Richter
- Medical Assistance Systems, Medical School OWL, Bielefeld University, Bielefeld, Germany
- Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Ingrid Scharlau
- Cognitive Psychology, Faculty of Arts and Humanities, Paderborn University, Paderborn, Germany
- Katharina J. Rohlfing
- Psycholinguistics, Faculty of Arts and Humanities, Paderborn University, Paderborn, Germany
- Britta Wrede
- Medical Assistance Systems, Medical School OWL, Bielefeld University, Bielefeld, Germany
- Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
16
Huang G, Moore RK. Using social robots for language learning: are we there yet? JOURNAL OF CHINA COMPUTER-ASSISTED LANGUAGE LEARNING 2023; 3:208-230. [PMID: 38013743 PMCID: PMC10464067 DOI: 10.1515/jccall-2023-0013] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 02/18/2023] [Accepted: 05/29/2023] [Indexed: 11/29/2023]
Abstract
Along with the development of speech and language technologies and growing market interest, social robots have attracted increasing academic and commercial attention in recent decades. Their multimodal embodiment offers a broad range of possibilities, which have gained importance in the education sector. It has also led to a new technology-based field of language education: robot-assisted language learning (RALL). RALL has developed rapidly in second language learning, driven especially by the need to compensate for the shortage of first-language tutors. There are many implementation cases and studies of social robots, from early government-led attempts in Japan and South Korea to increasing research interest in Europe and worldwide. Compared with RALL for English as a foreign language (EFL), however, there are fewer studies on applying RALL to teaching Chinese as a foreign language (CFL). One potential reason is that RALL is not well known in the CFL field. This scoping review attempts to fill this gap by addressing the balance between classroom implementation and the research frontiers of social robots. The review first introduces the technical tool used in RALL, the social robot, at a high level. It then presents a historical overview of real-life implementations of social robots in language classrooms in East Asia and Europe, followed by a summary of evaluations of RALL from the perspectives of L2 learners, teachers, and technology developers. The overall goal of this paper is to gain insight into RALL's potential and challenges and to identify a rich set of open research questions for applying RALL to CFL. It is hoped that the review may inform interdisciplinary analysis and practice for scientific research and front-line teaching in the future.
17
Lanfranchi JB, Lemonnier S. The Estimation of Physical Distances Between Oneself and a Social Robot: Am I as Far From the Robot as It is from Me? EUROPES JOURNAL OF PSYCHOLOGY 2023; 19:299-307. [PMID: 37731753 PMCID: PMC10508198 DOI: 10.5964/ejop.9519] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/20/2022] [Accepted: 04/06/2023] [Indexed: 09/22/2023]
Abstract
Research on the perception of interpersonal distance has shown the existence of an asymmetry effect which depends on the reference point of the estimation: the distance from oneself to others can be perceived as longer or shorter than the distance from others to oneself. The mechanism underlying this asymmetric effect is related to the object's cognitive salience. The self often functions as a habitual reference point and therefore one's own salience may be higher than that of other objects. In this case, an egocentric asymmetry effect appears with a perceived shorter distance from others to oneself. However, if others are more salient than oneself, then the reverse can happen (allocentric asymmetry effect). The present work investigates if asymmetry in self-other(s) distance perception changes when the other is a social robot. An experiment was conducted with 174 participants who were asked to estimate the distance between themselves and both robotic and human assistants on a schematic map of a hospital emergency room (between-subjects design). With robust ANOVA, the results showed that the participants felt closer to the human assistant than to the robot, notably when the person served as the estimation reference point. Perceived distances to the social robot were not significantly distorted. If a rather allocentric effect with the human assistant might reflect an affiliation goal on the part of the participants, the absence of effect with the social robot forces us to reconsider its humanization. This could nevertheless reflect a purely mechanical and utilitarian conception of it.
18
Marchesi S, Abubshait A, Kompatsiari K, Wu Y, Wykowska A. Cultural differences in joint attention and engagement in mutual gaze with a robot face. Sci Rep 2023; 13:11689. [PMID: 37468517 DOI: 10.1038/s41598-023-38704-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/23/2023] [Accepted: 07/13/2023] [Indexed: 07/21/2023] Open
Abstract
Joint attention is a pivotal mechanism underlying the human ability to interact with one another. The fundamental nature of joint attention in the context of social cognition has led researchers to develop tasks that address this mechanism and operationalize it in a laboratory setting, in the form of a gaze-cueing paradigm. In the present study, we addressed the question of whether engaging in joint attention with a robot face is culture-specific. We adapted a classical gaze-cueing paradigm such that a robot avatar cued participants' gaze after either engaging them in eye contact or not. Our critical question of interest was whether the gaze-cueing effect (GCE) is stable across different cultures, especially when the cognitive resources needed to exert top-down control are reduced. To achieve the latter, we introduced a mathematical stress task orthogonally to the gaze-cueing protocol. Results showed a larger GCE in the Singapore sample, relative to the Italian sample, independent of gaze type (eye contact vs. no eye contact) or the amount of experienced stress, which translates to available cognitive resources. Moreover, because participants rated after each block how engaged they felt with the robot avatar during the task, we could observe that Italian participants rated the avatar as more engaging during the eye-contact blocks than during the no-eye-contact blocks, whereas Singaporean participants' engagement ratings did not differ by gaze type. We discuss the results in terms of cultural differences in robot-induced joint attention and engagement in eye contact, as well as the dissociation between implicit and explicit measures related to the processing of gaze.
Affiliation(s)
- Serena Marchesi
- Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genova, Italy
- Robotics and Autonomous Systems Department, A*STAR Institute for Infocomm Research, Singapore, Singapore
- Abdulaziz Abubshait
- Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genova, Italy
- Kyveli Kompatsiari
- Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genova, Italy
- Yan Wu
- Robotics and Autonomous Systems Department, A*STAR Institute for Infocomm Research, Singapore, Singapore
- Agnieszka Wykowska
- Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genova, Italy
19
Takata K, Yoshikawa Y, Muramatsu T, Matsumoto Y, Ishiguro H, Mimura M, Kumazaki H. Social skills training using multiple humanoid robots for individuals with autism spectrum conditions. Front Psychiatry 2023; 14:1168837. [PMID: 37539327 PMCID: PMC10394831 DOI: 10.3389/fpsyt.2023.1168837] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/18/2023] [Accepted: 06/23/2023] [Indexed: 08/05/2023] Open
Abstract
Introduction: Social skills training (SST) is used to help individuals with autism spectrum conditions (ASC) better understand the perspectives of others and social interactions, develop empathy skills, and learn how to engage with others socially. However, many individuals with ASC cannot easily sustain high motivation and concentration during such an intervention when it is administered by humans. We developed a social skills training program using multiple humanoid robots (STUH), including an android robot, that aimed to enable individuals with ASC to become familiar with the perspectives of others and improve their sociability and empathy skills. The objective of the present study was to investigate the effectiveness of STUH for these individuals.
Methods: In STUH, we prepared 50 social exercises that consisted of conversations and behavioral interactions between an android robot and a simple humanoid robot. We prepared another humanoid robot, featuring a cartoon-like and mechanical design, which played the role of host. In the first half-session of STUH, participants worked on the exercises from the perspective of an outsider. In the second half-session, they gained simulated experience by using the robots as their avatars. The intervention associated with STUH was conducted over five days in total. We conducted an analysis of variance (ANOVA) featuring the intervention time point as the independent variable to examine changes in each score on the sociability index items.
Results: In total, 14 individuals with ASC participated in the study. Multiple comparison tests using the Bonferroni method indicated that all sociability index items improved between preintervention and follow-up. Our program enabled the participants to become familiar with the perspectives of others and improve their sociability.
Discussion: Given the promising results of this study, future studies featuring long-term follow-up should be conducted to draw definitive conclusions about the efficacy of our training system.
Affiliation(s)
- Keiji Takata
- Department of Psychology, Saitama Gakuen University, Saitama, Japan
- Yuichiro Yoshikawa
- Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, Osaka, Japan
- Taro Muramatsu
- Department of Neuropsychiatry, Keio University School of Medicine, Tokyo, Japan
- Yoshio Matsumoto
- Human Augmentation Research Center, National Institute of Advanced Industrial Science and Technology, Chiba, Japan
- Department of Clinical Research on Social Recognition and Memory, Research Center for Child Mental Development, Kanazawa University, Ishikawa, Japan
- Hiroshi Ishiguro
- Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, Osaka, Japan
- Masaru Mimura
- Department of Neuropsychiatry, Keio University School of Medicine, Tokyo, Japan
- Hirokazu Kumazaki
- Department of Neuropsychiatry, Keio University School of Medicine, Tokyo, Japan
- Human Augmentation Research Center, National Institute of Advanced Industrial Science and Technology, Chiba, Japan
- College of Science and Engineering, Kanazawa University, Ishikawa, Japan
- Department of Neuropsychiatry, Graduate School of Biomedical Sciences, Nagasaki University, Nagasaki, Japan
20
Corrales-Paredes A, Sanz DO, Terrón-López MJ, Egido-García V. User Experience Design for Social Robots: A Case Study in Integrating Embodiment. SENSORS (BASEL, SWITZERLAND) 2023; 23:5274. [PMID: 37300001 PMCID: PMC10256079 DOI: 10.3390/s23115274] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/30/2023] [Revised: 05/23/2023] [Accepted: 05/30/2023] [Indexed: 06/12/2023]
Abstract
Social robotics is an emerging field with a high level of innovation. For many years it remained a concept confined to the literature and theoretical approaches. Scientific and technological advances have made it possible for robots to progressively make their way into different areas of our society, and now they are ready to make the leap out of industry and extend their presence into our daily lives. In this sense, user experience plays a fundamental role in achieving a smooth and natural interaction between robots and humans. This research focused on the user experience approach in terms of the embodiment of a robot, centring on its movements, gestures, and dialogues. The aim was to investigate how the interaction between robotic platforms and humans takes place and which distinctive aspects should be considered when designing the robot's tasks. To achieve this objective, a qualitative and quantitative study was conducted based on a real interview between several human users and the robotic platform. The data were gathered by recording the session and having each user complete a form. The results showed that participants generally enjoyed interacting with the robot and found it engaging, which led to greater trust and satisfaction. However, delays and errors in the robot's responses caused frustration and disconnection. The study found that incorporating embodiment into the design of the robot improved the user experience, and that the robot's personality and behaviour were significant factors. It was concluded that robotic platforms, through their appearance, movements, and way of communicating, have a decisive influence on the user's opinion and the way users interact with them.
Affiliation(s)
- Ana Corrales-Paredes
- Science, Computation and Technology Department, School of Architecture, Engineering and Design, Universidad Europea de Madrid, 28670 Villaviciosa de Odón, Spain
- Diego Ortega Sanz
- Aerospace and Industrial Engineering Department, School of Architecture, Engineering and Design, Universidad Europea de Madrid, 28670 Villaviciosa de Odón, Spain
- María-José Terrón-López
- Aerospace and Industrial Engineering Department, School of Architecture, Engineering and Design, Universidad Europea de Madrid, 28670 Villaviciosa de Odón, Spain
- Verónica Egido-García
- Vice-Dean Engineering, School of Architecture, Engineering and Design, Universidad Europea de Madrid, 28670 Villaviciosa de Odón, Spain
21
Caruana N, Moffat R, Miguel-Blanco A, Cross ES. Perceptions of intelligence & sentience shape children's interactions with robot reading companions. Sci Rep 2023; 13:7341. [PMID: 37147422 PMCID: PMC10162967 DOI: 10.1038/s41598-023-32104-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/12/2022] [Accepted: 03/21/2023] [Indexed: 05/07/2023] Open
Abstract
The potential for robots to support education is being increasingly studied and rapidly realised. However, most research evaluating education robots has neglected to examine the fundamental features that make them more or less effective, given the needs and expectations of learners. This study explored how children's perceptions, expectations and experiences are shaped by aesthetic and functional features during interactions with different robot 'reading buddies'. We collected a range of quantitative and qualitative measures of subjective experience before and after children read a book with one of three different robots. An inductive thematic analysis revealed that robots have the potential to offer children an engaging and non-judgemental social context that promotes reading engagement. This was supported by children's perceptions of robots as being intelligent enough to read, listen and comprehend the story, particularly when they had the capacity to talk. A key challenge in the use of robots for this purpose was the unpredictable nature of robot behaviour, which remains difficult to perfectly control and time using either human operators or autonomous algorithms. Consequently, some children found the robots' responses distracting. We provide recommendations for future research seeking to position seemingly sentient and intelligent robots as assistive tools within and beyond education settings.
Affiliation(s)
- Nathan Caruana
- School of Psychological Sciences, Macquarie University, Level 3, 16 University Ave, Sydney, NSW, 2109, Australia
- Ryssa Moffat
- School of Psychological Sciences, Macquarie University, Level 3, 16 University Ave, Sydney, NSW, 2109, Australia
- Aitor Miguel-Blanco
- School of Psychological Sciences, Macquarie University, Level 3, 16 University Ave, Sydney, NSW, 2109, Australia
- Emily S Cross
- School of Psychological Sciences, Macquarie University, Level 3, 16 University Ave, Sydney, NSW, 2109, Australia
- Centre for Elite Performance, Expertise and Training, Macquarie University, Sydney, Australia
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
- MARCS Institute for Brain, Behaviour and Development, University of Western Sydney, Sydney, Australia
- Department of Humanities, Social & Political Sciences (D-GESS) and the Department of Health Sciences and Technology (D-HEST), ETH Zurich, Zurich, Switzerland
22
Gross S, Krenn B. A Communicative Perspective on Human–Robot Collaboration in Industry: Mapping Communicative Modes on Collaborative Scenarios. Int J Soc Robot 2023. [DOI: 10.1007/s12369-023-00991-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 04/05/2023]
23
Maroto-Gómez M, Alonso-Martín F, Malfaz M, Castro-González Á, Castillo JC, Salichs MÁ. A Systematic Literature Review of Decision-Making and Control Systems for Autonomous and Social Robots. Int J Soc Robot 2023. [DOI: 10.1007/s12369-023-00977-3] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 03/29/2023]
Abstract
In recent years, considerable research has been carried out to develop robots that can improve our quality of life during tedious and challenging tasks. In these contexts, robots operating without human supervision open many possibilities to assist people in their daily activities. When autonomous robots collaborate with humans, social skills are necessary for adequate communication and cooperation. Considering these facts, endowing autonomous and social robots with decision-making and control models is critical for appropriately fulfilling their goals. This manuscript presents a systematic review of the evolution of decision-making systems and control architectures for autonomous and social robots over the last three decades. These architectures have been incorporating new methods based on biologically inspired models and machine learning to enhance the possibilities these systems offer to society. The review explores the most novel advances in each application area, comparing their most essential features. Additionally, we describe the current challenges of software architectures devoted to action selection, an analysis not provided in similar reviews of behavioural models for autonomous and social robots. Finally, we present the directions these systems may take in the future.
24
Higashino K, Kimoto M, Iio T, Shimohara K, Shiomi M. Is Politeness Better than Impoliteness? Comparisons of Robot's Encouragement Effects Toward Performance, Moods, and Propagation. Int J Soc Robot 2023. [DOI: 10.1007/s12369-023-00971-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/12/2023]
Abstract
This study experimentally compared the effects of encouragement delivered with polite or impolite attitudes by a robot during a monotonous task from three viewpoints: performance, mood, and propagation. Experiment I investigated the effects of encouragement on performance and mood. The participants did a monotonous task during which a robot continuously provided polite, neutral, or impolite encouragement. The results showed that polite and impolite encouragement significantly improved performance more than neutral comments, although there was no significant difference between polite and impolite encouragement. In addition, impolite encouragement caused significantly more negative moods than polite encouragement. Experiment II determined whether the robot's encouragement influenced the participants' own encouragement styles. The participants took the role the robot had played in Experiment I, i.e., they selected polite, neutral, or impolite encouragement while observing a dummy participant's progress on a monotonous task. The results, which showed that the robot's encouragement significantly influenced the participants' encouragement styles, suggest that polite encouragement is more advantageous than impolite encouragement.
|
25
|
Błażejowska G, Gruba Ł, Indurkhya B, Gunia A. A Study on the Role of Affective Feedback in Robot-Assisted Learning. Sensors (Basel) 2023; 23:1181. [PMID: 36772223 PMCID: PMC9918924 DOI: 10.3390/s23031181]
Abstract
In recent years, there have been many approaches to using robots to teach computer programming. Research on intelligent tutoring systems and computer-aided learning also shows that affective feedback to the student increases learning efficiency. However, the few studies on incorporating an emotional personality into the robot in robot-assisted learning have reported mixed results. To explore this issue further, we conducted a pilot study to investigate the effect of positive verbal encouragement and non-verbal emotive behaviour of the Miro-E robot during a robot-assisted programming session. The participants were tasked with programming the robot's behaviour. In the experimental group, the robot monitored the participants' emotional state via their facial expressions and provided affective feedback after each completed task. In the control group, the robot responded in a neutral way. The participants filled out a questionnaire before and after the programming session. The results show a positive reaction of the participants to the robot and the exercise. Because the experiment was conducted during the pandemic, the number of participants was small, so a qualitative analysis of the data was carried out. We found that the greatest affective outcome of the session was for students who had little prior experience of or interest in programming. We also found that the affective expressions of the robot had a negative impact on its likeability, revealing vestiges of the uncanny valley effect.
Affiliation(s)
- Bipin Indurkhya
- Cognitive Science Department, Institute of Philosophy, Jagiellonian University, 31-007 Krakow, Poland
- Artur Gunia
- Cognitive Science Department, Institute of Philosophy, Jagiellonian University, 31-007 Krakow, Poland
|
26
|
Zonca J, Folsø A, Sciutti A. Social Influence Under Uncertainty in Interaction with Peers, Robots and Computers. Int J Soc Robot 2023. [DOI: 10.1007/s12369-022-00959-x]
Abstract
Taking advice from others requires confidence in their competence. This is important for interaction with peers, but also for collaboration with social robots and artificial agents. Nonetheless, we do not always have access to information about others' competence or performance. In these uncertain environments, do our prior beliefs about the nature and competence of our interacting partners modulate our willingness to rely on their judgments? In a joint perceptual decision-making task, participants made perceptual judgments and observed the simulated estimates of either a human participant, a social humanoid robot or a computer. They could then modify their estimates based on this feedback. Results show that participants' beliefs about the nature of their partner biased their compliance with its judgments: participants were more influenced by the social robot than by the human and computer partners. This difference emerged strongly at the very beginning of the task and decreased with repeated exposure to empirical feedback on the partner's responses, disclosing the role of prior beliefs in social influence under uncertainty. Furthermore, the results of our functional task suggest an important difference between human-human and human-robot interaction in the absence of overt socially relevant signals from the partner: the former is modulated by social normative mechanisms, whereas the latter is guided by purely informational mechanisms linked to the perceived competence of the partner.
|
27
|
Peretti G, Manzi F, Di Dio C, Cangelosi A, Harris PL, Massaro D, Marchetti A. Can a robot lie? Young children's understanding of intentionality beneath false statements. Infant and Child Development 2023. [DOI: 10.1002/icd.2398]
Affiliation(s)
- Giulia Peretti
- Research Unit on Theory of Mind, Department of Psychology, Università Cattolica del Sacro Cuore, Milan, Italy
- Federico Manzi
- Research Unit on Theory of Mind, Department of Psychology, Università Cattolica del Sacro Cuore, Milan, Italy
- Cinzia Di Dio
- Research Unit on Theory of Mind, Department of Psychology, Università Cattolica del Sacro Cuore, Milan, Italy
- Paul L. Harris
- Graduate School of Education, Harvard University, Cambridge, Massachusetts, USA
- Davide Massaro
- Research Unit on Theory of Mind, Department of Psychology, Università Cattolica del Sacro Cuore, Milan, Italy
- Antonella Marchetti
- Research Unit on Theory of Mind, Department of Psychology, Università Cattolica del Sacro Cuore, Milan, Italy
|
28
|
Dolata M, Katsiuba D, Wellnhammer N, Schwabe G. Learning with Digital Agents: An Analysis based on the Activity Theory. J Manage Inform Syst 2023. [DOI: 10.1080/07421222.2023.2172775]
|
29
|
Ahtinen A, Kaipainen K, Jarske S, Väänänen K. Supporting Remote Social Robot Design Collaboration with Online Canvases: Lessons Learned from Facilitators' and Participants' Experiences. Int J Soc Robot 2023; 15:317-343. [PMID: 36741031 PMCID: PMC9885415 DOI: 10.1007/s12369-023-00966-6]
Abstract
Social robot design projects typically involve multidisciplinary teamwork and collaboration, adopt a Human-Centred Design (HCD) approach, and deal with physical (tangible) objects, i.e., robots. HCD places the human at the centre of the design process. A typical activity in HCD is the design workshop, where a facilitator is needed to guide and moderate the task-related and interactional activities throughout the session. Facilitation is also usually needed in longer-term design projects or courses to guide participants through the different phases of design over several sessions. Recently, due to the COVID-19 pandemic, most design activities, including social robot design, were rapidly transferred to online mode. Designing tangible objects is challenging in online settings because the interaction experience with a physical object is hard to demonstrate online. In this article, we report how we harnessed online canvases to support both short-term social robot design workshops and a long-term design course. Based on participants' feedback and facilitators' experiences, we report lessons learned from utilizing collaborative design canvases for creative social robot design projects, focusing specifically on early stages and concept ideation. We propose practical guidelines for canvas-based online facilitation of creative design workshops and projects. In addition, we discuss the lessons learned concerning social robot design activities conducted online. To respond to the challenges of designing tangible robots in a fully online mode, we suggest a Hybrid Robotic Design Model (HRDM), in which participants work in contact with facilitators, other participants and robots at specific points, while the other phases are conducted online.
Affiliation(s)
- Aino Ahtinen
- Tampere University, Computing Sciences, Tampere, Finland
- Kirsikka Kaipainen
- Tampere University, Computing Sciences, Tampere, Finland
- Salla Jarske
- Tampere University, Computing Sciences, Tampere, Finland
- Kaisa Väänänen
- Tampere University, Computing Sciences, Tampere, Finland
|
30
|
Esfandbod A, Rokhi Z, Meghdari AF, Taheri A, Alemi M, Karimi M. Utilizing an Emotional Robot Capable of Lip-Syncing in Robot-Assisted Speech Therapy Sessions for Children with Language Disorders. Int J Soc Robot 2023; 15:165-183. [PMID: 36467283 PMCID: PMC9684761 DOI: 10.1007/s12369-022-00946-2]
Abstract
This study scrutinizes the impact of utilizing a socially assistive robot, the RASA robot, during speech therapy sessions for children with language disorders. Two capabilities were developed for the robotic platform to enhance child-robot interactions during speech therapy interventions: facial expression communication (comprising recognition and expression) and lip-syncing. Facial expression recognition was implemented by training several well-known CNN architectures on one of the most extensive facial expression databases, the AffectNet database, and then adapting them via transfer learning on the CK+ dataset. The robot's lip-syncing capability was designed in two steps. The first step concerned designing precise schemes of the articulatory elements needed to pronounce the Persian phonemes (i.e., consonants and vowels). The second step involved developing an algorithm to pronounce words by disassembling them into their components (consonants and vowels) and then morphing these into each other successively. To pursue the study's primary goal, two comparable groups of children with language disorders were considered: an intervention group and a control group. The intervention group attended therapy sessions in which the robot acted as the therapist's assistant, while the control group communicated only with the human therapist. The study's first purpose was to compare the children's engagement while playing a mimic game with the affective robot versus the therapist, assessed via video coding. The second objective was to evaluate the efficacy of the robot's presence in the speech therapy sessions alongside the therapist, accomplished by administering the Persian Test of Language Development (Persian TOLD). In the first scenario, playing with the affective robot proved more engaging than playing with the therapist. Furthermore, the statistical analysis of the results indicates that participating in robot-assisted speech therapy (RAST) sessions enhances the achievements of children with language disorders in comparison with conventional speech therapy interventions.
Affiliation(s)
- Alireza Esfandbod
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Zeynab Rokhi
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Ali F. Meghdari
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Fereshtegaan International Branch, Islamic Azad University, Tehran, Iran
- Alireza Taheri
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Minoo Alemi
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Department of Humanities, West Tehran Branch, Islamic Azad University, Tehran, Iran
|
31
|
Chin JH, Haring KS, Kim P. Understanding the neural mechanisms of empathy toward robots to shape future applications. Front Neurorobot 2023; 17:1145989. [PMID: 37125225 PMCID: PMC10130423 DOI: 10.3389/fnbot.2023.1145989]
Abstract
This article provides an overview of how modern neuroscience evaluations link to robot empathy. It reviews the brain correlates of empathy and caregiving, and how they may relate to higher functions, with an emphasis on women. We discuss how understanding these brain correlates can inform the development of social robots with enhanced empathy and caregiving abilities. We propose that the availability of such robots will benefit many aspects of society, including the transition to parenthood and parenting, areas in which women are deeply involved both in daily life and in scientific research. We conclude with some of the barriers facing women in the field and how robotics and robot empathy research benefit from a broad representation of researchers.
Affiliation(s)
- Jenna H. Chin
- Family and Child Neuroscience Lab, Department of Psychology, Brain, Artificial Intelligence, and Child (BAIC) Center, University of Denver, Denver, CO, United States
- Kerstin S. Haring
- Humane Robot Technology (HuRoT) Laboratory, Department of Computer Science, Ritchie School of Engineering and Computer Science, University of Denver, Denver, CO, United States
- *Correspondence: Kerstin S. Haring
- Pilyoung Kim
- Family and Child Neuroscience Lab, Department of Psychology, Brain, Artificial Intelligence, and Child (BAIC) Center, University of Denver, Denver, CO, United States
- Department of Psychology, Ewha Womans University, Seoul, Republic of Korea
|
32
|
Lei X, Rau PLP. Emotional responses to performance feedback in an educational game during cooperation and competition with a robot: Evidence from fNIRS. Computers in Human Behavior 2023. [DOI: 10.1016/j.chb.2022.107496]
|
33
|
Correia F, Melo FS, Paiva A. When a Robot Is Your Teammate. Top Cogn Sci 2022. [PMID: 36573665 DOI: 10.1111/tops.12634]
Abstract
Creating effective teamwork between humans and robots involves not only addressing their performance as a team but also sustaining the quality and sense of unity among teammates, also known as cohesion. This paper explores the research problem: how can we endow robotic teammates with social capabilities to improve their cohesive alliance with humans? Defining the concept of a human-robot cohesive alliance in the light of the multidimensional construct of cohesion from the social sciences, we propose to address this problem through the idea of multifaceted human-robot cohesion. We present our preliminary efforts from previous works to examine each of the five dimensions of cohesion: social, collective, emotional, structural, and task. We finish the paper with a discussion of how human-robot cohesion contributes to the key questions and ongoing challenges of creating robotic teammates. Overall, cohesion in human-robot teams may be a key factor in propelling team performance, and it should be considered in the design, development, and evaluation of robotic teammates.
Affiliation(s)
- Filipa Correia
- INESC-ID, Instituto Superior Técnico, Universidade de Lisboa
- ITI, LARSyS, Instituto Superior Técnico, Universidade de Lisboa
- Ana Paiva
- INESC-ID, Instituto Superior Técnico, Universidade de Lisboa
|
34
|
Leung AYM, Zhao IY, Lin S, Lau TK. Exploring the Presence of Humanoid Social Robots at Home and Capturing Human-Robot Interactions with Older Adults: Experiences from Four Case Studies. Healthcare (Basel) 2022; 11:39. [PMID: 36611499 PMCID: PMC9818881 DOI: 10.3390/healthcare11010039]
Abstract
BACKGROUND Social robots have the potential to bring benefits to aged care. However, it is uncertain whether placing these robots in older people's homes is acceptable and whether human-robot interactions would occur. METHODS Four case studies were conducted to understand the experiences of older adults and family caregivers when the humanoid social robot Ka Ka was placed in their homes for two weeks. RESULTS Four older adults and three family caregivers were involved. Older adults interacted with the social robot Ka Ka every day during the study period. 'Talking to Ka Ka', 'listening to music', 'using the calendar reminder', and 'listening to the weather report' were the most commonly used features. Qualitative data highlighted the strengths of Ka Ka, such as providing emotional support to older adults living alone, diversifying their daily activities, and enhancing family relationships. Ka Ka's voice (female, soft, and pleasing to the ear) was considered to 'bring a pleasant feeling' to older adults. CONCLUSIONS To support aging-in-place and fill the gaps created by the intensified shortage of health and social care manpower, it is of prime importance to develop reliable and age-friendly AI-based robotic services that meet the needs and preferences of older adults and caregivers.
Affiliation(s)
- Angela Y. M. Leung
- WHO Collaborating Centre for Community Health Services, School of Nursing, The Hong Kong Polytechnic University, Hong Kong 999077, China
- Research Institute of Smart Aging (RISA), The Hong Kong Polytechnic University, Hong Kong 999077, China
- Correspondence: Tel.: +852-2766-5587
- Ivy Y. Zhao
- WHO Collaborating Centre for Community Health Services, School of Nursing, The Hong Kong Polytechnic University, Hong Kong 999077, China
- Research Institute of Smart Aging (RISA), The Hong Kong Polytechnic University, Hong Kong 999077, China
- Shuanglan Lin
- WHO Collaborating Centre for Community Health Services, School of Nursing, The Hong Kong Polytechnic University, Hong Kong 999077, China
- Terence K. Lau
- WHO Collaborating Centre for Community Health Services, School of Nursing, The Hong Kong Polytechnic University, Hong Kong 999077, China
- Research Institute of Smart Aging (RISA), The Hong Kong Polytechnic University, Hong Kong 999077, China
|
35
|
A Systematic Review on Social Robots in Public Spaces: Threat Landscape and Attack Surface. Computers 2022. [DOI: 10.3390/computers11120181]
Abstract
There is a growing interest in using social robots in public spaces for indoor and outdoor applications. The threat landscape is an important research area being investigated and debated by various stakeholders. Objectives: This study aims to identify and synthesize empirical research on the complete threat landscape of social robots in public spaces. Specifically, this paper identifies the potential threat actors, their motives for attacks, vulnerabilities, attack vectors, potential impacts of attacks, possible attack scenarios, and mitigations to these threats. Methods: This systematic literature review follows the guidelines by Kitchenham and Charters. The search was conducted in five digital databases, and 1469 studies were retrieved. This study analyzed 21 studies that satisfied the selection criteria. Results: The main findings reveal four threat categories: cybersecurity, social, physical, and public space. Conclusion: This study captures the complexity of the transdisciplinary problem of social robot security and privacy while accommodating the diversity of stakeholders' perspectives. The findings give researchers and other stakeholders a comprehensive view by highlighting current developments and new research directions in this field. This study also proposes a taxonomy for threat actors and the threat landscape of social robots in public spaces.
|
36
|
Competence-Aware Systems. Artif Intell 2022. [DOI: 10.1016/j.artint.2022.103844]
|
37
|
Vaitonytė J, Alimardani M, Louwerse MM. Scoping review of the neural evidence on the uncanny valley. Computers in Human Behavior Reports 2022. [DOI: 10.1016/j.chbr.2022.100263]
|
38
|
Osawa H, Horino K, Sato T. Social agents as catalysts: Social dynamics in the classroom with book introduction robot. Front Robot AI 2022; 9:934325. [PMID: 36504495 PMCID: PMC9726744 DOI: 10.3389/frobt.2022.934325]
Abstract
One possible benefit of robot-mediated education is that the robot can become a catalyst between people and facilitate learning. In this study, the authors focused on an asynchronous active learning method mediated by robots. Active learning is believed to help students continue learning and develop the ability to think independently. The authors therefore improved the UGA (User Generated Agent) system they had created for long-term active learning during the COVID-19 pandemic, creating an environment in which children introduce books to each other via robots. They installed the robot in an elementary school and conducted an experiment lasting more than a year. The results confirmed that the robot continued to be used over this long period without the children losing interest. The authors also examined how children created content by analyzing the contributions with particularly high view counts. In particular, they observed changes in children's behavior, such as spontaneous advertising activities, guidance from upperclassmen to lowerclassmen, collaboration among multiple children, and increased interest in technology, even under conditions where the coronavirus was spreading and children's social interaction was inhibited.
Affiliation(s)
- Hirotaka Osawa
- Human-Agent Interaction Laboratory, Faculty of Science and Technology, Keio University, Yokohama, Kanagawa, Japan
- *Correspondence: Hirotaka Osawa
- Kohei Horino
- Human-Agent Interaction Laboratory, Graduate School of Science and Technology, University of Tsukuba, Tsukuba, Ibaraki, Japan
- Takuya Sato
- Human-Agent Interaction Laboratory, Graduate School of Science and Technology, University of Tsukuba, Tsukuba, Ibaraki, Japan
|
39
|
Desideri L, Cesario L, Sidoti C, Malavasi M. Immersive robotic telepresence system to support a person with intellectual and motor disabilities perform a daily task: a case study. Journal of Enabling Technologies 2022. [DOI: 10.1108/jet-05-2022-0042]
Abstract
Purpose: In this proof-of-concept study, the authors assessed the feasibility of using a humanoid robot controlled remotely via an immersive telepresence system to support a person with intellectual and motor disabilities performing a daily task (i.e. setting a table for lunch).
Design/methodology/approach: The system involved a head-mounted display and two joysticks. A teleoperator was able to see through the video cameras of the robot and deliver the instructions verbally to the participant located in a different room. To assess the system, a baseline phase (A) was followed by an intervention (i.e. tele-operated support) phase (B) and a return to a baseline phase (A).
Findings: Data showed a marked increase in the average frequency of task steps correctly performed from baseline (M = 15%) to intervention (M = 93%). Accuracy reached 100% in the return to baseline.
Originality/value: These preliminary findings, along with qualitative feedback from users, suggest that an immersive telepresence system may be used to provide remote support to people with intellectual and motor disabilities.
|
40
|
Bond formation with pet-robots: An integrative approach. Current Psychology 2022. [DOI: 10.1007/s12144-022-03792-7]
Abstract
The challenge of long-term interaction between humans and robots is still a bottleneck in service robot research. To gain an understanding of sustained relatedness with robots, this study proposes a conceptual framework for bond formation. More specifically, it addresses the dynamics of children bonding with robotic pets as the basis for certain services in healthcare and education. The framework offers an integrative approach and draws from theoretical models and empirical research in Human-Robot Interaction, as well as from related disciplines that investigate lasting relationships, such as human-animal affiliation and attachment to everyday objects. The research question is how children's relatedness to personified technologies occurs and evolves, and what underpinning processes are involved. The subfield of research is child-robot interaction, within the boundaries of social psychology, where the robot is viewed as a social agent, and human-system interaction, where the robot is regarded as an artificial entity. The proposed framework envisions bonding with pet-robots as a socio-affective process towards lasting connectedness and emotional involvement that evolves through three stages: first encounter, short-term interaction, and lasting relationship. The stages are characterized by children's behaviors, cognitions and feelings that can be identified, measured and, perhaps more importantly, managed. This model aims to integrate fragmentary and heterogeneous knowledge into a new perspective on the impact of robots in close and enduring proximity to children.
|
41
|
Zinina A, Kotov A, Arinkin N, Zaidelman L. Learning a foreign language vocabulary with a companion robot. Cogn Syst Res 2022. [DOI: 10.1016/j.cogsys.2022.10.007]
|
42
|
Pauketat JV, Anthis JR. Predicting the moral consideration of artificial intelligences. Computers in Human Behavior 2022. [DOI: 10.1016/j.chb.2022.107372]
|
43
|
Cui Z, Wang W, Xia H, Wang C, Tu J, Ji S, Tan JMR, Liu Z, Zhang F, Li W, Lv Z, Li Z, Guo W, Koh NY, Ng KB, Feng X, Zheng Y, Chen X. Freestanding and Scalable Force-Softness Bimodal Sensor Arrays for Haptic Body-Feature Identification. Adv Mater 2022; 34:e2207016. [PMID: 36134530 DOI: 10.1002/adma.202207016]
Abstract
Tactile technologies that can identify human body features are valuable in clinical diagnosis and human-machine interactions. Previously, cutting-edge tactile platforms have been able to identify structured non-living objects; however, identification of human body features remains challenging, mainly because of their irregular contours and heterogeneous spatial distribution of softness. Here, freestanding and scalable tactile platforms of force-softness bimodal sensor arrays are developed, enabling tactile gloves to identify body features using machine-learning methods. The bimodal sensors are engineered by adding a protrusion to a piezoresistive pressure sensor, endowing the resistance signals with combined information on pressure and the softness of samples. This simple design enables 112 bimodal sensors to be integrated into a thin, conformal, and stretchable tactile glove, allowing tactile information to be digitalized while hand skills are performed on the human body. The tactile glove shows high accuracy (98%) in identifying four body features of a real person and four organ models (healthy and pathological) inside an abdominal simulator, demonstrating the bimodal tactile platforms' ability to identify body features and their potential use in future healthcare and robotics.
Affiliation(s)
- Zequn Cui
- Innovative Center for Flexible Devices (iFLEX) & Max Planck-NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Wensong Wang
- School of Electrical & Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Huarong Xia
- Innovative Center for Flexible Devices (iFLEX) & Max Planck-NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Changxian Wang
- Innovative Center for Flexible Devices (iFLEX) & Max Planck-NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Jiaqi Tu
- Innovative Center for Flexible Devices (iFLEX) & Max Planck-NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Institute of Flexible Electronics Technology of THU, Zhejiang, Jiaxing, 314000, China
- Shaobo Ji
- Innovative Center for Flexible Devices (iFLEX) & Max Planck-NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Joel Ming Rui Tan
- School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Zhihua Liu
- Institute of Materials Research and Engineering, the Agency for Science, Technology and Research, 2 Fusionopolis Way, Innovis, #08-03, Singapore, 138634, Singapore
- Feilong Zhang
- Innovative Center for Flexible Devices (iFLEX) & Max Planck-NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Wenlong Li
- Institute of Materials Research and Engineering, the Agency for Science, Technology and Research, 2 Fusionopolis Way, Innovis, #08-03, Singapore, 138634, Singapore
- Zhisheng Lv
- Institute of Materials Research and Engineering, the Agency for Science, Technology and Research, 2 Fusionopolis Way, Innovis, #08-03, Singapore, 138634, Singapore
- Zheng Li
- Innovative Center for Flexible Devices (iFLEX) & Max Planck-NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Wei Guo
- Innovative Center for Flexible Devices (iFLEX) & Max Planck-NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Nien Yue Koh
- Lee Kong Chian School of Medicine, Novena Campus, Nanyang Technological University, 11 Mandalay Road, Singapore, 308232, Singapore
- Kian Bee Ng
- Lee Kong Chian School of Medicine, Novena Campus, Nanyang Technological University, 11 Mandalay Road, Singapore, 308232, Singapore
- Xue Feng
- Laboratory of Flexible Electronics Technology, Tsinghua University, Beijing, 100190, China
- Yuanjin Zheng
- School of Electrical & Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Xiaodong Chen
- Innovative Center for Flexible Devices (iFLEX) & Max Planck-NTU Joint Lab for Artificial Senses, School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798, Singapore
- Institute of Materials Research and Engineering, the Agency for Science, Technology and Research, 2 Fusionopolis Way, Innovis, #08-03, Singapore, 138634, Singapore
- Institute for Digital Molecular Analytics and Science (IDMxS), Nanyang Technological University, 59 Nanyang Drive, Singapore, 636921, Singapore
44
Pareto L, Ekström S, Serholt S. Children’s learning-by-teaching with a social robot versus a younger child: Comparing interactions and tutoring styles. Front Robot AI 2022; 9:875704. [DOI: 10.3389/frobt.2022.875704] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/14/2022] [Accepted: 09/20/2022] [Indexed: 11/05/2022] Open
Abstract
Human peer tutoring is known to be effective for learning, and social robots are currently being explored for robot-assisted peer tutoring. In peer tutoring, not only the tutee but also the tutor benefits from the activity. Exploiting the learning-by-teaching mechanism, robots as tutees can be a promising approach for tutor learning. This study compares robots and humans by examining children's learning-by-teaching with a social robot and with younger children, respectively. The study comprised a small-scale field experiment in a Swedish primary school, following a within-subject design. Ten sixth-grade students (age 12–13) assigned as tutors conducted two 30 min peer tutoring sessions each, one with a robot tutee and one with a third-grade student (age 9–10) as the tutee. The tutoring task consisted of teaching the tutee to play a two-player educational game designed to promote conceptual understanding and mathematical thinking. The tutoring sessions were video recorded, and verbal actions were transcribed and extended with crucial game actions and user gestures, to explore differences in interaction patterns between the two conditions. The IRFCE tutoring framework, an extension of the classical initiation–response–feedback framework for classroom interactions, was modified and used as an analytic lens. Actors, tutoring actions, and teaching interactions were examined and coded as they unfolded in the respective child–robot and child–child interactions during the sessions. Significant differences between the robot tutee and child tutee conditions regarding action frequencies and characteristics were found, concerning tutee initiatives, tutee questions, tutor explanations, tutee involvement, and evaluation feedback. We have identified ample opportunities for the tutor to learn from teaching in both conditions, for different reasons.
The child tutee condition provided opportunities to engage in explanations to the tutee, experience smooth collaboration, and gain motivation through social responsibility for the younger child. The robot tutee condition provided opportunities to answer challenging questions from the tutee, receive plenty of feedback, and communicate using mathematical language. Hence, both conditions provide good learning opportunities for a tutor, but in different ways.
45
Esfandbod A, Nourbala A, Rokhi Z, Meghdari AF, Taheri A, Alemi M. Design, Manufacture, and Acceptance Evaluation of APO: A Lip-syncing Social Robot Developed for Lip-reading Training Programs. Int J Soc Robot 2022:1-15. [PMID: 36320591 PMCID: PMC9614198 DOI: 10.1007/s12369-022-00933-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/27/2022] [Indexed: 11/27/2022]
Abstract
Lack of educational facilities for the burgeoning world population, financial barriers, and the growing tendency in favor of inclusive education have all helped channel a general inclination toward using various educational assistive technologies, e.g., socially assistive robots. Employing social robots in diverse educational scenarios could enhance learners' achievements by motivating them and sustaining their level of engagement. This study is devoted to manufacturing and investigating the acceptance of a novel social robot named APO, designed to improve hearing-impaired individuals' lip-reading skills through an educational game. To accomplish the robot's objective, we proposed and implemented a lip-syncing system on the APO social robot. The proposed robot's potential with regard to its primary goals, tutoring and practicing lip-reading, was examined through two main experiments. The first experiment was dedicated to evaluating the clarity of the utterances articulated by the robot. The evaluation was quantified by comparing the robot's articulation of words with a video of a human teacher lip-syncing the same words. In this inspection, due to the adults' advanced skill in lip-reading compared to children, twenty-one adult participants were asked to identify the words lip-synced in the two scenarios (the articulation of the robot and the video recorded from the human teacher). Subsequently, the number of words that participants correctly recognized from the robot and the human teacher articulations was considered a metric to evaluate the caliber of the designed lip-syncing system. The outcome of this experiment revealed that no significant differences were observed between the participants' recognition of the robot and the human tutor's articulation of multisyllabic words. 
Following the validation of the proposed articulatory system, the acceptance of the robot by a group of hearing-impaired participants, eighteen adults and sixteen children, was scrutinized in the second experiment. The adults and the children were asked to fill in two standard questionnaires, UTAUT and SAM, respectively. Our findings revealed that the robot acquired higher scores than the lip-syncing video on most of the questionnaires' items, which could be interpreted as a greater intention to use the APO robot as an assistive technology for lip-reading instruction among adults and children.
Affiliation(s)
- Alireza Esfandbod
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Ahmad Nourbala
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Zeynab Rokhi
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Ali F. Meghdari
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Fereshtegaan International Branch, Chancellor, Islamic Azad University, Tehran, Iran
- Alireza Taheri
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Minoo Alemi
- Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics, and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
- Department of Humanities, West Tehran Branch, Islamic Azad University, Tehran, Iran
46
Vaitonytė J, Alimardani M, Louwerse MM. Corneal reflections and skin contrast yield better memory of human and virtual faces. Cogn Res Princ Implic 2022; 7:94. [PMID: 36258062 PMCID: PMC9579222 DOI: 10.1186/s41235-022-00445-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2022] [Accepted: 10/07/2022] [Indexed: 11/10/2022] Open
Abstract
Virtual faces have been found to be rated less human-like and remembered less well than photographic images of humans. What it is in virtual faces that yields reduced memory has so far remained unclear. The current study investigated face memory in the context of virtual agent faces and human faces, real and manipulated, considering two factors of predicted influence, i.e., corneal reflections and skin contrast. Corneal reflections referred to the bright points in each eye that occur when ambient light reflects from the surface of the cornea. Skin contrast referred to the degree to which the skin surface is rough versus smooth. We conducted two memory experiments, one with high-quality virtual agent faces (Experiment 1) and the other with photographs of human faces that were manipulated (Experiment 2). Experiment 1 showed better memory for virtual faces with increased corneal reflections and skin contrast (rougher rather than smoother skin). Experiment 2 replicated these findings, showing that removing the corneal reflections and smoothing the skin reduced memory recognition of manipulated faces, with a stronger effect exerted by the eyes than the skin. This study highlights specific features of the eyes and skin that can help explain memory discrepancies between real and virtual faces and in turn elucidates the factors that play a role in the cognitive processing of faces.
Affiliation(s)
- Julija Vaitonytė
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Dante Building D 134, Warandelaan 2, 5037 AB Tilburg, The Netherlands
- Maryam Alimardani
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Dante Building D 134, Warandelaan 2, 5037 AB Tilburg, The Netherlands
- Max M. Louwerse
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Dante Building D 134, Warandelaan 2, 5037 AB Tilburg, The Netherlands
47
van den Berghe R. Social robots in a translanguaging pedagogy: A review to identify opportunities for robot-assisted (language) learning. Front Robot AI 2022; 9:958624. [PMID: 36313250 PMCID: PMC9613956 DOI: 10.3389/frobt.2022.958624] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2022] [Accepted: 08/18/2022] [Indexed: 11/13/2022] Open
Abstract
This mini review discusses the use of social robots in a translanguaging pedagogy: the use of robots to enable students to use their full linguistic repertoire within schools, that is, any language they speak at home or in another aspect of their lives. Current research on robot-assisted second-language learning is reviewed with the aim of finding out whether students' languages have been employed strategically to support learning of another language. A total of 83 articles were analyzed for the use of first and second languages in student-robot interactions. Most interactions were either exclusively in the second language, or exclusively in the first language with only target words in the second language. Few studies strategically mixed the two languages to bootstrap learning, and only one study used the first language of students with migrant backgrounds to learn the second language. The review concludes with recommendations for future use of social robots in a translanguaging pedagogy.
48
Rohlfing KJ, Altvater-Mackensen N, Caruana N, van den Berghe R, Bruno B, Tolksdorf NF, Hanulíková A. Social/dialogical roles of social robots in supporting children’s learning of language and literacy—A review and analysis of innovative roles. Front Robot AI 2022; 9:971749. [PMID: 36274914 PMCID: PMC9581183 DOI: 10.3389/frobt.2022.971749] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/17/2022] [Accepted: 08/19/2022] [Indexed: 11/16/2022] Open
Abstract
One of the many purposes for which social robots are designed is education, and there have been many attempts to systematize their potential in this field. What these attempts have in common is the recognition that learning can be supported in a variety of ways, because a learner can be engaged in different activities that foster learning. Up to now, three roles have been proposed when designing these activities for robots: as a teacher or tutor, a learning peer, or a novice. Current research proposes that deciding in favor of one role over another depends on the content or preferred pedagogical form. However, the design of activities changes not only the content of learning, but also the nature of a human–robot social relationship. This is particularly important in language acquisition, which has been recognized as a social endeavor. The following review aims to specify the differences in human–robot social relationships when children learn language through interacting with a social robot. After proposing categories for comparing these different relationships, we review established and more specific, innovative roles that a robot can play in language-learning scenarios. This follows Mead's (1946) theoretical approach proposing that social roles are performed in interactive acts. These acts are crucial for learning, because not only can they shape the social environment of learning but also engage the learner to different degrees. We specify the degree of engagement by referring to Chi's (2009) progression of learning activities, which range from active through constructive to interactive, with the latter fostering deeper learning. Taken together, this approach enables us to compare and evaluate different human–robot social relationships that arise when applying a robot in a particular social role.
Affiliation(s)
- Katharina J. Rohlfing
- Developmental Psycholinguistics, Faculty of Arts and Humanities, Paderborn University, Paderborn, Germany
- *Correspondence: Katharina J. Rohlfing
- Nicole Altvater-Mackensen
- Developmental Psychology, Psychologisches Institut, Johannes-Gutenberg-Universität Mainz, English Linguistics, University of Mannheim, Mainz, Germany
- Nathan Caruana
- School of Psychological Science, Macquarie University Centre for Reading, Macquarie University, Sydney, NSW, Australia
- Rianne van den Berghe
- Urban Care & Education, Windesheim University of Applied Sciences, Almere, Netherlands
- Nils F. Tolksdorf
- Developmental Psycholinguistics, Faculty of Arts and Humanities, Paderborn University, Paderborn, Germany
- Adriana Hanulíková
- Language and Cognition, Deutsches Seminar, Albert-Ludwigs-Universität Freiburg, Freiburg, Germany
49
Triantafyllidis A, Alexiadis A, Elmas D, Gerovasilis G, Votis K, Tzovaras D. A social robot-based platform for health behavior change toward prevention of childhood obesity. Univers Access Inf Soc 2022; 22:1-11. [PMID: 36211232 PMCID: PMC9526206 DOI: 10.1007/s10209-022-00922-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Accepted: 09/20/2022] [Indexed: 06/16/2023]
Abstract
Childhood obesity is a major public health challenge which is linked with the occurrence of diseases such as diabetes and cancer. The COVID-19 pandemic has forced changes to the lifestyle behaviors of children, thereby making the risk of developing obesity even greater. Novel preventive tools and approaches are required to fight childhood obesity. We present a social robot-based platform which utilizes an interactive motivational strategy in communication with children, collects self-reports through the touch of tangible objects, and processes behavioral data, aiming to: (a) screen and assess the behaviors of children in the dimensions of physical activity, diet, and education, and (b) recommend individualized goals for health behavior change. The platform was integrated through a microservice architecture within a multi-component system targeting childhood obesity prevention. The platform was evaluated in an experimental study with 30 children aged 9-12 years in a real-life school setting, showing children's acceptance of it, and an 80% success rate in achieving the weekly personal health goals recommended by the social robot-based platform. The results provide preliminary evidence on the implementation feasibility and potential of the social robot-based platform toward the betterment of children's health behaviors in the context of childhood obesity prevention. Further rigorous longer-term studies are required.
Affiliation(s)
- Andreas Triantafyllidis
- Information Technologies Institute, Centre for Research and Technology Hellas (CERTH), 57001 Thessaloníki, Greece
- Anastasios Alexiadis
- Information Technologies Institute, Centre for Research and Technology Hellas (CERTH), 57001 Thessaloníki, Greece
- Dimosthenis Elmas
- Information Technologies Institute, Centre for Research and Technology Hellas (CERTH), 57001 Thessaloníki, Greece
- Georgios Gerovasilis
- Information Technologies Institute, Centre for Research and Technology Hellas (CERTH), 57001 Thessaloníki, Greece
- Konstantinos Votis
- Information Technologies Institute, Centre for Research and Technology Hellas (CERTH), 57001 Thessaloníki, Greece
- Dimitrios Tzovaras
- Information Technologies Institute, Centre for Research and Technology Hellas (CERTH), 57001 Thessaloníki, Greece
50
Morillo-Mendez L, Schrooten MGS, Loutfi A, Mozos OM. Age-Related Differences in the Perception of Robotic Referential Gaze in Human-Robot Interaction. Int J Soc Robot 2022:1-13. [PMID: 36185773 PMCID: PMC9510350 DOI: 10.1007/s12369-022-00926-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/08/2022] [Indexed: 11/12/2022]
Abstract
There is increased interest in using social robots to assist older adults during their daily life activities. As social robots are designed to interact with older users, it becomes relevant to study these interactions under the lens of social cognition. Gaze following, the social ability to infer where other people are looking, deteriorates with older age. Therefore, referential gaze from robots might not be an effective social cue to indicate spatial locations to older users. In this study, we explored the performance of older adults, middle-aged adults, and younger controls in a task assisted by the referential gaze of a Pepper robot. We examined age-related differences in task performance, and in self-reported social perception of the robot. Our main findings show that referential gaze from a robot benefited task performance, although the magnitude of this facilitation was lower for older participants. Moreover, perceived anthropomorphism of the robot varied less as a result of its referential gaze in older adults. This research supports that social robots, even if limited in their gazing capabilities, can be effectively perceived as social entities. Additionally, this research suggests that robotic social cues, usually validated with young participants, might be less optimal signs for older adults. Supplementary Information: The online version contains supplementary material available at 10.1007/s12369-022-00926-6.
Affiliation(s)
- Lucas Morillo-Mendez
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
- Amy Loutfi
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
- Oscar Martinez Mozos
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden