1.
Jording M, Hartz A, Vogel DHV, Schulte-Rüther M, Vogeley K. Impaired recognition of interactive intentions in adults with autism spectrum disorder not attributable to differences in visual attention or coordination via eye contact and joint attention. Sci Rep 2024; 14:8297. [PMID: 38594289] [PMCID: PMC11004189] [DOI: 10.1038/s41598-024-58696-2]
Abstract
Altered nonverbal communication patterns, especially in gaze interactions, are commonly reported for persons with autism spectrum disorder (ASD). In this study we investigate and differentiate, for the first time, the interplay of attention allocation, the establishment of a shared focus (eye contact and joint attention), and the recognition of intentions in gaze interactions in adults with ASD compared to control persons. Participants interacted via gaze with a virtual character (VC) that they believed was controlled by another person. Participants were instructed to ascertain whether their partner was trying to interact with them. In fact, the VC was fully algorithm-controlled and showed either interactive or non-interactive gaze behavior. Compared to participants without ASD, participants with ASD were specifically impaired in ascertaining whether their partner was trying to interact with them, whereas neither the allocation of attention nor the ability to establish a shared focus was affected. Thus, the perception and production of gaze cues seem preserved, while the evaluation of gaze cues appears to be impaired. An additional exploratory analysis suggests that especially the interpretation of contingencies between the interactants' actions is altered in ASD and should be investigated more closely.
Affiliation(s)
- Mathis Jording
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany.
- Department of Psychiatry, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany.
- Arne Hartz
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University Hospital RWTH, Aachen, Germany
- David H V Vogel
- Department of Psychiatry, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany
- Department of Neurology, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany
- Martin Schulte-Rüther
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Department of Child and Adolescent Psychiatry, Center for Psychosocial Medicine - University Hospital Heidelberg, Ruprechts-Karls University Heidelberg, Heidelberg, Germany
- Department of Child and Adolescent Psychiatry and Psychotherapy, University Medical Center Göttingen, Georg-August University Göttingen, Göttingen, Germany
- Kai Vogeley
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
- Department of Psychiatry, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany
2.
Carneiro T, Carvalho A, Frota S, Filipe MG. Serious Games for Developing Social Skills in Children and Adolescents with Autism Spectrum Disorder: A Systematic Review. Healthcare (Basel) 2024; 12:508. [PMID: 38470619] [PMCID: PMC10931397] [DOI: 10.3390/healthcare12050508]
Abstract
Serious games represent a promising avenue for intervention with children diagnosed with autism spectrum disorder, a neurodevelopmental disorder marked by persistent challenges in social communication and the presence of restricted, repetitive behaviors. Despite this potential, comprehensive reviews on this subject are scarce. This systematic review aims to evaluate the effectiveness of serious games, and their specific characteristics, in enhancing social skills among children and adolescents with autism. Employing PICO strategies and adhering to PRISMA guidelines, we screened 149 studies initially identified through the PubMed and EBSCOhost databases. Nine studies met the inclusion criteria; these reported a positive influence of serious games on social skills and related domains, encompassing emotion recognition/encoding/decoding, emotional regulation, eye gaze, joint attention, and behavioral skills. Nevertheless, despite these promising results, the limited available evidence underscores the need for rigorous study designs to consolidate findings and integrate evidence-based intervention strategies.
Affiliation(s)
- Tânia Carneiro
- Center of Linguistics, School of Arts and Humanities, University of Lisbon, 1600-214 Lisboa, Portugal
- António Carvalho
- Faculty of Psychology, Education, and Sports, Lusófona University, 4000-098 Porto, Portugal
- Sónia Frota
- Center of Linguistics, School of Arts and Humanities, University of Lisbon, 1600-214 Lisboa, Portugal
- Marisa G. Filipe
- Center of Linguistics, School of Arts and Humanities, University of Lisbon, 1600-214 Lisboa, Portugal
3.
Gabrielli S, Cristofolini M, Dianti M, Alvari G, Vallefuoco E, Bentenuto A, Venuti P, Mayora Ibarra O, Salvadori E. Co-Design of a Virtual Reality Multiplayer Adventure Game for Adolescents With Autism Spectrum Disorder: Mixed Methods Study. JMIR Serious Games 2023; 11:e51719. [PMID: 38064258] [PMCID: PMC10746967] [DOI: 10.2196/51719]
Abstract
BACKGROUND Virtual reality (VR) adventure games can offer ideal technological solutions for training social skills in adolescents with autism spectrum disorder (ASD), leveraging their support for multisensory and multiplayer interactions over distance, which may lower barriers to training access and increase user motivation. However, the design of VR-based game environments for social skills training is still understudied and deserves an inclusive design approach to ensure acceptability by target users. OBJECTIVE We aimed to present the inclusive design process we followed to develop the Zentastic VR adventure game for social skills training in adolescents with ASD and to investigate its feasibility as a training environment for adolescents. METHODS The VR game supports multiplayer training sessions involving small groups of adolescents and their therapists, who act as facilitators. Adolescents with ASD and their therapists were involved in the design and in an exploratory acceptability study of an initial prototype of the gaming environment, as well as in a later multisession feasibility evaluation of the VR game's final release. RESULTS The feasibility study demonstrated good acceptability of the VR game by adolescents and an enhancement of their social skills from baseline to posttraining. CONCLUSIONS The findings provide preliminary evidence of the benefits that VR-based games can bring to the training of adolescents with ASD and, potentially, other neurodevelopmental disorders.
Affiliation(s)
- Silvia Gabrielli
- Digital Health Research, Fondazione Bruno Kessler, Trento, Italy
- Melanie Cristofolini
- ODFLab - Observational, Diagnosis and Education Lab, Department of Psychology and Cognitive Science, University of Trento, Trento, Italy
- Meeva Srl, Trento, Italy
- Marco Dianti
- Digital Health Research, Fondazione Bruno Kessler, Trento, Italy
- Meeva Srl, Trento, Italy
- Gianpaolo Alvari
- ODFLab - Observational, Diagnosis and Education Lab, Department of Psychology and Cognitive Science, University of Trento, Trento, Italy
- Ersilia Vallefuoco
- ODFLab - Observational, Diagnosis and Education Lab, Department of Psychology and Cognitive Science, University of Trento, Trento, Italy
- Arianna Bentenuto
- ODFLab - Observational, Diagnosis and Education Lab, Department of Psychology and Cognitive Science, University of Trento, Trento, Italy
- Cooperativa Socio Sanitaria Albero Blu, Trento, Italy
- Paola Venuti
- ODFLab - Observational, Diagnosis and Education Lab, Department of Psychology and Cognitive Science, University of Trento, Trento, Italy
4.
Schmälzle R, Lim S, Cho HJ, Wu J, Bente G. Examining the exposure-reception-retention link in realistic communication environments via VR and eye-tracking: The VR billboard paradigm. PLoS One 2023; 18:e0291924. [PMID: 38033032] [PMCID: PMC10688884] [DOI: 10.1371/journal.pone.0291924]
Abstract
Exposure is key to message effects. No effects can ensue if a health, political, or commercial message is not noticed. Yet, existing research in communication, advertising, and related disciplines often measures 'opportunities for exposure' at an aggregate level, whereas knowing whether recipients were 'actually exposed' to a message requires a micro-level approach. Micro-level research, on the other hand, focuses on message processing and retention, takes place under highly controlled laboratory conditions with forced message exposure, and largely ignores how recipients attend selectively to messages under more natural conditions. Eye-tracking enables us to assess actual exposure, but its previous applications were restricted to screen-based reading paradigms lacking ecological validity or field studies that suffer from limited experimental control. Our solution is to measure eye-tracking within an immersive VR environment that creates the message delivery and reception context. Specifically, we simulate a car ride down a highway alongside which billboards are placed. The VR headset (HP Omnicept Pro) provides an interactive 3D view of the environment and holds a seamlessly integrated binocular eye tracker that records the drivers' gaze and detects all fixations on the billboards. This allows us to quantify the nexus between exposure and reception rigorously, and to link our measures to subsequent memory, i.e., whether messages were remembered, forgotten, or not even encoded. An empirical study shows that incidental memory for messages differs based on participants' gaze behavior while passing the billboards. The study further shows how an experimental manipulation of attentional demands directly impacts drivers' gaze behavior and memory. We discuss the large potential of this paradigm to quantify exposure and message reception in realistic communication environments and the equally promising applications in new media contexts (e.g., the Metaverse).
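At its core, the exposure measure described in this paradigm turns a continuous stream of gaze samples into per-billboard dwell episodes. The authors' actual pipeline and the HP Omnicept SDK are not reproduced here; the following is a minimal sketch under assumed inputs, where each gaze sample has already been hit-tested against billboard surfaces and reduced to a (timestamp, billboard_id) pair, and the 100 ms minimum-dwell threshold is a hypothetical choice, not one taken from the study.

```python
# Illustrative sketch (not the authors' code): aggregate consecutive gaze
# samples landing on the same billboard into dwell episodes, and keep only
# episodes long enough to plausibly count as actual exposure.

def billboard_dwell(samples, min_dwell=0.100):
    """samples: iterable of (timestamp_s, billboard_id); billboard_id is None
    when gaze is off any billboard. Returns [(billboard_id, dwell_s), ...]."""
    episodes = []
    current = None
    start = prev_t = None
    for t, board in samples:
        if board != current:
            # close out the previous episode if it was on a billboard
            if current is not None and prev_t - start >= min_dwell:
                episodes.append((current, round(prev_t - start, 3)))
            current, start = board, t
        prev_t = t
    if current is not None and prev_t - start >= min_dwell:
        episodes.append((current, round(prev_t - start, 3)))
    return episodes

# Simulated 60 Hz gaze stream: gaze rests on billboard 'B1' for samples 6-29.
stream = [(i / 60, 'B1' if 6 <= i < 30 else None) for i in range(60)]
print(billboard_dwell(stream))  # → [('B1', 0.383)]
```

Dwell here is measured from the first to the last sample of an episode; a real pipeline would also need saccade filtering and blink handling before counting a billboard as "actually seen".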
Affiliation(s)
- Ralf Schmälzle
- Department of Communication, College of Communication Arts and Sciences, Michigan State University, East Lansing, Michigan, United States of America
- Sue Lim
- Department of Communication, College of Communication Arts and Sciences, Michigan State University, East Lansing, Michigan, United States of America
- Hee Jung Cho
- Department of Communication, College of Communication Arts and Sciences, Michigan State University, East Lansing, Michigan, United States of America
- Juncheng Wu
- Department of Communication, College of Communication Arts and Sciences, Michigan State University, East Lansing, Michigan, United States of America
- Gary Bente
- Department of Communication, College of Communication Arts and Sciences, Michigan State University, East Lansing, Michigan, United States of America
5.
Chu L, Shen L, Ma C, Chen J, Tian Y, Zhang C, Gong Z, Li M, Wang C, Pan L, Zhu P, Wu D, Wang Y, Yu G. Effects of a Nonwearable Digital Therapeutic Intervention on Preschoolers With Autism Spectrum Disorder in China: Open-Label Randomized Controlled Trial. J Med Internet Res 2023; 25:e45836. [PMID: 37616029] [PMCID: PMC10485722] [DOI: 10.2196/45836]
Abstract
BACKGROUND Autism spectrum disorder (ASD) is a neurodevelopmental disorder that can cause difficulty with communication and social interactions as well as complicated family dynamics. Digital health interventions can reduce treatment costs and promote healthy lifestyle changes, and such therapies can complement or replace traditional treatments. However, issues with cooperation and compliance prevent preschool patients with ASD from using these tools. In this open-label, randomized controlled trial, we developed a nonwearable digital therapy called virtual reality-incorporated cognitive behavioral therapy (VR-CBT). OBJECTIVE The aim of this study was to assess the adjunctive function of VR-CBT by comparing the effects of VR-CBT plus learning style profile (LSP) intervention with those of LSP-only intervention in preschool children with ASD. METHODS This trial was performed in China on 78 preschool children (age 3-6 years, IQ>70) diagnosed with ASD who were randomized to receive a 20-week VR-CBT plus LSP intervention (intervention group, 39/78, 50%) or LSP intervention only (control group, 39/78, 50%). The primary outcome was the change in scores from baseline to week 20, assessed using the parent-rated Autism Behavior Checklist (ABC). Secondary outcomes included the Childhood Autism Rating Scale (CARS), the Attention-Deficit/Hyperactivity Disorder Rating Scale-IV (ADHD-RS-IV), and behavioral performance data (accuracy and reaction time) in go/no-go tasks. All primary and secondary outcomes were analyzed in the intention-to-treat population. RESULTS After the intervention, there was an intervention effect on the total ABC score (β=-5.528; P<.001) and the CARS score (β=-1.365; P=.02). A similar trend was observed in the ABC subscales: sensory (β=-1.133; P=.047), relating (β=-1.512; P=.03), body and object use (β=-1.211; P=.03), and social and self-help (β=-1.593; P=.03). The intervention also showed a statistically significant effect on behavioral performance (go/no-go task accuracy, β=2.923; P=.04). Moreover, a significant improvement in ADHD hyperactivity-impulsivity symptoms was observed in the 53 children with comorbid ADHD based on the ADHD-RS-IV (β=-1.269; P=.02). No statistically significant intervention effect was detected in the language subscale of the ABC (β=-.080; P=.83). Girls in the intervention group showed larger improvements than girls in the control group in the sensory and body and object use subscales of the ABC, in the CARS score, and in go/no-go accuracy (all P<.05). A statistically significant intervention effect on hyperactivity-impulsivity symptoms was also observed in intervention group boys with comorbid ADHD compared with control group boys (β=-1.333; P=.03). CONCLUSIONS We found potentially positive effects of nonwearable digital therapy plus LSP on core symptoms associated with ASD, leading to a modest improvement in sensory and motor function and response inhibition, while reducing impulsivity and hyperactivity in preschoolers with both ASD and ADHD. VR-CBT was found to be an effective and feasible adjunctive digital tool. TRIAL REGISTRATION Chinese Clinical Trial Registry ChiCTR2100053165; http://www.chictr.org.cn/showproj.aspx?proj=137016.
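Behavioral performance in a go/no-go task of the kind used as a secondary outcome here is conventionally summarized as overall accuracy plus mean reaction time on correct "go" trials. A minimal scoring sketch follows; the trial record format and field names are our own illustration, not taken from the study.

```python
# Hypothetical trial records: (trial_type, responded, rt_ms). rt_ms is None
# when no response was made. A trial is correct when the child responds on a
# 'go' trial or withholds a response on a 'nogo' trial.

def score_go_nogo(trials):
    correct = 0
    go_rts = []  # reaction times on correct 'go' trials only
    for kind, responded, rt in trials:
        if kind == 'go' and responded:
            correct += 1
            go_rts.append(rt)
        elif kind == 'nogo' and not responded:
            correct += 1
    accuracy = correct / len(trials)
    mean_rt = sum(go_rts) / len(go_rts) if go_rts else None
    return accuracy, mean_rt

trials = [('go', True, 420), ('go', True, 380), ('go', False, None),
          ('nogo', False, None), ('nogo', True, 310), ('nogo', False, None)]
acc, rt = score_go_nogo(trials)
print(acc, rt)  # 4 of 6 trials correct; mean RT over the two correct go trials
```

A commission error (responding on 'nogo') is the usual index of the impulsivity the study links to ADHD symptoms; the sketch simply counts it as incorrect.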
Affiliation(s)
- Liting Chu
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Division of Child and Adolescent Health, Shanghai Municipal Center for Disease Control and Prevention, Shanghai, China
- Li Shen
- Clinical Research Center, Shanghai Jiao Tong University Affiliated Sixth People's Hospital, Shanghai, China
- Chenhuan Ma
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Jinjin Chen
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Yuan Tian
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Chuncao Zhang
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Zilan Gong
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Mengfan Li
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Chengjie Wang
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Lizhu Pan
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Peiying Zhu
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Danmai Wu
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Yu Wang
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- Guangjun Yu
- Department of Child Health Care, Shanghai Children's Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
- School of Medicine, The Chinese University of Hong Kong, Shenzhen, China
6.
McDonald DQ, DeJardin E, Sariyanidi E, Herrington JD, Tunç B, Zampella CJ, Schultz RT. Predicting Autism from Head Movement Patterns during Naturalistic Social Interactions. Proceedings of the 2023 7th International Conference on Medical and Health Informatics (ICMHI 2023), May 12-14, 2023, Kyoto, Japan. 2023:55-60. [PMID: 38699395] [PMCID: PMC11064057] [DOI: 10.1145/3608298.3608309]
Abstract
Autism spectrum disorder (ASD) is a neurodevelopmental condition characterized in part by difficulties in verbal and nonverbal social communication. Evidence indicates that autistic people, compared to neurotypical peers, exhibit differences in head movements, a key form of nonverbal communication. Despite the crucial role of head movements in social communication, research on this nonverbal cue is relatively scarce compared to other forms of nonverbal communication, such as facial expressions and gestures. There is a need for scalable, reliable, and accurate instruments for measuring head movements directly within the context of social interactions. In this study, we used computer vision and machine learning to examine the head movement patterns of neurotypical and autistic individuals during naturalistic, face-to-face conversations, at both the individual (monadic) and interpersonal (dyadic) levels. Our model predicts diagnostic status using dyadic head movement data with an accuracy of 80%, highlighting the value of head movement as a marker of social communication. The monadic data pipeline had lower accuracy (69.2%) compared to the dyadic approach, emphasizing the importance of studying back-and-forth social communication cues within a true social context. The proposed classifier is not intended for diagnostic purposes, and future research should replicate our findings in larger, more representative samples.
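The study's actual feature extraction and classifier are not detailed in this abstract, so the following is purely illustrative of what a *dyadic* head-movement feature might look like: windowed correlation between the head-yaw traces of two conversation partners, the kind of interpersonal coordination signal that a monadic pipeline cannot capture. All names, window sizes, and the synthetic data are our assumptions.

```python
import math
import random

# Illustrative only, not the paper's pipeline: quantify head-movement
# coordination between two interactants as the mean absolute Pearson
# correlation of their yaw traces over non-overlapping windows.

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    if var_a == 0 or var_b == 0:
        return 0.0
    return cov / math.sqrt(var_a * var_b)

def windowed_sync(yaw_a, yaw_b, win=30):
    """Mean |correlation| over non-overlapping windows of `win` samples."""
    n = min(len(yaw_a), len(yaw_b))
    corrs = [abs(pearson(yaw_a[i:i + win], yaw_b[i:i + win]))
             for i in range(0, n - win + 1, win)]
    return sum(corrs) / len(corrs) if corrs else 0.0

# Synthetic traces: one partner mirrors the other's movement, one does not.
random.seed(0)
a = [math.sin(0.2 * t) + 0.1 * random.gauss(0, 1) for t in range(300)]
coupled = [x + 0.1 * random.gauss(0, 1) for x in a]      # partner mirrors A
independent = [random.gauss(0, 1) for _ in range(300)]   # no coordination
print(windowed_sync(a, coupled) > windowed_sync(a, independent))
```

In a full pipeline, features like this (alongside monadic statistics such as movement amplitude and frequency) would feed a supervised classifier; the abstract's 80% vs. 69.2% comparison is exactly an argument that the dyadic variant carries signal the monadic one lacks.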
Affiliation(s)
- Ellis DeJardin
- Children's Hospital of Philadelphia, Philadelphia, PA, USA
- John D Herrington
- Children's Hospital of Philadelphia, Philadelphia, PA, USA
- University of Pennsylvania, Philadelphia, PA, USA
- Birkan Tunç
- Children's Hospital of Philadelphia, Philadelphia, PA, USA
- University of Pennsylvania, Philadelphia, PA, USA
- Robert T Schultz
- Children's Hospital of Philadelphia, Philadelphia, PA, USA
- University of Pennsylvania, Philadelphia, PA, USA
7.
From virtual to prosocial reality: The effects of prosocial virtual reality games on preschool children's prosocial tendencies in real life environments. Computers in Human Behavior 2023. [DOI: 10.1016/j.chb.2022.107546]
8.
Smiles and Angry Faces vs. Nods and Head Shakes: Facial Expressions at the Service of Autonomous Vehicles. Multimodal Technologies and Interaction 2023. [DOI: 10.3390/mti7020010]
Abstract
When deciding whether to cross the street or not, pedestrians take into consideration information provided both by vehicle kinematics and by the driver of an approaching vehicle. It will not be long, however, before drivers of autonomous vehicles (AVs) are unable to communicate their intention to pedestrians, as they will be engaged in activities unrelated to driving. External human–machine interfaces (eHMIs) have been developed to fill the resulting communication gap by offering pedestrians information about the situational awareness and intention of an AV. Several anthropomorphic eHMI concepts have employed facial expressions to communicate vehicle intention. The aim of the present study was to evaluate the efficiency of emotional (smile; angry expression) and conversational (nod; head shake) facial expressions in communicating vehicle intention (yielding; non-yielding). Participants completed a crossing intention task in which they had to decide whether or not to cross the street. Emotional expressions communicated vehicle intention more efficiently than conversational expressions, as evidenced by the lower response latency in the emotional expression condition. The implications of our findings for the development of anthropomorphic eHMIs that employ facial expressions to communicate vehicle intention are discussed.
9.
Mofatteh M, Mashayekhi MS, Arfaie S, Chen Y, Mirza AB, Fares J, Bandyopadhyay S, Henich E, Liao X, Bernstein M. Augmented and virtual reality usage in awake craniotomy: a systematic review. Neurosurg Rev 2022; 46:19. [PMID: 36529827] [PMCID: PMC9760592] [DOI: 10.1007/s10143-022-01929-7]
Abstract
Augmented and virtual reality (AR, VR) are becoming promising tools in neurosurgery. AR and VR can reduce challenges associated with conventional approaches via the simulation and mimicry of specific environments of choice for surgeons. Awake craniotomy (AC) enables the resection of lesions from eloquent brain areas while monitoring higher cortical and subcortical functions. Evidence suggests that both surgeons and patients benefit from the various applications of AR and VR in AC. This paper investigates the application of AR and VR in AC and assesses its prospective utility in neurosurgery. A systematic review of the literature was performed using the PubMed, Scopus, and Web of Science databases in accordance with the PRISMA guidelines. Our search yielded 220 articles; six articles comprising 118 patients were included in this review. VR was used in four papers and AR in the other two. Tumours were the most common pathology (108 patients), followed by vascular lesions (eight patients). VR was used for intraoperative mapping of language, vision, and social cognition, while AR was incorporated in preoperative training of white matter dissection and in intraoperative visualisation and navigation. Overall, patients and surgeons were satisfied with the applications of AR and VR in their cases. AR and VR can be safely incorporated during AC to supplement, augment, or even replace conventional approaches in neurosurgery. Future investigations are required to assess the feasibility of AR and VR in various phases of AC.
Affiliation(s)
- Mohammad Mofatteh
- School of Medicine, Dentistry and Biomedical Sciences, Queen's University Belfast, Belfast, UK.
- Saman Arfaie
- Department of Neurology and Neurosurgery, McGill University, Montreal, Quebec, Canada
- Department of Molecular and Cell Biology, University of California Berkeley, Berkeley, CA, USA
- Yimin Chen
- Department of Neurology, Foshan Sanshui District People's Hospital, Foshan, China
- Jawad Fares
- Department of Neurological Surgery, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA
- Northwestern Medicine Malnati Brain Tumor Institute, Feinberg School of Medicine, Lurie Comprehensive Cancer Center, Northwestern University, Chicago, IL, USA
- Soham Bandyopadhyay
- Nuffield Department of Surgical Sciences, Oxford University Global Surgery Group, University of Oxford, Oxford, UK
- Clinical Neurosciences, Clinical & Experimental Sciences, Faculty of Medicine, University of Southampton, Southampton, Hampshire, UK
- Wessex Neurological Centre, University Hospital Southampton NHS Foundation Trust, Southampton, UK
- Edy Henich
- Department of Medicine, McGill University, Montreal, Quebec, Canada
- Xuxing Liao
- Department of Neurosurgery, Foshan Sanshui District People's Hospital, Foshan, China
- Mark Bernstein
- Division of Neurosurgery, Department of Surgery, University of Toronto, University Health Network, Toronto, Ontario, Canada
- Temmy Latner Center for Palliative Care, Mount Sinai Hospital, University of Toronto, Toronto, Ontario, Canada
10.
Ghost on the Windshield: Employing a Virtual Human Character to Communicate Pedestrian Acknowledgement and Vehicle Intention. Information 2022. [DOI: 10.3390/info13090420]
Abstract
Pedestrians base their street-crossing decisions on vehicle-centric as well as driver-centric cues. In the future, however, drivers of autonomous vehicles will be preoccupied with non-driving related activities and will thus be unable to provide pedestrians with relevant communicative cues. External human–machine interfaces (eHMIs) hold promise for filling the expected communication gap by providing information about a vehicle's situational awareness and intention. In this paper, we present an eHMI concept that employs a virtual human character (VHC) to communicate pedestrian acknowledgement and vehicle intention (non-yielding; cruising; yielding). Pedestrian acknowledgement is communicated via gaze direction, while vehicle intention is communicated via facial expression. The effectiveness of the proposed anthropomorphic eHMI concept was evaluated in a monitor-based laboratory experiment in which participants performed a crossing intention task (self-paced, two-alternative forced choice) and their accuracy in making appropriate street-crossing decisions was measured. In each trial, participants were first presented with a 3D animated sequence of a VHC (male; female) that either looked directly at them or clearly to their right while producing an emotional (smile; angry expression; surprised expression), a conversational (nod; head shake), or a neutral (neutral expression; cheek puff) facial expression. They were then asked to imagine being pedestrians intending to cross a one-way street at a random uncontrolled location, seeing an autonomous vehicle equipped with the eHMI approaching from the right, and to indicate via mouse click whether or not they would cross the street in front of the oncoming vehicle.
In the final implementation of the concept, non-yielding intention is communicated by the VHC producing an angry expression, a surprised expression, or a head shake; cruising intention by the VHC puffing its cheeks; and yielding intention by the VHC nodding. This implementation proved highly effective in ensuring the safety of a single pedestrian, or even of two co-located pedestrians, without compromising traffic flow in either case. The implications for the development of intuitive, culture-transcending eHMIs that can support multiple pedestrians in parallel are discussed.
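The expression-to-intention mapping can be read directly off the abstract; the two-alternative scoring rule (a decision is appropriate when the pedestrian crosses only for a yielding vehicle) is our reading of the task. A hypothetical encoding, with identifier names of our own choosing:

```python
# Expression-to-intention mapping as described in the abstract's final
# implementation; keys and the scoring rule below are our own encoding.
INTENTION = {
    'angry': 'non-yielding',
    'surprised': 'non-yielding',
    'head_shake': 'non-yielding',
    'cheek_puff': 'cruising',
    'nod': 'yielding',
}

def safe_decision(expression, crossed):
    """A decision is appropriate iff the pedestrian crosses only on 'yielding'."""
    return crossed == (INTENTION[expression] == 'yielding')

# Four simulated trials; the last is an unsafe crossing on a head shake.
responses = [('nod', True), ('angry', False),
             ('cheek_puff', False), ('head_shake', True)]
accuracy = sum(safe_decision(e, c) for e, c in responses) / len(responses)
print(accuracy)  # → 0.75
```

Framing the task this way makes explicit why cruising and non-yielding collapse to the same safe behaviour (do not cross), even though the eHMI distinguishes them visually.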
11.
Zhang M, Ding H, Naumceska M, Zhang Y. Virtual Reality Technology as an Educational and Intervention Tool for Children with Autism Spectrum Disorder: Current Perspectives and Future Directions. Behav Sci (Basel) 2022; 12:138. [PMID: 35621435] [PMCID: PMC9137951] [DOI: 10.3390/bs12050138]
Abstract
The worldwide rising trend of autism spectrum disorder (ASD) calls for innovative and efficacious techniques for assessment and treatment. Virtual reality (VR) technology gains theoretical support from rehabilitation and pedagogical theories and offers a variety of capabilities in educational and interventional contexts with affordable products. VR is attracting increasing attention in the medical and healthcare industry, as it provides fully interactive three-dimensional simulations of real-world settings and social situations, which are particularly suitable for cognitive and performance training, including social and interaction skills. This review article offers a summary of current perspectives and evidence-based VR applications for children with ASD, with a primary focus on social communication, including social functioning, emotion recognition, and speech and language. Technology- and design-related limitations, as well as disputes over the application of VR to autism research and therapy, are discussed, and future directions of this emerging field are highlighted with regard to application expansion and improvement, technology enhancement, linguistic diversity, and the development of theoretical models and brain-based research.
Affiliation(s)
- Minyue Zhang
- Speech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University, Shanghai 200240, China
- Hongwei Ding
- Speech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University, Shanghai 200240, China
- Meri Naumceska
- 70 Flowers Association for Early Intervention and Education of Children with Autism, 1000 Skopje, North Macedonia
- Yang Zhang
- Department of Speech-Language-Hearing Sciences, University of Minnesota, Minneapolis, MN 55455, USA
12
Robles M, Namdarian N, Otto J, Wassiljew E, Navab N, Falter-Wagner C, Roth D. A Virtual Reality Based System for the Screening and Classification of Autism. IEEE Trans Vis Comput Graph 2022; 28:2168-2178. [PMID: 35171773; DOI: 10.1109/tvcg.2022.3150489]
Abstract
Autism, also known as Autism Spectrum Disorders or Autism Spectrum Conditions, is a neurodevelopmental condition characterized by repetitive behaviours and differences in communication and social interaction. As a consequence, many autistic individuals may struggle in everyday life, which sometimes manifests in depression, unemployment, or addiction. One crucial problem in patient support and treatment is the long waiting time to diagnosis, which has been estimated at thirteen months on average. Yet the earlier an intervention can take place, the better the patient can be supported, and early support has been identified as a crucial factor. We propose a system to support the screening of Autism Spectrum Disorders based on a virtual reality social interaction, namely a shopping experience, with an embodied agent. During this everyday interaction, behavioral responses are tracked and recorded. We analyze this behavior with machine learning approaches to classify, with high accuracy, participants from an autistic sample against a typically developed control sample, demonstrating the feasibility of the approach. We believe that such tools can strongly impact the way mental disorders are assessed and may help to identify further objective criteria and categorizations.
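The classification pipeline itself is not specified in the abstract. As a heavily simplified, illustrative sketch of the general approach (predicting group membership from tracked behavioral features), the snippet below runs a nearest-centroid classifier with leave-one-out cross-validation; the feature names and all data are invented for illustration, not taken from the study.

```python
import math

def euclid(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def centroid(rows):
    """Per-feature mean of a list of feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def loo_accuracy(features, labels):
    """Leave-one-out accuracy of a nearest-centroid classifier."""
    hits = 0
    for i, (f, true_label) in enumerate(zip(features, labels)):
        train = [(x, l) for j, (x, l) in enumerate(zip(features, labels)) if j != i]
        cents = {lab: centroid([x for x, l in train if l == lab])
                 for lab in set(labels)}
        pred = min(cents, key=lambda lab: euclid(f, cents[lab]))
        hits += pred == true_label
    return hits / len(labels)

# Invented toy features, e.g. [gaze-on-agent ratio, response latency in s]
features = [[0.8, 1.1], [0.7, 1.3], [0.9, 1.0],   # control sample
            [0.4, 2.6], [0.3, 2.9], [0.5, 2.4]]   # autistic sample
labels = ["TD", "TD", "TD", "ASD", "ASD", "ASD"]
print(loo_accuracy(features, labels))  # 1.0 on this cleanly separable toy data
```

Real pipelines of this kind typically use richer classifiers and many more features, but the leave-one-out structure shown here is the standard way to estimate accuracy on small clinical samples.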
13
Karami B, Koushki R, Arabgol F, Rahmani M, Vahabie AH. Effectiveness of Virtual/Augmented Reality-Based Therapeutic Interventions on Individuals With Autism Spectrum Disorder: A Comprehensive Meta-Analysis. Front Psychiatry 2021; 12:665326. [PMID: 34248702; PMCID: PMC8260941; DOI: 10.3389/fpsyt.2021.665326]
Abstract
In recent years, the application of virtual reality (VR) for therapeutic purposes has escalated dramatically. The favorable properties of VR for engaging patients with autism, in particular, have motivated an enormous body of investigations targeting autism-related disabilities with this technology. This study aims to provide a comprehensive meta-analysis evaluating the effectiveness of VR for the rehabilitation and training of individuals diagnosed with autism spectrum disorder. Accordingly, we conducted a systematic search of related databases and, after screening for inclusion criteria, reviewed 33 studies for more detailed analysis. Results revealed that individuals undergoing VR training show remarkable improvements, with a relatively large overall effect size (Hedges' g = 0.74). Furthermore, analyses of different skill domains indicated varying effectiveness. The strongest effect was observed for daily living skills (g = 1.15). The effect was moderate for other skills: g = 0.45 for cognitive skills, g = 0.46 for emotion regulation and recognition skills, and g = 0.69 for social and communication skills. Moreover, five studies that used augmented reality also showed promising efficacy (g = 0.92), which calls for more research on this tool. In conclusion, the application of VR-based settings in clinical practice is highly encouraged, although their standardization and customization require more research.
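The pooled effect sizes reported above are Hedges' g values, i.e. standardized mean differences with a small-sample bias correction. As a sketch of how a single study's g is computed from group summary statistics (the numbers below are illustrative, not taken from the meta-analysis; the correction factor is the usual approximation J = 1 - 3/(4df - 1)):

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Bias-corrected standardized mean difference (Hedges' g)."""
    df = n1 + n2 - 2
    s_pooled = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / df)
    d = (m1 - m2) / s_pooled      # Cohen's d
    j = 1 - 3 / (4 * df - 1)      # small-sample correction factor
    return j * d

# Illustrative numbers: intervention mean 5.0 (SD 1.0, n = 20)
# vs. control mean 4.0 (SD 1.0, n = 20)
print(round(hedges_g(5.0, 1.0, 20, 4.0, 1.0, 20), 3))  # 0.98
```

A meta-analysis then pools such per-study g values (weighted by their variances) to produce overall estimates like the g = 0.74 above.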
Affiliation(s)
- Behnam Karami
- School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), Tehran, Iran
- Cognitive Neuroscience Laboratory, German Primate Center, Goettingen, Germany
- School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Roxana Koushki
- School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), Tehran, Iran
- School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Fariba Arabgol
- School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Behavioral Science Research Center, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Maryam Rahmani
- School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Abdol-Hossein Vahabie
- Department of Psychology, Faculty of Psychology and Education, University of Tehran, Tehran, Iran
- Control and Intelligent Processing Center of Excellence (CIPCE), Cognitive Systems Laboratory, School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, Iran
14
Kegel LC, Frühholz S, Grunwald T, Mersch D, Rey A, Jokeit H. Temporal lobe epilepsy alters neural responses to human and avatar facial expressions in the face perception network. Brain Behav 2021; 11:e02140. [PMID: 33951323; PMCID: PMC8213650; DOI: 10.1002/brb3.2140]
Abstract
BACKGROUND AND OBJECTIVE Although avatars are now widely used in advertisement, entertainment, and business, no study has investigated whether brain lesions in neurological patients interfere with brain activation in response to dynamic avatar facial expressions. The aim of our event-related fMRI study was to compare brain activation differences between people with epilepsy and controls during the processing of fearful and neutral dynamic expressions displayed by human or avatar faces. METHODS Using functional magnetic resonance imaging (fMRI), we examined brain responses to dynamic facial expressions of trained actors and their avatar look-alikes in 16 people with temporal lobe epilepsy (TLE) and 26 controls. The actors' fearful and neutral expressions were recorded on video and conveyed onto their avatar look-alikes by face tracking. RESULTS Our fMRI results show that people with TLE exhibited reduced response differences between fearful and neutral expressions displayed by humans in the right amygdala and the left superior temporal sulcus (STS). Furthermore, TLE was associated with reduced response differences between human and avatar fearful expressions in the dorsal pathway of the face perception network (STS and inferior frontal gyrus) as well as in the medial prefrontal cortex. CONCLUSIONS Taken together, these findings suggest that brain responses to dynamic facial expressions are altered in people with TLE compared to neurologically healthy individuals, regardless of whether the face is human or computer-generated. In TLE, areas sensitive to dynamic facial features and associated with processes relating to the self and others are particularly affected when processing dynamic human and avatar expressions. Our findings highlight that the impact of TLE on facial emotion processing extends to artificial faces and should be considered when applying dynamic avatars in the context of neurological conditions.
Affiliation(s)
- Lorena Chantal Kegel
- Swiss Epilepsy Center, Zurich, Switzerland
- Department of Psychology, University of Zurich, Zurich, Switzerland
- Sascha Frühholz
- Department of Psychology, University of Zurich, Zurich, Switzerland
- Dieter Mersch
- Institute for Critical Theory, Zurich University of the Arts, Zurich, Switzerland
- Anton Rey
- Institute for the Performing Arts and Film, Zurich University of the Arts, Zurich, Switzerland
- Hennric Jokeit
- Swiss Epilepsy Center, Zurich, Switzerland
- Department of Psychology, University of Zurich, Zurich, Switzerland
15
Drewes J, Feder S, Einhäuser W. Gaze During Locomotion in Virtual Reality and the Real World. Front Neurosci 2021; 15:656913. [PMID: 34108857; PMCID: PMC8180583; DOI: 10.3389/fnins.2021.656913]
Abstract
How vision guides gaze in realistic settings has been researched for decades. Human gaze behavior is typically measured in laboratory settings that are well controlled but feature-reduced and movement-constrained, in sharp contrast to real-life gaze control, which combines eye, head, and body movements. Previous real-world research has shown environmental factors such as terrain difficulty to affect gaze; however, real-world settings are difficult to control or replicate. Virtual reality (VR) offers the experimental control of a laboratory, yet approximates the freedom and visual complexity of the real world (RW). We measured gaze data in 8 healthy young adults during walking in the RW and during simulated locomotion in VR. Participants walked along a pre-defined path inside an office building, which included different terrains such as long corridors and flights of stairs. In VR, participants followed the same path in a detailed virtual reconstruction of the building. We devised a novel hybrid control strategy for movement in VR: participants did not physically translate; forward movements were controlled by a hand-held device, whereas rotational movements were executed physically and transferred to the VR. We found significant effects of terrain type (flat corridor, staircase up, and staircase down) on gaze direction, on the spatial spread of gaze direction, and on the angular distribution of gaze-direction changes. The factor world (RW vs. VR) affected the angular distribution of gaze-direction changes, saccade frequency, and head-centered vertical gaze direction. The latter effect vanished when referencing gaze to a world-fixed coordinate system and was likely due to specifics of headset placement, which cannot confound any other analyzed measure. Importantly, we did not observe a significant interaction between the factors world and terrain for any of the tested measures, indicating that differences between terrain types are not modulated by the world. The overall dwell time on navigational markers did not differ between worlds. The similar dependence of gaze behavior on terrain in the RW and in VR indicates that our VR captures real-world constraints remarkably well. High-fidelity VR combined with naturalistic movement control therefore has the potential to narrow the gap between the experimental control of a lab and ecologically valid settings.
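Several of the measures above (spatial spread of gaze direction, angular distribution of gaze-direction changes) are circular quantities. As an illustrative sketch, not the authors' code, the mean resultant length R from circular statistics is a standard way to quantify how concentrated a set of such angles is; the angles below are invented.

```python
import math

def circular_stats(angles_deg):
    """Circular mean direction (degrees) and mean resultant length R.
    R near 1 means the angles are tightly concentrated; near 0, widely spread."""
    n = len(angles_deg)
    mx = sum(math.cos(math.radians(a)) for a in angles_deg) / n
    my = sum(math.sin(math.radians(a)) for a in angles_deg) / n
    mean_dir = math.degrees(math.atan2(my, mx)) % 360
    return mean_dir, math.hypot(mx, my)

# Invented gaze-direction-change angles (degrees), clustered near horizontal
mean_dir, r = circular_stats([10, 20, 15, 25, 12])
print(round(mean_dir, 1), round(r, 3))
```

Comparing R (or the full angular histogram) between conditions is one simple way to operationalize "spread" effects like those reported for terrain and world.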
Affiliation(s)
- Jan Drewes
- Institute of Brain and Psychological Sciences, Sichuan Normal University, Chengdu, China
- Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Sascha Feder
- Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Wolfgang Einhäuser
- Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
16
Virtual Reality Immersion Rescales Regulation of Interpersonal Distance in Controls but not in Autism Spectrum Disorder. J Autism Dev Disord 2021; 50:4317-4328. [PMID: 32266686; DOI: 10.1007/s10803-020-04484-6]
Abstract
Interpersonal distance (IPD) is a simple metric of social regulation that is altered in autism. We used a stop-distance paradigm to evaluate IPD regulation in autism spectrum disorder (ASD) and control groups in a real environment versus a virtual environment that mimicked the real one in detail. We found a bimodal pattern of IPDs only in ASD. Both groups showed high IPD correlations between the real and virtual environments, but the significantly larger slope in the control group suggests rescaling, which was absent in ASD. We argue that the loss of nuances of non-verbal communication, such as the perception of subtle body gestures, in the virtual environment leads to changed regulation of IPD in controls, whereas ASD participants show similar deficits in perceiving such subtle cues in both environments.
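The slope comparison above comes from regressing one environment's IPD on the other's. A minimal ordinary-least-squares sketch follows; the data are invented, and under this reading a slope above 1 in controls would indicate rescaled (stretched) distances in VR.

```python
def ols_slope(x, y):
    """Least-squares slope of y on x (e.g., virtual-environment IPD
    regressed on real-environment IPD across trials or participants)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

# Invented IPD values in meters (not the study's data)
real = [0.6, 0.8, 1.0, 1.2]
virtual = [0.7, 1.0, 1.3, 1.6]
print(round(ols_slope(real, virtual), 3))  # 1.5
```

Comparing such slopes between groups (control vs. ASD) is the kind of test that would reveal group-specific rescaling.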
17
Birch I, Birch M, Lall J. The accuracy and validity of the Sheffield Features of Gait Tool. Sci Justice 2020; 61:72-78. [PMID: 33357829; DOI: 10.1016/j.scijus.2020.08.001]
Abstract
Gait is now widely used in the UK as a contributor to identification, and increasing interest is being shown in its use in both Europe and the US. One of the long-standing criticisms of the use of gait as evidence has been the lack of a validated standard methodology. With the publication of the 'Code of practice for forensic gait analysis', and the adoption of the code as part of the 'Codes of Practice and Conduct for forensic science providers and practitioners in the Criminal Justice System' by the Forensic Science Regulator, forensic gait analysts are now required to provide evidence of the testing of the methods used. The Sheffield Features of Gait Tool is specifically designed to assist observational gait analysis in the forensic context and was developed by forensic gait analysis practitioners based on their casework and trial experience. Birch et al. (2019) reported the findings of a study assessing the repeatability and reproducibility of the tool. This paper reports the findings of a study assessing the accuracy with which analysts identified features of gait when using the tool. Fourteen participants, with experience in observational gait analysis, viewed footage of computer-generated avatars walking and completed the features of gait tool on multiple occasions. The results showed a mean accuracy score of 134.92 out of a possible 180 (74.96%), a standard deviation of 9.49 (5.27%), and a coefficient of variation of 7.03%, demonstrating a good degree of consistency between the scores (Cronbach's alpha <0.90; ANOVA p-value <0.05). The findings of this study, coupled with those of the Birch et al. (2019) study, which showed good levels of both repeatability and reproducibility of observations of features of gait made by the participants, suggest that the Sheffield Features of Gait Tool is a valid and fit-for-purpose method of observing and recording features of gait in the forensic context. The use of the tool provides the basis of a standardised methodology for observational gait analysis in the forensic context.
Affiliation(s)
- Ivan Birch
- Sheffield Teaching Hospitals NHS Foundation Trust, Woodhouse Clinic, 3 Skelton Lane, Sheffield, England S13 7LY, United Kingdom
- Maria Birch
- University of Brighton School of Health Sciences, 49 Darley Road, Eastbourne, England BN20 7UR, United Kingdom
- Jalmeen Lall
- University of Brighton School of Health Sciences, 49 Darley Road, Eastbourne, England BN20 7UR, United Kingdom
18
Brandi ML, Kaifel D, Lahnakoski JM, Schilbach L. A naturalistic paradigm simulating gaze-based social interactions for the investigation of social agency. Behav Res Methods 2020; 52:1044-1055. [PMID: 31712998; PMCID: PMC7280341; DOI: 10.3758/s13428-019-01299-x]
Abstract
Sense of agency describes the experience of being the cause of one's own actions and the resulting effects. In a social interaction, one's actions may also have a perceivable effect on the actions of others. In this article, we refer to the experience of being responsible for the behavior of others as social agency, which has important implications for the success or failure of social interactions. Gaze-contingent eyetracking paradigms provide a useful tool for analyzing social agency in an experimentally controlled manner, but current methods lack ecological validity. We applied this technique in a novel task using video stimuli of real gaze behavior to simulate a gaze-based social interaction. This enabled us to create the impression of a live interaction with another person while being able to manipulate the gaze contingency and congruency shown by the simulated interaction partner in a continuous manner. Behavioral data demonstrated that participants believed they were interacting with a real person and that systematic changes in the responsiveness of the simulated partner modulated the experience of social agency. More specifically, gaze contingency (temporal relatedness) and gaze congruency (gaze direction relative to the participant's gaze) influenced the explicit sense of being responsible for the behavior of the other. In general, our study introduces a new naturalistic task to simulate gaze-based social interactions and demonstrates that it is suitable for studying the explicit experience of social agency.
Affiliation(s)
- Marie-Luise Brandi
- Independent Max Planck Research Group for Social Neuroscience, Max Planck Institute of Psychiatry, Munich, Germany
- Daniela Kaifel
- Independent Max Planck Research Group for Social Neuroscience, Max Planck Institute of Psychiatry, Munich, Germany
- Juha M Lahnakoski
- Independent Max Planck Research Group for Social Neuroscience, Max Planck Institute of Psychiatry, Munich, Germany
- Leonhard Schilbach
- Independent Max Planck Research Group for Social Neuroscience, Max Planck Institute of Psychiatry, Munich, Germany
19
Malihi M, Nguyen J, Cardy RE, Eldon S, Petta C, Kushki A. Data-Driven Discovery of Predictors of Virtual Reality Safety and Sense of Presence for Children With Autism Spectrum Disorder: A Pilot Study. Front Psychiatry 2020; 11:669. [PMID: 32903670; PMCID: PMC7438752; DOI: 10.3389/fpsyt.2020.00669]
Abstract
Virtual reality (VR) offers children with autism spectrum disorder (ASD) an inexpensive and motivating medium to learn and practice skills in a personalized, controlled, and safe setting; however, outcomes of VR interventions can vary widely. In particular, there is a need to understand the predictors of VR experience in children with ASD to inform the design of these interventions. To address this gap, a sample of children with ASD (n=35, mean age: 13.0 ± 2.6 years; 10 female) participated in a pilot study involving an immersive VR experience delivered through a head-mounted display. A data-driven approach was used to discover predictors of VR safety and sense of presence among a range of demographic and phenotypic user characteristics. Our results suggest that IQ may be a key predictor of VR sense of presence and that anxiety may modify the association between IQ and sense of presence. In particular, in low-anxiety participants, IQ was linearly related to experienced spatial presence and engagement, whereas, in high-anxiety participants, this association followed a quadratic form. The results of this pilot study, when replicated in larger samples, will inform the design of future studies on VR interventions for children with ASD.
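The contrast between a linear and a quadratic IQ-presence association can be made concrete by fitting both models and comparing residual error. Below is a self-contained least-squares polynomial fit (normal equations solved by Gaussian elimination); the IQ and presence values are invented for illustration and are not the study's data.

```python
def polyfit(xs, ys, degree):
    """Least-squares polynomial fit via normal equations.
    Returns coefficients lowest order first: [a0, a1, ...]."""
    m = degree + 1
    # Normal equations A c = b with A[i][j] = sum(x^(i+j)), b[i] = sum(y * x^i)
    A = [[float(sum(x ** (i + j) for x in xs)) for j in range(m)] for i in range(m)]
    b = [float(sum(y * x ** i for x, y in zip(xs, ys))) for i in range(m)]
    for col in range(m):                      # forward elimination, partial pivoting
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * p for a, p in zip(A[r], A[col])]
            b[r] -= f * b[col]
    coefs = [0.0] * m
    for i in range(m - 1, -1, -1):            # back substitution
        coefs[i] = (b[i] - sum(A[i][j] * coefs[j] for j in range(i + 1, m))) / A[i][i]
    return coefs

def sse(xs, ys, coefs):
    """Sum of squared residuals of a polynomial model."""
    return sum((y - sum(c * x ** k for k, c in enumerate(coefs))) ** 2
               for x, y in zip(xs, ys))

# Invented data with a curvilinear shape: presence peaks at mid-range IQ
iq = [85, 95, 105, 115, 125]
presence = [2.0, 3.4, 4.0, 3.5, 2.1]
lin = polyfit(iq, presence, 1)
quad = polyfit(iq, presence, 2)
print(sse(iq, presence, lin) > sse(iq, presence, quad))  # True: quadratic fits better
```

In practice, model comparison would use penalized criteria (e.g. AIC) rather than raw residual error, since a higher-order polynomial always fits at least as well.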
Affiliation(s)
- Mahan Malihi
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
- Autism Research Centre, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Jenny Nguyen
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
- Autism Research Centre, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Robyn E Cardy
- Autism Research Centre, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Salina Eldon
- Autism Research Centre, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Cathy Petta
- Autism Research Centre, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
- Azadeh Kushki
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
- Autism Research Centre, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, ON, Canada
20
Jording M, Engemann D, Eckert H, Bente G, Vogeley K. Distinguishing Social From Private Intentions Through the Passive Observation of Gaze Cues. Front Hum Neurosci 2019; 13:442. [PMID: 31920600; PMCID: PMC6928136; DOI: 10.3389/fnhum.2019.00442]
Abstract
Observing others’ gaze is most informative during social encounters between humans: We can learn about potentially salient objects in the shared environment, infer others’ mental states and detect their communicative intentions. We almost automatically follow the gaze of others in order to check the relevance of the target of the other’s attention. This phenomenon called gaze cueing can be conceptualized as a triadic interaction involving a gaze initiator, a gaze follower and a gaze target, i.e., an object or person of interest in the environment. Gaze cueing can occur as “gaze pointing” with a communicative or “social” intention by the initiator, telling the observer that she/he is meant to follow, or as an incidental event, in which the observer follows spontaneously without any intention of the observed person. Here, we investigate which gaze cues let an observer ascribe a social intention to the observed person’s gaze and whether and to which degree previous eye contact in combination with an object fixation contributes to this ascription. We varied the orientation of the starting position of gaze toward the observer and the orientation of the end position of a lateral gaze shift. In two experiments participants had to infer from the gaze behavior either mere approach (“the person looked at me”) vs. a social (“the person wanted to show me something”) or a social vs. a private motivation (“the person was interested in something”). Participants differentially attributed either approach behavior, a social, or a private intention to the agent solely based on the passive observation of the two specific gaze cues of start and end position. While for the attribution of privately motivated behavior, participants relied solely on the end position of the gaze shift, the social interpretation of the observed behavior depended additionally upon initial eye contact. Implications of these results for future social gaze and social cognition research in general are discussed.
Affiliation(s)
- Mathis Jording
- Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Research Center Jülich, Jülich, Germany
- Department of Psychiatry and Psychotherapy, University Hospital Cologne, Cologne, Germany
- Denis Engemann
- Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Research Center Jülich, Jülich, Germany
- Université Paris-Saclay, Inria, CEA, Palaiseau, France
- Hannah Eckert
- Department of Psychiatry and Psychotherapy, University Hospital Cologne, Cologne, Germany
- Gary Bente
- Department of Communication, Michigan State University, East Lansing, MI, United States
- Kai Vogeley
- Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Research Center Jülich, Jülich, Germany
- Department of Psychiatry and Psychotherapy, University Hospital Cologne, Cologne, Germany
21
Georgescu AL, Koehler JC, Weiske J, Vogeley K, Koutsouleris N, Falter-Wagner C. Machine Learning to Study Social Interaction Difficulties in ASD. Front Robot AI 2019; 6:132. [PMID: 33501147; PMCID: PMC7805744; DOI: 10.3389/frobt.2019.00132]
Abstract
Autism Spectrum Disorder (ASD) is a spectrum of neurodevelopmental conditions characterized by difficulties in social communication and social interaction as well as repetitive behaviors and restricted interests. Prevalence rates have been rising, and existing diagnostic methods are extremely time- and labor-consuming. There is an urgent need for more economical and objective automatized diagnostic tools that are independent of the language and experience of the diagnostician and that can help deal with the complexity of the autistic phenotype. Technological advancements in machine learning are offering a potential solution, and several studies have employed computational approaches to classify ASD based on phenomenological, behavioral, or neuroimaging data. Despite being at the core of ASD diagnosis and having the potential to serve as a behavioral marker for machine learning algorithms, movement parameters have only recently been used as features in machine learning classification approaches. In a proof-of-principle analysis of data from a social interaction study, we trained a classification algorithm on intrapersonal synchrony, an automatically and objectively measured phenotypic feature, from 29 autistic and 29 typically developed individuals to differentiate individuals with ASD from those without. Parameters included nonverbal motion energy values from 116 videos of social interactions. As opposed to previous studies to date, our classification approach was applied to non-verbal behavior objectively captured during naturalistic and complex interactions with a real human interaction partner, assuring high external validity. A machine learning approach lends itself particularly well to capturing heterogeneous and complex behavior in real social interactions and will be essential in developing automatized and objective classification methods for ASD.
Affiliation(s)
- Alexandra Livia Georgescu
- Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, United Kingdom
- Department of Psychiatry and Psychotherapy, University Hospital of Cologne, Cologne, Germany
- Jana Christina Koehler
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Munich, Munich, Germany
- Johanna Weiske
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Munich, Munich, Germany
- Kai Vogeley
- Department of Psychiatry and Psychotherapy, University Hospital of Cologne, Cologne, Germany
- Institute of Neuroscience and Medicine, Cognitive Neuroscience (INM-3), Research Center Juelich, Jülich, Germany
- Nikolaos Koutsouleris
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Munich, Munich, Germany
- Christine Falter-Wagner
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Munich, Munich, Germany
- Institute of Medical Psychology, Medical Faculty, LMU Munich, Munich, Germany
22
Jording M, Hartz A, Bente G, Schulte-Rüther M, Vogeley K. Inferring Interactivity From Gaze Patterns During Triadic Person-Object-Agent Interactions. Front Psychol 2019; 10:1913. [PMID: 31496976; PMCID: PMC6712091; DOI: 10.3389/fpsyg.2019.01913]
Abstract
Observing others' gaze informs us about relevant matters in the environment. Humans' sensitivity to gaze cues and our ability to use this information to focus our own attention are crucial to learning, social coordination, and survival. Gaze can also be a deliberate social signal that captures and directs the gaze of others toward an object of interest. In the current study, we investigated whether the intention to actively communicate one's own attentional focus can be inferred from the dynamics of gaze alone. We used a triadic gaze interaction paradigm based on the recently proposed classification of attentional states and respective gaze patterns in person-object-person interactions, the so-called "social gaze space" (SGS). Twenty-eight participants interacted with a computer-controlled virtual agent while believing they were interacting with a real human. During the experiment, the virtual agent engaged in various gaze patterns determined by the agent's attentional-communicative state, as described by the concept of the SGS. After each interaction, participants were asked to judge whether the other person was trying to deliberately interact with them. Results show that participants were able to infer the communicative intention solely from the agent's gaze behavior. The results substantiate claims about the pivotal role of gaze in social coordination and relationship formation. Our results further reveal that social expectations are reflected in differential responses to the displayed gaze patterns and may be crucial for impression formation during gaze-based interaction. To the best of our knowledge, this is the first study to document the experience of interactivity in continuous and contingent triadic gaze interactions.
Affiliation(s)
- Mathis Jording
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Research Center Jülich, Jülich, Germany
- Department of Psychiatry, University Hospital Cologne, Cologne, Germany
- Arne Hartz
- JARA-BRAIN, Aachen, Germany
- Translational Brain Research in Psychiatry and Neurology, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, RWTH Aachen University, Aachen, Germany
- Gary Bente
- Department of Communication, Michigan State University, East Lansing, MI, United States
- Martin Schulte-Rüther
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Research Center Jülich, Jülich, Germany
- JARA-BRAIN, Aachen, Germany
- Translational Brain Research in Psychiatry and Neurology, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, RWTH Aachen University, Aachen, Germany
- Kai Vogeley
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Research Center Jülich, Jülich, Germany
- Department of Psychiatry, University Hospital Cologne, Cologne, Germany
23
Bloch C, Vogeley K, Georgescu AL, Falter-Wagner CM. INTRApersonal Synchrony as Constituent of INTERpersonal Synchrony and Its Relevance for Autism Spectrum Disorder. Front Robot AI 2019; 6:73. [PMID: 33501088 PMCID: PMC7805712 DOI: 10.3389/frobt.2019.00073] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.8] [Received: 03/22/2019] [Accepted: 07/30/2019] [Indexed: 11/13/2022]
Abstract
INTERpersonal synchrony leads to increased empathy, rapport, and understanding, enabling successful human-human interactions and reciprocal bonding. Research shows that individuals with Autism Spectrum Disorder (ASD) have difficulty synchronizing INTERpersonally, but the underlying causes are as yet unknown. In order to synchronize successfully with others, INTRApersonal synchronization of communicative signals appears to be a necessary prerequisite. We understand INTRApersonal synchrony as an implicit factor of INTERpersonal synchrony and therefore hypothesize that atypicalities of INTRApersonal synchrony may add to INTERpersonal synchrony problems for individuals with ASD and their interaction partners. In this perspective article, we first review evidence for INTERpersonal dissynchrony in ASD with respect to different approaches and assessment methods. Second, we draft a theoretical conceptualization of INTRApersonal dissynchrony in ASD based on a temporal model of human interaction, outlining literature that indicates INTRApersonal dissynchrony in ASD: findings of atypical timing functions, and findings from clinical and behavioral studies that indicate peculiar motion patterns and communicative signal production in ASD. Third, we argue that findings from these domains call for the assessment and investigation of temporal parameters of social behavior in individuals with ASD, and we propose specific goals for empirical approaches to INTRApersonal dissynchrony. Finally, we present implications of research on INTRApersonal timing in ASD for diagnostic and therapeutic purposes, which in our opinion warrant increased research efforts in this domain.
Affiliation(s)
- Carola Bloch
- Department of Psychiatry and Psychotherapy, Medical Faculty, University of Cologne, Cologne, Germany
- Department of Psychiatry and Psychotherapy, Medical Faculty, Ludwig-Maximilians-University, Munich, Germany
- Kai Vogeley
- Department of Psychiatry and Psychotherapy, Medical Faculty, University of Cologne, Cologne, Germany
- Institute of Neuroscience and Medicine (INM3), Research Center Jülich, Jülich, Germany
- Alexandra L. Georgescu
- Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, United Kingdom
- Christine M. Falter-Wagner
- Department of Psychiatry and Psychotherapy, Medical Faculty, Ludwig-Maximilians-University, Munich, Germany
- Department of Psychology, Faculty of Human Science, University of Cologne, Cologne, Germany
24
The mind minds minds: The effect of intentional stance on the neural encoding of joint attention. Cogn Affect Behav Neurosci 2019; 19:1479-1491. [DOI: 10.3758/s13415-019-00734-y] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Indexed: 11/08/2022]
25
Feng X, Hao X, Xin R, Gao X, Liu M, Li F, Wang Y, Shi R, Zhao S, Zhou F. Detecting Methylomic Biomarkers of Pediatric Autism in the Peripheral Blood Leukocytes. Interdiscip Sci 2019; 11:237-246. [DOI: 10.1007/s12539-019-00328-9] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Received: 11/11/2018] [Revised: 03/25/2019] [Accepted: 03/28/2019] [Indexed: 12/12/2022]
26
Amaral C, Mouga S, Simões M, Pereira HC, Bernardino I, Quental H, Playle R, McNamara R, Oliveira G, Castelo-Branco M. A Feasibility Clinical Trial to Improve Social Attention in Autistic Spectrum Disorder (ASD) Using a Brain Computer Interface. Front Neurosci 2018; 12:477. [PMID: 30061811 PMCID: PMC6055058 DOI: 10.3389/fnins.2018.00477] [Citation(s) in RCA: 36] [Impact Index Per Article: 6.0] [Received: 01/31/2018] [Accepted: 06/25/2018] [Indexed: 12/27/2022]
Abstract
Deficits in the interpretation of others' intentions from gaze direction or other social attention cues are well recognized in ASD. Here we investigated whether an EEG brain-computer interface (BCI) can be used to train social cognition skills in ASD patients. We performed a single-arm feasibility clinical trial and enrolled 15 participants (mean age 22y 2m) with high-functioning ASD (mean full-scale IQ 103). Participants underwent a BCI training paradigm using a virtual reality interface over seven sessions spread over 4 months. The first four sessions occurred weekly, and the remainder monthly. In each session, the subject was asked to identify objects of interest based on the gaze direction of an avatar. Attentional responses were extracted from the EEG P300 component. A final follow-up assessment was performed 6 months after the last session. To analyze responses to joint attention cues, participants were assessed pre- and post-intervention and at follow-up using an ecological "joint-attention task." We used eye tracking to identify the number of social attention items that a patient could accurately identify from an avatar's action cues (e.g., looking, pointing). As secondary outcome measures we used the Autism Treatment Evaluation Checklist (ATEC) and the Vineland Adaptive Behavior Scale (VABS). Neuropsychological measures related to mood and depression were also assessed. In sum, we observed a decrease in total ATEC and rated autism symptoms (Sociability; Sensory/Cognitive Awareness; Health/Physical/Behavior); an evident improvement in the Adaptive Behavior Composite and in the DLS subarea of the VABS; and a decrease in Depression (from POMS) and in mood disturbance/depression (BDI). BCI online performance and tolerance were stable across the intervention. Average P300 amplitude and alpha power were also preserved across sessions. We have demonstrated the feasibility of BCI for this kind of intervention in ASD: participants engaged successfully and consistently in the task. Although the primary outcome (rate of automatic responses to joint attention cues) did not show changes, most secondary neuropsychological outcome measures showed improvement, yielding promise for a future efficacy trial. (Clinical-trial ID: NCT02445625, clinicaltrials.gov)
Affiliation(s)
- Carlos Amaral
- CNC.IBILI-Institute for Biomedical Imaging and Life Sciences, Faculty of Medicine, University of Coimbra, Coimbra, Portugal
- Susana Mouga
- CNC.IBILI-Institute for Biomedical Imaging and Life Sciences, Faculty of Medicine, University of Coimbra, Coimbra, Portugal
- Unidade de Neurodesenvolvimento e Autismo do Serviço do Centro de Desenvolvimento da Criança, Pediatric Hospital, Centro Hospitalar e Universitário de Coimbra, Coimbra, Portugal
- Marco Simões
- CNC.IBILI-Institute for Biomedical Imaging and Life Sciences, Faculty of Medicine, University of Coimbra, Coimbra, Portugal
- Center for Informatics and Systems, University of Coimbra, Coimbra, Portugal
- Helena C Pereira
- CNC.IBILI-Institute for Biomedical Imaging and Life Sciences, Faculty of Medicine, University of Coimbra, Coimbra, Portugal
- Inês Bernardino
- CNC.IBILI-Institute for Biomedical Imaging and Life Sciences, Faculty of Medicine, University of Coimbra, Coimbra, Portugal
- Hugo Quental
- CNC.IBILI-Institute for Biomedical Imaging and Life Sciences, Faculty of Medicine, University of Coimbra, Coimbra, Portugal
- Rebecca Playle
- Centre for Trials Research, Cardiff University, Cardiff, Wales
- Rachel McNamara
- Centre for Trials Research, Cardiff University, Cardiff, Wales
- Guiomar Oliveira
- Unidade de Neurodesenvolvimento e Autismo do Serviço do Centro de Desenvolvimento da Criança, Pediatric Hospital, Centro Hospitalar e Universitário de Coimbra, Coimbra, Portugal
- University Clinic of Pediatrics, Faculty of Medicine, University of Coimbra, Coimbra, Portugal
- Centro de Investigação e Formação Clínica, Hospital Pediátrico, Centro Hospitalar e Universitário de Coimbra, Coimbra, Portugal
- Faculty of Medicine, University of Coimbra, Coimbra, Portugal
- Miguel Castelo-Branco
- Faculty of Medicine, University of Coimbra, Coimbra, Portugal
- CIBIT, Coimbra Institute for Biomedical Imaging and Translational Research, ICNAS - Institute of Nuclear Sciences Applied to Health, University of Coimbra, Coimbra, Portugal
- ICNAS-Produção Unipessoal, Coimbra, Portugal
27
Bernard F, Lemée JM, Aubin G, Ter Minassian A, Menei P. Using a Virtual Reality Social Network During Awake Craniotomy to Map Social Cognition: Prospective Trial. J Med Internet Res 2018; 20:e10332. [PMID: 29945859 PMCID: PMC6039768 DOI: 10.2196/10332] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Received: 03/07/2018] [Revised: 05/14/2018] [Accepted: 05/19/2018] [Indexed: 11/17/2022]
Abstract
Background In awake craniotomy, it is possible to temporarily inactivate regions of the brain using direct electrical stimulation while the patient performs neuropsychological tasks. If the patient shows decreased performance in a given task, the neurosurgeon will not remove these regions, so as to maintain all brain functions. Objective The objective of our study was to describe our experience of using a virtual reality (VR) social network during awake craniotomy and to discuss its future applications for perioperative mapping of nonverbal language, empathy, and theory of mind. Methods This was a single-center, prospective, unblinded trial. During wound closure, different VR experiences were proposed to the patient through a VR headset. This project sought to explore interactions with the neuropsychologist's avatar in virtual locations, using a VR social network as one of the available experiences. Results Three patients experienced VR. Despite some limitations due to patient positioning during the operation and the limited nonverbal cues inherent to the app, the neuropsychologist, as an avatar, could communicate with the patient and explore gesture communication while wearing a VR headset. Conclusions With some improvements, VR social networks could be used in the near future to map social cognition during awake craniotomy. Trial Registration ClinicalTrials.gov NCT03010943; https://clinicaltrials.gov/ct2/show/NCT03010943 (Archived at WebCite at http://www.webcitation.org/70CYDil0P)
Affiliation(s)
- Florian Bernard
- Neurosurgery, CHU Angers, Angers, France
- Laboratoire d'Anatomie, Faculté de Médecine d'Angers, Angers, France
- Jean-Michel Lemée
- Neurosurgery, CHU Angers, Angers, France
- CRCINA, INSERM, Université de Nantes, Université d'Angers, Angers, France
- Aram Ter Minassian
- CHU Angers, Département d'Anesthésie-Réanimation, Angers, France
- CHU Angers, LARIS EA 7315, Image Signal et Sciences du Vivant, Angers, France
28
Geiger A, Cleeremans A, Bente G, Vogeley K. Social Cues Alter Implicit Motor Learning in a Serial Reaction Time Task. Front Hum Neurosci 2018; 12:197. [PMID: 29867420 PMCID: PMC5960666 DOI: 10.3389/fnhum.2018.00197] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Received: 12/20/2017] [Accepted: 04/25/2018] [Indexed: 11/25/2022]
Abstract
Learning is a central ability for human development. Many skills we learn, such as language, are learned through observation or imitation in social contexts. Likewise, many skills are learned implicitly, that is, without an explicit intent to learn and without full awareness of the acquired knowledge. Here, we asked whether performance in a motor learning task is modulated by social vs. object cues of varying validity. To address this question, we asked participants to carry out a serial reaction time (SRT) task in which, on each trial, people have to respond as fast and as accurately as possible to the appearance of a stimulus at one of four possible locations. Unbeknownst to participants, the sequence of successive locations was sequentially structured, so that knowledge of the sequence facilitates anticipation of the next stimulus and hence faster motor responses. Crucially, each trial also contained a cue pointing to the next stimulus location. Participants could thus learn based on the cue, or on learning about the sequence of successive locations, or on a combination of both. Results show an interaction between cue type and cue validity for the motor responses: social cues (vs. object cues) led to faster responses in the low validity (LV) condition only. Concerning the extent to which learning was implicit, results show that in the cued blocks only, the highly valid social cue led to implicit learning. In the uncued blocks, participants showed no implicit learning in the highly valid social cue condition, but did in all other combinations of stimulus type and cueing validity. In conclusion, our results suggest that implicit learning is context-dependent and can be influenced by the cue type, e.g., social and object cues.
Affiliation(s)
- Alexander Geiger
- Institute of Neuroscience and Medicine, Cognitive Neuroscience (INM-3), Research Centre Juelich, Juelich, Germany
- Brain Imaging Lab, Department of Psychiatry, University Hospital Cologne, Cologne, Germany
- Axel Cleeremans
- Consciousness, Cognition & Computation Group, Center for Research in Cognition & Neurosciences, ULB Neuroscience Institute, Université Libre de Bruxelles, Bruxelles, Belgium
- Gary Bente
- Department of Psychology, Faculty of Human Sciences, University of Cologne, Cologne, Germany
- Department of Communication, Michigan State University, East Lansing, MI, United States
- Kai Vogeley
- Institute of Neuroscience and Medicine, Cognitive Neuroscience (INM-3), Research Centre Juelich, Juelich, Germany
- Brain Imaging Lab, Department of Psychiatry, University Hospital Cologne, Cologne, Germany
29
Kocsis BJ, Yellowlees P. Telepsychotherapy and the Therapeutic Relationship: Principles, Advantages, and Case Examples. Telemed J E Health 2018; 24:329-334. [DOI: 10.1089/tmj.2017.0088] [Citation(s) in RCA: 40] [Impact Index Per Article: 6.7] [Indexed: 11/13/2022]
Affiliation(s)
- Barbara J. Kocsis
- Department of Psychiatry and Behavioral Sciences, University of California Davis, Sacramento, California
- Peter Yellowlees
- Department of Psychiatry and Behavioral Sciences, University of California Davis, Sacramento, California
30
Vogeley K. Two social brains: neural mechanisms of intersubjectivity. Philos Trans R Soc Lond B Biol Sci 2018; 372:rstb.2016.0245. [PMID: 28673921 DOI: 10.1098/rstb.2016.0245] [Citation(s) in RCA: 44] [Impact Index Per Article: 7.3] [Accepted: 02/13/2017] [Indexed: 12/11/2022]
Abstract
It is the aim of this article to present an empirically justified hypothesis about the functional roles of the two social neural systems, namely the so-called 'mirror neuron system' (MNS) and the 'mentalizing system' (MENT, also 'theory of mind network' or 'social neural network'). Both systems are recruited during cognitive processes related to either interaction or communication with other conspecifics, thereby constituting intersubjectivity. The hypothesis is developed in the following steps: first, the fundamental distinction that we make between persons and things is introduced; second, communication is presented as the key process that allows us to interact with others; third, the capacity to 'mentalize' or to understand the inner experience of others is emphasized as the fundamental cognitive capacity required to establish successful communication. On this background, it is proposed that the MNS serves comparably early stages of social information processing related to the 'detection' of spatial or bodily signals, whereas MENT is recruited during comparably late stages of social information processing related to the 'evaluation' of emotional and psychological states of others. This hypothesis of the MNS as a social detection system and MENT as a social evaluation system is illustrated by findings in the field of psychopathology. Finally, new research questions that can be derived from this hypothesis are discussed. This article is part of the themed issue 'Physiological determinants of social behaviour in animals'.
Affiliation(s)
- Kai Vogeley
- Department of Psychiatry, University Hospital Cologne, Kerpener Street 62, 50924 Cologne, Germany
- Institute for Neuroscience and Medicine-Cognitive Neuroscience (INM3), Research Center Juelich, Wilhelm-Johnen Strasse, 52428 Juelich, Germany
31
Brief Report: A Pilot Study of the Use of a Virtual Reality Headset in Autism Populations. J Autism Dev Disord 2017; 46:3166-76. [PMID: 27272115 DOI: 10.1007/s10803-016-2830-5] [Citation(s) in RCA: 38] [Impact Index Per Article: 5.4] [Indexed: 10/21/2022]
Abstract
The application of virtual reality technologies (VRTs) for users with autism spectrum disorder (ASD) has been studied for decades. However, a gap remains in our understanding surrounding VRT head-mounted displays (HMDs). As newly designed HMDs have become commercially available (in this study the Oculus Rift™) the need to investigate newer devices is immediate. This study explored willingness, acceptance, sense of presence and immersion of ASD participants. Results revealed that all 29 participants (mean age = 32; 33 % with IQ < 70) were willing to wear the HMD. The majority of the participants reported an enjoyable experience, high levels of 'presence', and were likely to use HMDs again. IQ was found to be independent of the willingness to use HMDs and related VRT immersion experience.
32
Forbes PAG, Pan X, de C Hamilton AF. Reduced Mimicry to Virtual Reality Avatars in Autism Spectrum Disorder. J Autism Dev Disord 2017; 46:3788-3797. [PMID: 27696183 PMCID: PMC5110595 DOI: 10.1007/s10803-016-2930-2] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.7] [Indexed: 11/26/2022]
Abstract
Mimicry involves unconsciously copying the actions of others. Increasing evidence suggests that autistic people can copy the goal of an observed action but show differences in their mimicry. We investigated mimicry in autism spectrum disorder (ASD) within a two-dimensional virtual reality environment. Participants played an imitation game with a socially engaged avatar and a socially disengaged avatar. Despite being told only to copy the goal of the observed action, autistic participants and matched neurotypical participants mimicked the kinematics of the avatars' movements. However, autistic participants mimicked less. Social engagement did not modulate mimicry in either group. The results demonstrate the feasibility of using virtual reality to induce mimicry and suggest that mimicry differences in ASD may also occur when interacting with avatars.
Affiliation(s)
- Paul A G Forbes
- Institute of Cognitive Neuroscience, University College London, London, UK.
- Xueni Pan
- Department of Computing, Goldsmiths, University of London, London, UK
33
Caruana N, Spirou D, Brock J. Human agency beliefs influence behaviour during virtual social interactions. PeerJ 2017; 5:e3819. [PMID: 28948104 PMCID: PMC5610555 DOI: 10.7717/peerj.3819] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Received: 06/10/2017] [Accepted: 08/28/2017] [Indexed: 11/30/2022]
Abstract
In recent years, with the emergence of relatively inexpensive and accessible virtual reality technologies, it is now possible to deliver compelling and realistic simulations of human-to-human interaction. Neuroimaging studies have shown that, when participants believe they are interacting via a virtual interface with another human agent, they show different patterns of brain activity compared to when they know that their virtual partner is computer-controlled. The suggestion is that users adopt an “intentional stance” by attributing mental states to their virtual partner. However, it remains unclear how beliefs in the agency of a virtual partner influence participants’ behaviour and subjective experience of the interaction. We investigated this issue in the context of a cooperative “joint attention” game in which participants interacted via an eye tracker with a virtual onscreen partner, directing each other’s eye gaze to different screen locations. Half of the participants were correctly informed that their partner was controlled by a computer algorithm (“Computer” condition). The other half were misled into believing that the virtual character was controlled by a second participant in another room (“Human” condition). Those in the “Human” condition were slower to make eye contact with their partner and more likely to try and guide their partner before they had established mutual eye contact than participants in the “Computer” condition. They also responded more rapidly when their partner was guiding them, although the same effect was also found for a control condition in which they responded to an arrow cue. Results confirm the influence of human agency beliefs on behaviour in this virtual social interaction context. They further suggest that researchers and developers attempting to simulate social interactions should consider the impact of agency beliefs on user experience in other social contexts, and their effect on the achievement of the application’s goals.
Affiliation(s)
- Nathan Caruana
- Department of Cognitive Science, Macquarie University, Sydney, New South Wales, Australia
- Perception in Action Research Centre, Sydney, New South Wales, Australia
- Centre for Atypical Neurodevelopment, Sydney, New South Wales, Australia
- ARC Centre of Excellence in Cognition and its Disorders, Sydney, New South Wales, Australia
- Dean Spirou
- Department of Psychology, Macquarie University, Sydney, New South Wales, Australia
- Jon Brock
- Centre for Atypical Neurodevelopment, Sydney, New South Wales, Australia
- ARC Centre of Excellence in Cognition and its Disorders, Sydney, New South Wales, Australia
- Department of Psychology, Macquarie University, Sydney, New South Wales, Australia
34
Virtual Reality for Research in Social Neuroscience. Brain Sci 2017; 7:brainsci7040042. [PMID: 28420150 PMCID: PMC5406699 DOI: 10.3390/brainsci7040042] [Citation(s) in RCA: 74] [Impact Index Per Article: 10.6] [Received: 02/14/2017] [Revised: 03/28/2017] [Accepted: 04/12/2017] [Indexed: 12/16/2022]
Abstract
The emergence of social neuroscience has significantly advanced our understanding of the relationship that exists between social processes and their neurobiological underpinnings. Social neuroscience research often involves the use of simple and static stimuli lacking many of the potentially important aspects of real world activities and social interactions. Whilst this research has merit, there is a growing interest in the presentation of dynamic stimuli in a manner that allows researchers to assess the integrative processes carried out by perceivers over time. Herein, we discuss the potential of virtual reality for enhancing ecological validity while maintaining experimental control in social neuroscience research. Virtual reality is a technology that allows for the creation of fully interactive, three-dimensional computerized models of social situations that can be fully controlled by the experimenter. Furthermore, the introduction of interactive virtual characters—either driven by a human or by a computer—allows the researcher to test, in a systematic and independent manner, the effects of various social cues. We first introduce key technical features and concepts related to virtual reality. Next, we discuss the potential of this technology for enhancing social neuroscience protocols, drawing on illustrative experiments from the literature.
35
Sampogna G, Pugliese R, Elli M, Vanzulli A, Forgione A. Routine clinical application of virtual reality in abdominal surgery. Minim Invasiv Ther 2017; 26:135-143. [PMID: 28084141 DOI: 10.1080/13645706.2016.1275016] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.9] [Indexed: 10/20/2022]
Abstract
BACKGROUND The advantages of 3D reconstruction, immersive virtual reality (VR) and 3D printing in abdominal surgery have been enunciated for many years, but even today their application in routine clinical practice is almost nil. We investigated their feasibility, user appreciation and clinical impact. MATERIAL AND METHODS Fifteen patients undergoing pancreatic, hepatic or renal surgery were studied by realizing a 3D reconstruction of the target anatomy. An immersive VR environment was then developed to import the 3D models, and some details of the 3D scene were printed. All phases of our workflow employed open-source software and low-cost hardware, easily implementable by other surgical services. A qualitative evaluation of the three approaches was performed by 20 surgeons, who filled in a specific questionnaire regarding a clinical case for each organ considered. RESULTS Preoperative surgical planning and intraoperative guidance were feasible for all patients included in the study. The vast majority of surgeons interviewed scored their quality and usefulness as very good. CONCLUSIONS Despite the extra time, costs and effort necessary to implement these systems, the benefits shown by the analysis of the questionnaires argue for investing more resources in training physicians to adopt these technologies routinely, even though further and larger studies are still mandatory.
Affiliation(s)
- Gianluca Sampogna
- Department of Biomedical and Clinical Sciences 'L. Sacco', Università degli Studi di Milano, Milan, Italy
- Advanced International Mini-invasive Surgery (AIMS) Academy, Ospedale Niguarda Ca' Granda, Milan, Italy
- Raffaele Pugliese
- Advanced International Mini-invasive Surgery (AIMS) Academy, Ospedale Niguarda Ca' Granda, Milan, Italy
- Department of General Surgical Oncology and Minimally Invasive Surgery, Ospedale Niguarda Ca' Granda, Milan, Italy
- Marco Elli
- Department of Biomedical and Clinical Sciences 'L. Sacco', Università degli Studi di Milano, Milan, Italy
- Angelo Vanzulli
- Department of Biomedical and Clinical Sciences 'L. Sacco', Università degli Studi di Milano, Milan, Italy
- Department of Radiology, Ospedale Niguarda Ca' Granda, Milan, Italy
- Antonello Forgione
- Advanced International Mini-invasive Surgery (AIMS) Academy, Ospedale Niguarda Ca' Granda, Milan, Italy
- Department of General Surgical Oncology and Minimally Invasive Surgery, Ospedale Niguarda Ca' Granda, Milan, Italy
36
van Bennekom MJ, de Koning PP, Denys D. Virtual Reality Objectifies the Diagnosis of Psychiatric Disorders: A Literature Review. Front Psychiatry 2017; 8:163. [PMID: 28928677 PMCID: PMC5591833 DOI: 10.3389/fpsyt.2017.00163] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.9] [Received: 05/18/2017] [Accepted: 08/21/2017] [Indexed: 11/30/2022]
Abstract
BACKGROUND To date, a diagnosis in psychiatry is largely based on a clinical interview and questionnaires. The retrospective and subjective nature of these methods leads to recall and interviewer biases. There is therefore a clear need for more objective and standardized assessment methods to support the diagnostic process. The introduction of virtual reality (VR) creates the possibility to simultaneously provoke and measure psychiatric symptoms; VR could thus contribute to objectivity and reliability in the assessment of psychiatric disorders. OBJECTIVE In this literature review, we evaluate the assessment of psychiatric disorders by means of VR environments. First, we investigate whether these VR environments are capable of simultaneously provoking and measuring psychiatric symptoms. Next, we compare these measures with traditional diagnostic measures. METHODS We performed a systematic search using PubMed, Embase, and PsycINFO; references of selected articles were checked for eligibility. We identified studies from 1990 to 2016 on VR used in the assessment of psychiatric disorders. Studies were excluded if VR was used for therapeutic purposes, if a different technique was used, or in case of limitation to a non-clinical sample. RESULTS A total of 39 studies were included for further analysis. The disorders most frequently studied included schizophrenia (n = 15), developmental disorders (n = 12), eating disorders (n = 3), and anxiety disorders (n = 6). In attention-deficit hyperactivity disorder, the most comprehensive measurement was used, covering several key symptoms of the disorder. Most of the studies, however, concerned the use of VR to assess a single aspect of a psychiatric disorder. DISCUSSION In general, nearly all VR environments studied were able to simultaneously provoke and measure psychiatric symptoms. Furthermore, in 14 studies, significant correlations were found between VR measures and traditional diagnostic measures. Relatively small clinical sample sizes were used, impeding definite conclusions. Based on this review, the innovative technique of VR shows potential to contribute to objectivity and reliability in the psychiatric diagnostic process.
Affiliation(s)
- Martine J van Bennekom
- Academic Medical Center, Department of Psychiatry, University of Amsterdam, Amsterdam, Netherlands
- Pelle P de Koning
- Academic Medical Center, Department of Psychiatry, University of Amsterdam, Amsterdam, Netherlands
- Damiaan Denys
- Academic Medical Center, Department of Psychiatry, University of Amsterdam, Amsterdam, Netherlands
- The Netherlands Institute for Neuroscience, Royal Netherlands Academy of Arts and Sciences, Amsterdam, Netherlands
37
Gillespie A, Corti K. The Body That Speaks: Recombining Bodies and Speech Sources in Unscripted Face-to-Face Communication. Front Psychol 2016; 7:1300. [PMID: 27660616 PMCID: PMC5015481 DOI: 10.3389/fpsyg.2016.01300] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Received: 04/14/2016] [Accepted: 08/15/2016] [Indexed: 11/13/2022]
Abstract
This article examines advances in research methods that enable experimental substitution of the speaking body in unscripted face-to-face communication. A taxonomy of six hybrid social agents is presented by combining three types of bodies (mechanical, virtual, and human) with either an artificial or a human speech source. Our contribution is to introduce and explore the significance of two particular hybrids: (1) the cyranoid method, which enables humans to converse face-to-face through the medium of another person's body, and (2) the echoborg method, which enables artificial intelligence to converse face-to-face through the medium of a human body. These two methods are distinct in being able to parse the unique influence of the human body when combined with various speech sources. We also introduce a new framework for conceptualizing the body's role in communication, distinguishing three levels: self's perspective on the body, other's perspective on the body, and self's perspective on other's perspective on the body. Within each level, the cyranoid and echoborg methodologies make important research questions tractable. By conceptualizing and synthesizing these methods, we outline a novel paradigm of research on the role of the body in unscripted face-to-face communication.
Affiliation(s)
- Alex Gillespie
- Department of Social Psychology, London School of Economics and Political Science, London, UK
- Kevin Corti
- Department of Social Psychology, London School of Economics and Political Science, London, UK
38
Caruana N, de Lissa P, McArthur G. Beliefs about human agency influence the neural processing of gaze during joint attention. Soc Neurosci 2016; 12:194-206. [PMID: 26942996 DOI: 10.1080/17470919.2016.1160953] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
Abstract
The current study measured adults' P350 and N170 ERPs while they interacted with a character in a virtual reality paradigm. Some participants believed the character was controlled by a human ("avatar" condition, n = 19); others believed it was controlled by a computer program ("agent" condition, n = 19). In each trial, participants initiated joint attention in order to direct the character's gaze toward a target. In 50% of trials, the character gazed toward the target (a congruent response), and in 50% of trials the character gazed at a different location (an incongruent response). In the avatar condition, the character's incongruent gaze responses generated significantly larger P350 peaks at centro-parietal sites than congruent gaze responses. In the agent condition, the P350 effect was strikingly absent. Left occipitotemporal N170 responses were significantly smaller in the agent condition than in the avatar condition for both congruent and incongruent gaze shifts. These data suggest that beliefs about human agency may recruit mechanisms that discriminate the social outcome of a gaze shift after approximately 350 ms, and that these mechanisms may modulate the early perceptual processing of gaze. These findings also suggest that the ecologically valid measurement of social cognition may depend upon paradigms that simulate genuine social interactions.
Affiliation(s)
- Nathan Caruana
- Department of Cognitive Science, Macquarie University, Sydney, Australia; ARC Centre of Excellence in Cognition and its Disorders, Sydney, Australia; Perception in Action Research Centre, Sydney, Australia
- Peter de Lissa
- Department of Cognitive Science, Macquarie University, Sydney, Australia; ARC Centre of Excellence in Cognition and its Disorders, Sydney, Australia; Department of Psychology, Macquarie University, Sydney, Australia
- Genevieve McArthur
- Department of Cognitive Science, Macquarie University, Sydney, Australia; ARC Centre of Excellence in Cognition and its Disorders, Sydney, Australia
39
Stendal K, Balandin S. Virtual worlds for people with autism spectrum disorder: a case study in Second Life. Disabil Rehabil 2015; 37:1591-8. [PMID: 26023707 DOI: 10.3109/09638288.2015.1052577] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
PURPOSE The purpose of this study is to explore the use of virtual worlds by people with autism spectrum disorder (ASD), with a particular focus on the virtual world Second Life™. METHOD Case study methodology was selected to explore the experiences of Wolf, a participant with ASD, in Second Life. Wolf participated in three in-depth interviews. The interviews were analyzed using content analysis to identify themes and sub-themes. RESULTS Analysis identified four main themes: social factors and communication, empowerment, virtual world versus physical world, and social cues and body language. CONCLUSION Anecdotally, Wolf's experiences suggest that people with ASD may enjoy using a virtual world and may feel more comfortable communicating in the virtual world context than in the physical world. Virtual worlds offer a venue for people with ASD to be part of a virtual society, lower communication barriers experienced in the physical world, and give participants a unique opportunity to create and maintain friendships. Virtual worlds also offer an arena for people with ASD to meet their peers on equal terms, without depending on social cues, which in the physical world can be a barrier to communication for this group. Further research in this area is required.
Affiliation(s)
- Karen Stendal
- School of Business and Faculty of Social Sciences, Department of Strategy and Finance, Buskerud and Vestfold University College, Hønefoss, Norway
40
Fromberger P, Meyer S, Kempf C, Jordan K, Müller JL. Virtual Viewing Time: The Relationship between Presence and Sexual Interest in Androphilic and Gynephilic Men. PLoS One 2015; 10:e0127156. [PMID: 25992790 PMCID: PMC4436365 DOI: 10.1371/journal.pone.0127156] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2014] [Accepted: 04/13/2015] [Indexed: 11/18/2022] Open
Abstract
Virtual reality (VR) has successfully been used in the research of human behavior for more than twenty years. The main advantage of VR is its capability to induce a high sense of presence, which results in emotions and behavior that are very close to those shown in real situations. In the context of sex research, only a few studies have used high-immersive VR so far; those that did can be found mostly in the field of forensic psychology. Nevertheless, the relationship between presence and sexual interest remains unclear. The present study is the first to examine the advantages of high-immersive VR in comparison to a conventional desktop system regarding their capability to measure sexual interest. Twenty-five gynephilic and twenty androphilic healthy men underwent three experimental conditions, which differed in their ability to induce a sense of presence. In each condition, participants were asked to rate ten male and ten female virtual human characters regarding their sexual attractiveness. Without their knowledge, the subjects' viewing time was recorded throughout the rating. Subjects were then asked to rate the sense of presence they had experienced as well as their perceived realism of the characters. Results suggested that stereoscopic viewing can significantly enhance the subjective sexual attractiveness of sexually relevant characters. Furthermore, in all three conditions participants looked significantly longer at sexually relevant virtual characters than at sexually non-relevant ones. The high-immersion condition provided the best discriminant validity. From a statistical point of view, however, the sense of presence had no significant influence on the discriminant validity of the viewing time task. The study showed that high-immersive virtual environments enhance realism ratings as well as ratings of sexual attractiveness of three-dimensional human stimuli in comparison to standard desktop systems. Results also show that viewing time seems to be influenced neither by the sexual attractiveness nor by the realism of the stimuli, which indicates how important task-specific mechanisms of the viewing time effect are.
Affiliation(s)
- Peter Fromberger
- Ludwig-Meyer-Institute for Forensic Psychiatry and Psychotherapy, Faculty of Medicine, Georg-August-University, Göttingen, Lower Saxony, Germany
- Sabrina Meyer
- Ludwig-Meyer-Institute for Forensic Psychiatry and Psychotherapy, Faculty of Medicine, Georg-August-University, Göttingen, Lower Saxony, Germany
- Christina Kempf
- Ludwig-Meyer-Institute for Forensic Psychiatry and Psychotherapy, Faculty of Medicine, Georg-August-University, Göttingen, Lower Saxony, Germany
- Kirsten Jordan
- Ludwig-Meyer-Institute for Forensic Psychiatry and Psychotherapy, Faculty of Medicine, Georg-August-University, Göttingen, Lower Saxony, Germany
- Jürgen L. Müller
- Ludwig-Meyer-Institute for Forensic Psychiatry and Psychotherapy, Faculty of Medicine, Georg-August-University, Göttingen, Lower Saxony, Germany