1. Caruana N, Nalepka P, Perez GA, Inkley C, Munro C, Rapaport H, Brett S, Kaplan DM, Richardson MJ, Pellicano E. Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions. Autism 2024; 28:1565-1581. PMID: 38006222; PMCID: PMC11134991; DOI: 10.1177/13623613231211967.
Abstract
Lay abstract: Autistic people have been said to have 'problems' with joint attention, that is, looking where someone else is looking. Past studies of joint attention have used tasks that require autistic people to continuously look at and respond to eye-gaze cues. But joint attention can also be done using other social cues, like pointing. This study looked at whether autistic and non-autistic young people use another person's eye gaze during joint attention in a task that did not require them to look at their partner's face. In the task, each participant worked together with their partner to find a computer-generated object in virtual reality. Sometimes the participant had to help guide their partner to the object, and other times, they followed their partner's lead. Participants were told to point to guide one another but were not told to use eye gaze. Both autistic and non-autistic participants often looked at their partner's face during joint attention interactions and were faster to respond to their partner's hand-pointing when the partner also looked at the object before pointing. This shows that autistic people can and do use information from another person's eyes, even when they don't have to. It is possible that, by not forcing autistic young people to look at their partner's face and eyes, they were better able to gather information from their partner's face when needed, without being overwhelmed. This shows how important it is to design tasks that provide autistic people with opportunities to show what they can do.
2. Jording M, Hartz A, Vogel DHV, Schulte-Rüther M, Vogeley K. Impaired recognition of interactive intentions in adults with autism spectrum disorder not attributable to differences in visual attention or coordination via eye contact and joint attention. Sci Rep 2024; 14:8297. PMID: 38594289; PMCID: PMC11004189; DOI: 10.1038/s41598-024-58696-2.
Abstract
Altered nonverbal communication patterns, especially with regard to gaze interactions, are commonly reported for persons with autism spectrum disorder (ASD). In this study, we investigate and differentiate for the first time the interplay of attention allocation, the establishment of a shared focus (eye contact and joint attention), and the recognition of intentions in gaze interactions in adults with ASD compared to control participants. Participants interacted via gaze with a virtual character (VC), which they believed was controlled by another person. Participants were instructed to ascertain whether their partner was trying to interact with them. In fact, the VC was fully algorithm-controlled and showed either interactive or non-interactive gaze behavior. Compared to participants without ASD, participants with ASD were specifically impaired in ascertaining whether their partner was trying to interact with them, whereas neither the allocation of attention nor the ability to establish a shared focus was affected. Thus, perception and production of gaze cues seem preserved, while the evaluation of gaze cues appears to be impaired. An additional exploratory analysis suggests that especially the interpretation of contingencies between the interactants' actions is altered in ASD and should be investigated more closely.
Affiliation(s)
- Mathis Jording
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany.
- Department of Psychiatry, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany.
- Arne Hartz
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University Hospital RWTH, Aachen, Germany
- David H V Vogel
- Department of Psychiatry, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany
- Department of Neurology, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany
- Martin Schulte-Rüther
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Department of Child and Adolescent Psychiatry, Center for Psychosocial Medicine - University Hospital Heidelberg, Ruprechts-Karls University Heidelberg, Heidelberg, Germany
- Department of Child and Adolescent Psychiatry and Psychotherapy, University Medical Center Göttingen, Georg-August University Göttingen, Göttingen, Germany
- Kai Vogeley
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
- Department of Psychiatry, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany
3. Bloch C, Tepest R, Koeroglu S, Feikes K, Jording M, Vogeley K, Falter-Wagner CM. Interacting with autistic virtual characters: intrapersonal synchrony of nonverbal behavior affects participants' perception. Eur Arch Psychiatry Clin Neurosci 2024. PMID: 38270620; DOI: 10.1007/s00406-023-01750-3.
Abstract
Temporal coordination of communicative behavior is located not only between but also within interaction partners (e.g., gaze and gestures). This intrapersonal synchrony (IaPS) is assumed to constitute interpersonal alignment. Studies show systematic variations in IaPS in individuals with autism, which may affect the degree of interpersonal temporal coordination. In the current study, we reversed the approach and mapped the measured nonverbal behavior of interactants with and without ASD from a previous study onto virtual characters to study the effects of the differential IaPS on observers (N = 68), both with and without ASD (crossed design). During a communication task with both characters, who indicated targets with gaze and delayed pointing gestures, we measured response times, gaze behavior, and post hoc impression formation. Results show that character behavior indicative of ASD resulted in overall enlarged decoding times in observers, and this effect was even more pronounced in observers with ASD. A classification of observers' gaze types indicated differentiated decoding strategies. Whereas non-autistic observers presented with a rather consistent eyes-focused strategy associated with efficient and fast responses, observers with ASD presented with highly variable decoding strategies. In contrast to communication efficiency, impression formation was not influenced by IaPS. The results underline the importance of timing differences in both production and perception processes during multimodal nonverbal communication in interactants with and without ASD. In essence, the current findings locate the manifestation of reduced reciprocity in autism not merely in the person, but in the interactional dynamics of the dyad.
Affiliation(s)
- Carola Bloch
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Clinic, Ludwig-Maximilians-University, 80336, Munich, Germany.
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, 50937, Cologne, Germany.
- Ralf Tepest
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, 50937, Cologne, Germany
- Sevim Koeroglu
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, 50937, Cologne, Germany
- Kyra Feikes
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, 50937, Cologne, Germany
- Mathis Jording
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Juelich, 52425, Juelich, Germany
- Kai Vogeley
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, 50937, Cologne, Germany
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Juelich, 52425, Juelich, Germany
- Christine M Falter-Wagner
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Clinic, Ludwig-Maximilians-University, 80336, Munich, Germany
4. Alhasan A, Caruana N. Evidence for the adaptive parsing of non-communicative eye movements during joint attention interactions. PeerJ 2023; 11:e16363. PMID: 38025743; PMCID: PMC10668824; DOI: 10.7717/peerj.16363.
Abstract
During social interactions, the ability to detect and respond to gaze-based joint attention bids often involves the evaluation of non-communicative eye movements. However, very little is known about how well humans can track and parse spatial information from these non-communicative eye movements over time, and the extent to which this influences joint attention outcomes. We investigated this in the current study using an interactive computer-based joint attention game. Using a fully within-subjects design, we specifically examined whether participants were quicker to respond to communicative joint attention bids that followed predictive, as opposed to random or absent, non-communicative gaze behaviour. Our results suggest that in complex, dynamic tasks, people adaptively use and dismiss non-communicative gaze information depending on whether it informs the locus of an upcoming joint attention bid. We further examined the extent to which this ability to track dynamic spatial information was specific to processing gaze information, by comparing performance to a closely matched non-social task in which eye-gaze cues were replaced with dynamic arrow stimuli. Whilst we found that people are also able to track and use dynamic non-social information from arrows, there was clear evidence of a relative advantage for tracking gaze cues during social interactions. The implications of these findings for social neuroscience and autism research are discussed.
Affiliation(s)
- Ayeh Alhasan
- School of Psychological Sciences, Macquarie University, Sydney, New South Wales, Australia
- Perception in Action Research Centre, Macquarie University, Sydney, New South Wales, Australia
- Nathan Caruana
- School of Psychological Sciences, Macquarie University, Sydney, New South Wales, Australia
- Perception in Action Research Centre, Macquarie University, Sydney, New South Wales, Australia
5. Bloch C, Viswanathan S, Tepest R, Jording M, Falter-Wagner CM, Vogeley K. Differentiated, rather than shared, strategies for time-coordinated action in social and non-social domains in autistic individuals. Cortex 2023; 166:207-232. PMID: 37393703; DOI: 10.1016/j.cortex.2023.05.008.
Abstract
Autism spectrum disorder (ASD) is a neurodevelopmental condition with a highly heterogeneous adult phenotype that includes social and non-social behavioral characteristics. The link between the characteristics assignable to the different domains remains unresolved. One possibility is that social and non-social behaviors in autism are modulated by a common underlying deficit. However, here we report evidence supporting an alternative concept that is individual-centered rather than deficit-centered. Individuals are assumed to have a distinctive style in the strategies they adopt to perform social and non-social tasks, with these styles presumably structured differently in autistic individuals than in typically-developed (TD) individuals. We tested this hypothesis for the execution of time-coordinated (synchronized) actions. Participants performed (i) a social task that required synchronized gaze and pointing actions to interact with another person, and (ii) a non-social task that required finger-tapping actions synchronized to periodic stimuli at different time-scales and in different sensory modalities. In both tasks, synchronization behavior differed between the ASD and TD groups. However, a principal component analysis of individual behaviors across tasks revealed associations between social and non-social features for TD persons, but such cross-domain associations were strikingly absent for autistic individuals. The highly differentiated strategies between domains in ASD are inconsistent with a general synchronization deficit and instead highlight the individualized developmental heterogeneity in the acquisition of domain-specific behaviors. We propose a cognitive model to help disentangle individual-centered from deficit-centered effects in other domains. Our findings reinforce the importance of identifying individually differentiated phenotypes to personalize autism therapies.
Affiliation(s)
- Carola Bloch
- Department of Psychiatry and Psychotherapy, LMU University Hospital, LMU Munich, Munich, Germany; Department of Psychiatry, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany.
- Shivakumar Viswanathan
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
- Ralf Tepest
- Department of Psychiatry, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany
- Mathis Jording
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
- Kai Vogeley
- Department of Psychiatry, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany; Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
6. Gaze estimation in videoconferencing settings. Computers in Human Behavior 2023. DOI: 10.1016/j.chb.2022.107517.
7. Scandola M, Cross ES, Caruana N, Tidoni E. Body Form Modulates the Prediction of Human and Artificial Behaviour from Gaze Observation. Int J Soc Robot 2023. DOI: 10.1007/s12369-022-00962-2.
Abstract
The future of human–robot collaboration relies on people's ability to understand and predict robots' actions. The machine-like appearance of robots, as well as contextual information, may influence people's ability to anticipate their behaviour. We conducted six separate experiments to investigate how spatial cues and task instructions modulate people's ability to understand what a robot is doing. Participants observed goal-directed and non-goal-directed gaze shifts made by human and robot agents, as well as directional cues displayed by a triangle. We report that biasing an observer's attention, by showing just one object an agent can interact with, can improve people's ability to understand what humanoid robots will do. Crucially, this cue had no impact on people's ability to predict the upcoming behaviour of the triangle. Moreover, task instructions that focus on the visual and motor consequences of the observed gaze were found to influence mentalising abilities. We suggest that the human-like shape of an agent and its physical capabilities facilitate the prediction of an upcoming action. The reported findings expand current models of gaze perception and may have important implications for human–human and human–robot collaboration.
8. Bloch C, Tepest R, Jording M, Vogeley K, Falter-Wagner CM. Intrapersonal synchrony analysis reveals a weaker temporal coherence between gaze and gestures in adults with autism spectrum disorder. Sci Rep 2022; 12:20417. PMID: 36437262; PMCID: PMC9701674; DOI: 10.1038/s41598-022-24605-8.
Abstract
The temporal encoding of nonverbal signals within individuals, referred to as intrapersonal synchrony (IaPS), is an implicit process and an essential feature of human communication. Based on existing evidence, IaPS is thought to be a marker of nonverbal behavior characteristics in autism spectrum disorder (ASD), but empirical evidence is lacking. The aim of this study was to quantify IaPS in adults during an experimentally controlled real-life interaction task. A sample of adults with a confirmed ASD diagnosis and a matched sample of typically-developed adults were tested (N = 48). Participants were required to indicate the appearance of a target invisible to their interaction partner nonverbally, through gaze and pointing gestures. Special eye-tracking software allowed automated extraction of the temporal delays between nonverbal signals, and of their intrapersonal variability, with millisecond resolution as indices of IaPS. Likelihood ratio tests of multilevel models showed enlarged delays between nonverbal signals in ASD. Larger delays were associated with greater intrapersonal variability in delays. The results provide a quantitative constraint on nonverbal temporality in typically-developed adults and suggest weaker temporal coherence between nonverbal signals in adults with ASD. They point to a potential diagnostic marker and inspire predictive coding theories about the role of IaPS in interpersonal synchronization processes.
Affiliation(s)
- Carola Bloch
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Clinic, Ludwig-Maximilians-University, Nussbaumstraße 7, 80336, Munich, Germany.
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany.
- Ralf Tepest
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany
- Mathis Jording
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
- Kai Vogeley
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
- Christine M Falter-Wagner
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Clinic, Ludwig-Maximilians-University, Nussbaumstraße 7, 80336, Munich, Germany.