1
Caruana N, Nalepka P, Perez GA, Inkley C, Munro C, Rapaport H, Brett S, Kaplan DM, Richardson MJ, Pellicano E. Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions. Autism 2024; 28:1565-1581. PMID: 38006222; PMCID: PMC11134991; DOI: 10.1177/13623613231211967. Indexed: 11/26/2023.
Abstract
LAY ABSTRACT Autistic people have been said to have 'problems' with joint attention, that is, looking where someone else is looking. Past studies of joint attention have used tasks that require autistic people to continuously look at and respond to eye-gaze cues. But joint attention can also be done using other social cues, like pointing. This study looked at whether autistic and non-autistic young people use another person's eye gaze during joint attention in a task that did not require them to look at their partner's face. In the task, each participant worked together with their partner to find a computer-generated object in virtual reality. Sometimes the participant had to help guide their partner to the object, and other times, they followed their partner's lead. Participants were told to point to guide one another but were not told to use eye gaze. Both autistic and non-autistic participants often looked at their partner's face during joint attention interactions and were faster to respond to their partner's hand-pointing when the partner also looked at the object before pointing. This shows that autistic people can and do use information from another person's eyes, even when they don't have to. It is possible that, by not forcing autistic young people to look at their partner's face and eyes, they were better able to gather information from their partner's face when needed, without being overwhelmed. This shows how important it is to design tasks that provide autistic people with opportunities to show what they can do.
2
Bonnaire J, Dumas G, Cassell J. Bringing together multimodal and multilevel approaches to study the emergence of social bonds between children and improve social AI. Frontiers in Neuroergonomics 2024; 5:1290256. PMID: 38827377; PMCID: PMC11140154; DOI: 10.3389/fnrgo.2024.1290256. Received: 09/07/2023; Accepted: 04/29/2024; Indexed: 06/04/2024.
Abstract
This protocol paper outlines an innovative multimodal and multilevel approach to studying the emergence and evolution of how children build social bonds with their peers, and its potential application to improving social artificial intelligence (AI). We detail a unique hyperscanning experimental framework utilizing functional near-infrared spectroscopy (fNIRS) to observe inter-brain synchrony in child dyads during collaborative tasks and social interactions. Our proposed longitudinal study spans middle childhood, aiming to capture the dynamic development of social connections and cognitive engagement in naturalistic settings. To do so we bring together four kinds of data: the multimodal conversational behaviors that dyads of children engage in, evidence of their state of interpersonal rapport, collaborative performance on educational tasks, and inter-brain synchrony. Preliminary pilot data provide foundational support for our approach, indicating promising directions for identifying neural patterns associated with productive social interactions. The planned research will explore the neural correlates of social bond formation, informing the creation of a virtual peer learning partner in the field of Social Neuroergonomics. This protocol promises significant contributions to understanding the neural basis of social connectivity in children, while also offering a blueprint for designing empathetic and effective social AI tools, particularly for educational contexts.
Affiliation(s)
- Guillaume Dumas
- Research Center of the CHU Sainte-Justine, Department of Psychiatry, University of Montréal, Montreal, QC, Canada
- Mila–Quebec Artificial Intelligence Institute, Montreal, QC, Canada
- Justine Cassell
- Inria Paris Centre, Paris, France
- School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, United States
3
Dolinski D, Grzyb T. Obedience to authority as a function of the physical proximity of the student, teacher, and experimenter. The Journal of Social Psychology 2024:1-13. PMID: 38696401; DOI: 10.1080/00224545.2024.2348479. Received: 09/21/2023; Accepted: 04/18/2024; Indexed: 05/04/2024.
Abstract
The authors propose a theoretical model explaining the behavior of individuals tested in experiments on obedience toward authority conducted according to Milgram's paradigm. Their assumption is that the participant faces a typical avoidance-avoidance conflict: the participant does not want to hurt the learner in the adjacent room, but he or she also does not want to harm the experimenter. The solution to this conflict, which entails hurting one of the two, may differ depending on the spatial organization of the experiment. In the study, experimental conditions were modified so that the participant was (vs. was not) in the same room as the experimenter and was (vs. was not) in the same room as the learner. Forty individuals (20 women and 20 men) were tested in each of the four experimental conditions. The physical presence of the experimenter was conducive to obedience, while the physical presence of the learner reduced it.
4
Jording M, Hartz A, Vogel DHV, Schulte-Rüther M, Vogeley K. Impaired recognition of interactive intentions in adults with autism spectrum disorder not attributable to differences in visual attention or coordination via eye contact and joint attention. Sci Rep 2024; 14:8297. PMID: 38594289; PMCID: PMC11004189; DOI: 10.1038/s41598-024-58696-2. Received: 06/22/2023; Accepted: 04/01/2024; Indexed: 04/11/2024. Open access.
Abstract
Altered nonverbal communication patterns, especially with regard to gaze interactions, are commonly reported for persons with autism spectrum disorder (ASD). In this study we investigate and differentiate for the first time the interplay of attention allocation, the establishment of a shared focus (eye contact and joint attention), and the recognition of intentions in gaze interactions in adults with ASD compared to control persons. Participants interacted via gaze with a virtual character (VC), which they believed was controlled by another person. Participants were instructed to ascertain whether their partner was trying to interact with them. In fact, the VC was fully algorithm-controlled and showed either interactive or non-interactive gaze behavior. Compared with participants without ASD, participants with ASD were specifically impaired in ascertaining whether their partner was trying to interact with them, whereas neither the allocation of attention nor the ability to establish a shared focus was affected. Thus, perception and production of gaze cues seem preserved, while the evaluation of gaze cues appears to be impaired. An additional exploratory analysis suggests that especially the interpretation of contingencies between the interactants' actions is altered in ASD and should be investigated more closely.
Affiliation(s)
- Mathis Jording
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany.
- Department of Psychiatry, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany.
- Arne Hartz
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University Hospital RWTH, Aachen, Germany
- David H V Vogel
- Department of Psychiatry, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany
- Department of Neurology, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany
- Martin Schulte-Rüther
- Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University Hospital RWTH, Aachen, Germany
- Department of Child and Adolescent Psychiatry, Center for Psychosocial Medicine - University Hospital Heidelberg, Ruprechts-Karls University Heidelberg, Heidelberg, Germany
- Department of Child and Adolescent Psychiatry and Psychotherapy, University Medical Center Göttingen, Georg-August University Göttingen, Göttingen, Germany
- Kai Vogeley
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
- Department of Psychiatry, Faculty of Medicine, University Hospital Cologne, University of Cologne, Cologne, Germany
5
Polzer L, Schenk M, Raji N, Kleber S, Lemler C, Kitzerow-Cleven J, Kim Z, Freitag CM, Bast N. Temporal progression of pupil dilation and gaze behavior to emotion expressions in preschoolers with autism spectrum disorder. Sci Rep 2024; 14:7843. PMID: 38570565; PMCID: PMC10991397; DOI: 10.1038/s41598-024-58480-2. Received: 12/22/2023; Accepted: 03/29/2024; Indexed: 04/05/2024. Open access.
Abstract
Previous work has shown divergent pupil dilation (PD) and gaze behavior in individuals with autism spectrum disorder (ASD), which may relate to the development of social difficulties in early life. Here, we investigated temporal dynamics of both phenotypes during naturalistic videos of a person displaying facial emotion expressions in 61 autistic and 61 non-autistic preschoolers. PD was segmented into three serial time components derived from a principal component analysis. Growth curve analysis was applied to analyze changes in looking time on eye and mouth regions over time. Groups did not differ in PD time components. Growth curve analysis revealed initially shorter looking times on the eyes and longer looking times on the mouth in autistic versus non-autistic preschoolers. However, a reversion of this pattern was observed over time, suggesting a delayed compensatory increase in eye attention during prolonged viewing periods in autistic children. Positive and negative associations of PD components and gaze behavior over time indicated a dynamic temporal relationship during emotion viewing. Our findings emphasize the need to apply time-sensitive measures in ecologically valid research, which may index etiological mechanisms of social difficulties in ASD.
Affiliation(s)
- Leonie Polzer, Marc Schenk, Naisan Raji, Solvejg Kleber, Christian Lemler, Janina Kitzerow-Cleven, Ziyon Kim, Christine M Freitag, Nico Bast
- Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Autism Research and Intervention Center of Excellence, University Hospital Frankfurt, Goethe-University, Deutschordenstraße 50, 60528, Frankfurt am Main, Germany
6
Williams EH, Chakrabarti B. The integration of head and body cues during the perception of social interactions. Q J Exp Psychol (Hove) 2024; 77:776-788. PMID: 37232389; PMCID: PMC10960325; DOI: 10.1177/17470218231181001. Received: 06/21/2022; Revised: 03/10/2023; Accepted: 05/10/2023; Indexed: 05/27/2023.
Abstract
Humans spend a large proportion of their time participating in social interactions. The ability to accurately detect and respond to human interactions is vital for social functioning, from early childhood through to older adulthood. This detection ability arguably relies on integrating sensory information from the interactants. Within the visual modality, directional information from a person's eyes, head, and body is integrated to infer where another person is looking and who they are interacting with. To date, social cue integration research has focused largely on the perception of isolated individuals. Across two experiments, we investigated whether observers integrate body information with head information when determining whether two people are interacting, and we manipulated frame of reference (one of the interactants facing the observer vs. facing away from the observer) and the visibility of the interactant's eye region. Results demonstrate that individuals integrate information from the body with head information when perceiving dyadic interactions, and that integration is influenced by the frame of reference and the visibility of the eye region. Interestingly, self-reported autistic traits were associated with a stronger influence of body information on interaction perception, but only when the eye region was visible. This study investigated the recognition of dyadic interactions using whole-body stimuli while manipulating eye visibility and frame of reference, and provides crucial insights into social cue integration, as well as how autistic traits affect cue integration, during the perception of social interactions.
Affiliation(s)
- Elin H Williams
- Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, UK
- Bhismadev Chakrabarti
- Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, UK
- India Autism Centre, Kolkata, India
- Department of Psychology, Ashoka University, Sonipat, India
7
Yin W, Lee YC. How different face mask types affect interpersonal distance perception and threat feeling in social interaction. Cogn Process 2024. PMID: 38492094; DOI: 10.1007/s10339-024-01179-z. Received: 04/25/2023; Accepted: 02/05/2024; Indexed: 03/18/2024.
Abstract
With the easing of the pandemic, public policies no longer mandate mask wearing. People can choose not to wear a mask, or to wear different types of masks, based on personal preferences and safety perceptions during daily interaction. Information about how face mask type influences interpersonal distance (IPD) across different age groups is still lacking. Thus, this study aimed to investigate the effects of face mask type (no mask, cloth, medical, and N95) and avatar age group (children, adults, and older adults) on IPD perception, threat feeling, and physiological skin conductance response under active and passive approach. One hundred participants aged 20 to 35 years were recruited. Twelve avatars (three age groups × four face mask conditions) were created and presented in a virtual reality environment. The results showed that age group, mask type, and approach mode had significant effects on IPD and subjective threat feeling; no significant effect was found on skin conductance responses. Participants maintained a significantly longer IPD when facing the older adults, followed by the adults and then the children. In the passive approach condition, people tended to maintain a significantly greater comfort distance than during the active approach. For mask type, people kept the largest IPD when facing an avatar with no mask and the shortest when facing the N95 mask; the difference between the N95 and medical masks was non-significant. Additionally, facing an avatar wearing a medical mask generated the lowest subjective threat feeling compared to the other conditions. These findings indicate that wearing a medical mask can bring people closer for interaction in specific situations.
Understanding that mask wearing, especially of medical masks, shortened IPD compared with the unmasked condition can be utilized to enhance safety measures in crowded public spaces and health-care settings. This information could guide physical distancing recommendations that take into account both the type of mask and the age groups involved, to ensure the maintenance of appropriate distances.
Affiliation(s)
- Wenjing Yin
- School of Design, South China University of Technology, Guangzhou, China
- Yu-Chi Lee
- Department of Industrial Engineering and Management, National Taipei University of Technology, 1, Sec. 3, Zhongxiao E. Rd., Taipei, 10608, Taiwan.
8
Wohltjen S, Wheatley T. Interpersonal eye-tracking reveals the dynamics of interacting minds. Front Hum Neurosci 2024; 18:1356680. PMID: 38532792; PMCID: PMC10963423; DOI: 10.3389/fnhum.2024.1356680. Received: 12/16/2023; Accepted: 02/20/2024; Indexed: 03/28/2024. Open access.
Abstract
The human eye is a rich source of information about where, when, and how we attend. Our gaze paths indicate where and what captures our attention, while changes in pupil size can signal surprise, revealing our expectations. Similarly, the pattern of our blinks suggests levels of alertness and when our attention shifts between external engagement and internal thought. During interactions with others, these cues reveal how we coordinate and share our mental states. To leverage these insights effectively, we need accurate, timely methods to observe these cues as they naturally unfold. Advances in eye-tracking technology now enable real-time observation of these cues, shedding light on mutual cognitive processes that foster shared understanding, collaborative thought, and social connection. This brief review highlights these advances and the new opportunities they present for future research.
Affiliation(s)
- Sophie Wohltjen
- Department of Psychology, University of Wisconsin–Madison, Madison, WI, United States
- Thalia Wheatley
- Department of Psychological and Brain Sciences, Consortium for Interacting Minds, Dartmouth College, Hanover, NH, United States
- Santa Fe Institute, Santa Fe, NM, United States
9
Suslow T, Hoepfel D, Kersting A, Bodenschatz CM. Depressive symptoms and visual attention to others' eyes in healthy individuals. BMC Psychiatry 2024; 24:184. PMID: 38448877; PMCID: PMC10916197; DOI: 10.1186/s12888-024-05633-2. Received: 11/01/2023; Accepted: 02/23/2024; Indexed: 03/08/2024. Open access.
Abstract
BACKGROUND Eye contact is a fundamental part of social interaction. In clinical studies, it has been observed that patients suffering from depression make less eye contact during interviews than healthy individuals, which could contribute to their impairments in social functioning. Similarly, results from mood induction studies with healthy persons indicate that attention to the eyes diminishes as a function of sad mood. The present screen-based eye-tracking study examined whether depressive symptoms in healthy individuals are associated with reduced visual attention to other persons' direct gaze during free viewing. METHODS Gaze behavior of 44 individuals with depressive symptoms and 49 individuals with no depressive symptoms was analyzed in a free viewing task. Grouping was based on the Beck Depression Inventory using the cut-off proposed by Hautzinger et al. (2006). Participants saw pairs of faces with direct gaze showing emotional or neutral expressions. One-half of the face pairs were shown without face masks, whereas the other half were presented with face masks. Participants' dwell times and first fixation durations were analyzed. RESULTS In the case of unmasked facial expressions, participants with depressive symptoms looked at the eyes for a shorter time than individuals without symptoms, across all expression conditions. No group difference in first fixation duration on the eyes of masked and unmasked faces was observed. Individuals with depressive symptoms dwelled longer on the mouth region of unmasked faces. For masked faces, no significant group differences in dwell time on the eyes were found. Moreover, when specifically examining dwell time on the eyes of faces with an emotional expression, there were also no significant differences between groups. Overall, participants gazed significantly longer at the eyes in masked compared to unmasked faces.
CONCLUSIONS For faces without masks, our results suggest that depressiveness in healthy individuals is associated with less visual attention to other persons' eyes, but not with less visual attention to others' faces. When factors that generally amplify attention to the eyes come into play, such as face masks or emotional expressions, no relationship between depressiveness and visual attention to the eyes can be established.
Affiliation(s)
- Thomas Suslow, Dennis Hoepfel, Anette Kersting, Charlott Maria Bodenschatz
- Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Semmelweisstr. 10, 04103, Leipzig, Germany
10
Menendez E, Martínez S, Díaz-de-María F, Balaguer C. Integrating Egocentric and Robotic Vision for Object Identification Using Siamese Networks and Superquadric Estimations in Partial Occlusion Scenarios. Biomimetics (Basel) 2024; 9:100. PMID: 38392146; PMCID: PMC10886810; DOI: 10.3390/biomimetics9020100. Received: 11/21/2023; Revised: 01/31/2024; Accepted: 02/06/2024; Indexed: 02/24/2024. Open access.
Abstract
This paper introduces a novel method that enables robots to identify objects based on user gaze, tracked via eye-tracking glasses. This is achieved without prior knowledge of the objects' categories or their locations and without external markers. The method integrates a two-part system: a category-agnostic object shape and pose estimator using superquadrics and Siamese networks. The superquadrics-based component estimates the shapes and poses of all objects, while the Siamese network matches the object targeted by the user's gaze with the robot's viewpoint. Both components are effectively designed to function in scenarios with partial occlusions. A key feature of the system is the user's ability to move freely around the scenario, allowing dynamic object selection via gaze from any position. The system is capable of handling significant viewpoint differences between the user and the robot and adapts easily to new objects. In tests under partial occlusion conditions, the Siamese networks demonstrated an 85.2% accuracy in aligning the user-selected object with the robot's viewpoint. This gaze-based Human-Robot Interaction approach demonstrates its practicality and adaptability in real-world scenarios.
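At inference time, the Siamese-matching step described in this abstract reduces to a nearest-neighbor search in a learned embedding space: the crop of the user's gaze target is embedded and compared against embeddings of each object in the robot's view. The sketch below illustrates only that final matching step, with toy vectors standing in for the network's learned embeddings; the function names, dimensions, and values are invented for illustration and are not taken from the paper.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_gaze_target(gaze_embedding, robot_embeddings):
    """Return the index of the robot-view object whose embedding is most
    similar to the embedding of the user's gaze-target crop."""
    sims = [cosine_similarity(gaze_embedding, e) for e in robot_embeddings]
    return int(np.argmax(sims))

# Toy embeddings standing in for the Siamese network's output vectors.
robot_embeddings = np.array([
    [0.9, 0.1, 0.0],   # object 0 as seen by the robot
    [0.1, 0.8, 0.1],   # object 1
    [0.0, 0.2, 0.9],   # object 2
])
gaze_embedding = np.array([0.15, 0.75, 0.05])  # user's gaze-target crop

print(match_gaze_target(gaze_embedding, robot_embeddings))  # → 1
```

A Siamese network is trained so that two views of the same object map to nearby embeddings even under large viewpoint changes or partial occlusion, which is why a simple similarity argmax suffices at matching time.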
Affiliation(s)
- Elisabeth Menendez
- System Engineering and Automation Department, University Carlos III, Av de la Universidad, 30, 28911 Madrid, Spain
- Santiago Martínez
- System Engineering and Automation Department, University Carlos III, Av de la Universidad, 30, 28911 Madrid, Spain
- Fernando Díaz-de-María
- Signal Theory and Communications Department, University Carlos III, Av de la Universidad, 30, 28911 Madrid, Spain
- Carlos Balaguer
- System Engineering and Automation Department, University Carlos III, Av de la Universidad, 30, 28911 Madrid, Spain
11
Vanoncini M, Hoehl S, Elsner B, Wallot S, Boll-Avetisyan N, Kayhan E. Mother-infant social gaze dynamics relate to infant brain activity and word segmentation. Dev Cogn Neurosci 2024; 65:101331. PMID: 38113766; PMCID: PMC10770595; DOI: 10.1016/j.dcn.2023.101331. Received: 05/01/2023; Revised: 11/24/2023; Accepted: 12/13/2023; Indexed: 12/21/2023. Open access.
Abstract
The 'social brain', consisting of areas sensitive to social information, supposedly gates the mechanisms involved in human language learning. Early preverbal interactions are guided by ostensive signals, such as gaze patterns, which are coordinated across body, brain, and environment. However, little is known about how the infant brain processes social gaze in naturalistic interactions and how this relates to infant language development. During free-play of 9-month-olds with their mothers, we recorded hemodynamic cortical activity of 'social brain' areas (prefrontal cortex, temporo-parietal junctions) via fNIRS, and micro-coded mother's and infant's social gaze. Infants' speech processing was assessed with a word segmentation task. Using joint recurrence quantification analysis, we examined the connection between infants' 'social brain' activity and the temporal dynamics of social gaze at intrapersonal (i.e., infant's coordination, maternal coordination) and interpersonal (i.e., dyadic coupling) levels. Regression modeling revealed that intrapersonal dynamics in maternal social gaze (but not infant's coordination or dyadic coupling) coordinated significantly with infant's cortical activity. Moreover, recurrence quantification analysis revealed that intrapersonal maternal social gaze dynamics (in terms of entropy) were the best predictor of infants' word segmentation. The findings support the importance of social interaction in language development, particularly highlighting maternal social gaze dynamics.
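Recurrence quantification analysis, named in the abstract above, builds a recurrence matrix from a time series and summarizes line structures within it; the Shannon entropy of diagonal line lengths is one standard RQA measure of dynamic complexity. The following is a rough, self-contained illustration on a toy categorical gaze series; the gaze coding, minimum line length, and the series itself are invented for this sketch and are not the authors' pipeline.

```python
import numpy as np
from collections import Counter

def recurrence_matrix(series):
    """Binary matrix R[i, j] = 1 where the series revisits the same state."""
    s = np.asarray(series)
    return (s[:, None] == s[None, :]).astype(int)

def entropy_of_diagonals(R, lmin=2):
    """Shannon entropy of diagonal line lengths (off the main diagonal)."""
    n = R.shape[0]
    lengths = []
    for k in range(1, n):
        run = 0
        for v in np.diagonal(R, offset=k):
            if v:
                run += 1
            else:
                if run >= lmin:
                    lengths.append(run)
                run = 0
        if run >= lmin:
            lengths.append(run)
    if not lengths:
        return 0.0
    p = np.array(list(Counter(lengths).values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log(p)).sum())

# Toy categorical gaze series: 0 = partner's face, 1 = object, 2 = elsewhere.
gaze = [0, 0, 1, 1, 2, 0, 0, 1, 1, 2, 0, 0]
R = recurrence_matrix(gaze)
rr = R.sum() / R.size          # recurrence rate
ent = entropy_of_diagonals(R)  # diagonal line-length entropy
print(round(rr, 2), round(ent, 2))
```

Higher entropy indicates a wider mix of repeated-pattern durations in the gaze stream, which is the sense in which the study relates maternal gaze "entropy" to infant outcomes.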
Affiliation(s)
- Monica Vanoncini
- Department of Developmental Psychology, University of Potsdam, Karl-Liebknecht-Str. 24-25, 14476 Potsdam, Germany; Department of Linguistics, University of Potsdam, Karl-Liebknecht-Str. 24-25, 14476 Potsdam, Germany; Department of Developmental and Educational Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria; Research Focus Cognitive Sciences, University of Potsdam, Germany.
- Stefanie Hoehl
- Department of Developmental and Educational Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria
- Birgit Elsner
- Department of Developmental Psychology, University of Potsdam, Karl-Liebknecht-Str. 24-25, 14476 Potsdam, Germany; Research Focus Cognitive Sciences, University of Potsdam, Germany
- Sebastian Wallot
- Institute for Sustainability Education and Psychology (ISEP), Leuphana Universität Lüneburg, Universitätsallee 1, 21335, Lüneburg, Germany
- Natalie Boll-Avetisyan
- Department of Linguistics, University of Potsdam, Karl-Liebknecht-Str. 24-25, 14476 Potsdam, Germany; Research Focus Cognitive Sciences, University of Potsdam, Germany
- Ezgi Kayhan
- Department of Developmental Psychology, University of Potsdam, Karl-Liebknecht-Str. 24-25, 14476 Potsdam, Germany; Research Focus Cognitive Sciences, University of Potsdam, Germany; Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstraße 1a, 04103 Leipzig, Germany
12
Ochi K, Kojima M, Ono N, Kuroda M, Owada K, Sagayama S, Yamasue H. Objective assessment of autism spectrum disorder based on performance in structured interpersonal acting-out tasks with prosodic stability and variability. Autism Res 2024; 17:395-409. PMID: 38151701; DOI: 10.1002/aur.3080. Received: 06/13/2023; Accepted: 12/01/2023; Indexed: 12/29/2023.
Abstract
In this study, we sought to objectively and quantitatively characterize the prosodic features of autism spectrum disorder (ASD) in a newly developed structured speech experiment. Male adults with high-functioning ASD and age- and intelligence-matched men with typical development (TD) were asked to read 29 brief scripts aloud in response to preceding auditory stimuli. To investigate whether (1) highly structured acting-out tasks can uncover the prosodic differences between individuals with ASD and TD, and (2) prosodic stability and variability can be used for objective automatic assessment of ASD, we compared prosodic features such as fundamental frequency, intensity, and mora duration. The results indicate that, compared with those with TD, individuals with ASD exhibit stable pitch registers or volume levels in some affective vocal-expression scenarios, such as those involving anger or sadness. However, unstable prosody was observed in some timing-control or emphasis tasks in the participants with ASD. Automatic classification of the ASD and TD groups using a support vector machine (SVM) with speech features achieved an accuracy of 90.4%. A machine learning-based assessment of the degree of ASD core symptoms using support vector regression (SVR) also performed well. These results may inform the development of a new, easy-to-use assessment tool for ASD core symptoms using recorded audio signals.
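The abstract above names the classifier (an SVM over speech features) but not the pipeline. A minimal sketch of that kind of group classification, using scikit-learn on synthetic stand-in features: the feature set, group means, and pipeline details here are invented placeholders, not the authors' data or code.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic per-speaker prosodic summaries: mean and standard deviation of
# fundamental frequency (F0), intensity, and mora duration (values invented).
n_per_group = 30
asd = rng.normal(loc=[120, 8, 60, 3, 150, 20], scale=5, size=(n_per_group, 6))
td = rng.normal(loc=[125, 14, 62, 6, 140, 30], scale=5, size=(n_per_group, 6))
X = np.vstack([asd, td])
y = np.array([1] * n_per_group + [0] * n_per_group)  # 1 = ASD, 0 = TD

# Standardize features, then fit an RBF-kernel SVM, evaluated by
# 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Scaling inside the pipeline matters here: prosodic features live on very different numeric ranges (Hz, dB, ms), and an RBF kernel is distance-based, so unscaled features would let one dimension dominate.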
Affiliation(s)
- Keiko Ochi
- Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Masaki Kojima
- Graduate School of Medicine, University of Tokyo, Tokyo, Japan
- Nobutaka Ono
- Graduate School of Systems Design, Tokyo Metropolitan University, Tokyo, Japan
- Miho Kuroda
- Graduate School of Medicine, University of Tokyo, Tokyo, Japan
- Keiho Owada
- Graduate School of Medicine, University of Tokyo, Tokyo, Japan
- Hidenori Yamasue
- Graduate School of Medicine, University of Tokyo, Tokyo, Japan
- Department of Psychiatry, Hamamatsu University School of Medicine, Hamamatsu City, Japan
13
Adiani D, Breen M, Migovich M, Wade J, Hunt S, Tauseef M, Khan N, Colopietro K, Lanthier M, Swanson A, Vogus TJ, Sarkar N. Multimodal job interview simulator for training of autistic individuals. Assist Technol 2024; 36:22-39. [PMID: 37000014 DOI: 10.1080/10400435.2023.2188907] [Accepted: 03/02/2023] [Indexed: 04/01/2023]
Abstract
Autistic individuals face difficulties in finding and maintaining employment, and studies have shown that the job interview is often a significant barrier to obtaining employment. Prior computer-based job interview training interventions for autistic individuals have been associated with better interview outcomes. These previous interventions, however, do not leverage the use of multimodal data that could give insight into the emotional underpinnings of autistic individuals' challenges in job interviews. In this article, the authors present the design of a novel multimodal job interview training platform called CIRVR that simulates job interviews through spoken interaction and collects eye gaze, facial expressions, and physiological responses of the participants to understand their stress response and their affective state. Results from a feasibility study with 23 autistic participants who interacted with CIRVR are presented. In addition, qualitative feedback was gathered from stakeholders on visualizations of data on CIRVR's visualization tool called the Dashboard. The data gathered indicate the potential of CIRVR along with the Dashboard to be used in the creation of individualized job interview training of autistic individuals.
Affiliation(s)
- Deeksha Adiani
- Computer Science, Vanderbilt University, Nashville, Tennessee, USA
- Michael Breen
- Mechanical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Miroslava Migovich
- Mechanical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Joshua Wade
- Mechanical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Spencer Hunt
- Mechanical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Mahrukh Tauseef
- Electrical and Computer Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Nibraas Khan
- Computer Science, Vanderbilt University, Nashville, Tennessee, USA
- Kelley Colopietro
- Mechanical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Megan Lanthier
- TRIAD, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Amy Swanson
- TRIAD, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Timothy J Vogus
- Owen School of Management, Vanderbilt University, Nashville, Tennessee, USA
- Nilanjan Sarkar
- Computer Science, Vanderbilt University, Nashville, Tennessee, USA
- Mechanical Engineering, Vanderbilt University, Nashville, Tennessee, USA
- Electrical and Computer Engineering, Vanderbilt University, Nashville, Tennessee, USA
14
Quarmley M, Zelinsky G, Athar S, Yang Z, Drucker JH, Samaras D, Jarcho JM. Nonverbal behavioral patterns predict social rejection elicited aggression. Biol Psychol 2023; 183:108670. [PMID: 37652178 PMCID: PMC10591947 DOI: 10.1016/j.biopsycho.2023.108670] [Received: 01/10/2023] [Revised: 08/02/2023] [Accepted: 08/28/2023] [Indexed: 09/02/2023]
Abstract
Aggression elicited by social rejection is costly, prevalent, and often lethal. Attempts to predict rejection-elicited aggression using trait-based data have had little success. This may be because in-the-moment aggression is a complex process influenced by current states of attention, arousal, and affect which are poorly predicted by trait-level characteristics. In a study of young adults (N = 89; 18-25 years), machine learning tested the extent to which nonverbal behavioral indices of attention (eye gaze), arousal (pupillary reactivity), and affect (facial expressions) during a novel social interaction paradigm predicted subsequent aggression towards rejecting and accepting peers. Eye gaze and pupillary reactivity predicted aggressive behavior; predictions were more successful than measures of trait-based aggression and harsh parenting. These preliminary results suggest that nonverbal behavior may elucidate underlying mechanisms of in-the-moment aggression.
Affiliation(s)
- M Quarmley
- Department of Psychology, Temple University, Philadelphia, PA, United States
- G Zelinsky
- Department of Psychology, Stony Brook University, Stony Brook, NY, United States
- S Athar
- Department of Computer Science, Stony Brook University, Stony Brook, NY, United States
- Z Yang
- Department of Computer Science, Stony Brook University, Stony Brook, NY, United States
- D Samaras
- Department of Computer Science, Stony Brook University, Stony Brook, NY, United States
- J M Jarcho
- Department of Psychology, Temple University, Philadelphia, PA, United States.
15
Jayashankar A, Bynum B, Butera C, Kilroy E, Harrison L, Aziz-Zadeh L. Connectivity differences between inferior frontal gyrus and mentalizing network in autism as compared to developmental coordination disorder and non-autistic youth. Cortex 2023; 167:115-131. [PMID: 37549452 PMCID: PMC10543516 DOI: 10.1016/j.cortex.2023.06.014] [Received: 03/14/2023] [Revised: 06/08/2023] [Accepted: 06/15/2023] [Indexed: 08/09/2023]
Abstract
Prior studies have compared neural connectivity during mentalizing tasks in autism (ASD) to non-autistic individuals and found reduced connectivity between the inferior frontal gyrus (IFG) and mentalizing regions. However, given that the IFG is involved in motor processing, and about 80% of autistic individuals have motor-related difficulties, it is necessary to explore if these differences are specific to ASD or instead similar across other developmental motor disorders, such as developmental coordination disorder (DCD). Participants (29 ASD, 20 DCD, 31 typically developing [TD]; ages 8-17) completed a mentalizing task in the fMRI scanner, where they were asked to think about why someone was performing an action. Results indicated that the ASD group, as compared to both TD and DCD groups, showed significant functional connectivity differences when mentalizing about others' actions. The left IFG seed revealed ASD connectivity differences with the bilateral temporoparietal junction (TPJ), left insular cortex, and bilateral dorsolateral prefrontal cortex (DLPFC). The right IFG seed revealed ASD connectivity differences in the left insula and right DLPFC. These results indicate that connectivity differences between the IFG, mentalizing regions, and emotion and motor processing regions are specific to ASD and not a result of potentially co-occurring motor differences.
Affiliation(s)
- Aditya Jayashankar
- Center for Neuroscience of Embodied Cognition (CeNEC), Brain and Creativity Institute, Dornsife College of Letters, Arts and Sciences, University of Southern California, Los Angeles, CA, USA; USC Mrs. T.H. Chan Division of Occupational Science and Occupational Therapy, University of Southern California, Los Angeles, CA, USA
- Brittany Bynum
- Center for Neuroscience of Embodied Cognition (CeNEC), Brain and Creativity Institute, Dornsife College of Letters, Arts and Sciences, University of Southern California, Los Angeles, CA, USA; USC Mark and Mary Stevens Neuroimaging and Informatics Institute, University of Southern California, Los Angeles, CA, USA
- Christiana Butera
- Center for Neuroscience of Embodied Cognition (CeNEC), Brain and Creativity Institute, Dornsife College of Letters, Arts and Sciences, University of Southern California, Los Angeles, CA, USA; USC Mrs. T.H. Chan Division of Occupational Science and Occupational Therapy, University of Southern California, Los Angeles, CA, USA
- Emily Kilroy
- Center for Neuroscience of Embodied Cognition (CeNEC), Brain and Creativity Institute, Dornsife College of Letters, Arts and Sciences, University of Southern California, Los Angeles, CA, USA; USC Mrs. T.H. Chan Division of Occupational Science and Occupational Therapy, University of Southern California, Los Angeles, CA, USA
- Laura Harrison
- Center for Neuroscience of Embodied Cognition (CeNEC), Brain and Creativity Institute, Dornsife College of Letters, Arts and Sciences, University of Southern California, Los Angeles, CA, USA; USC Mrs. T.H. Chan Division of Occupational Science and Occupational Therapy, University of Southern California, Los Angeles, CA, USA
- Lisa Aziz-Zadeh
- Center for Neuroscience of Embodied Cognition (CeNEC), Brain and Creativity Institute, Dornsife College of Letters, Arts and Sciences, University of Southern California, Los Angeles, CA, USA; USC Mrs. T.H. Chan Division of Occupational Science and Occupational Therapy, University of Southern California, Los Angeles, CA, USA.
16
Trujillo JP, Holler J. Interactionally Embedded Gestalt Principles of Multimodal Human Communication. PERSPECTIVES ON PSYCHOLOGICAL SCIENCE 2023; 18:1136-1159. [PMID: 36634318 PMCID: PMC10475215 DOI: 10.1177/17456916221141422] [Indexed: 01/14/2023]
Abstract
Natural human interaction requires us to produce and process many different signals, including speech, hand and head gestures, and facial expressions. These communicative signals, which occur in a variety of temporal relations with each other (e.g., parallel or temporally misaligned), must be rapidly processed as a coherent message by the receiver. In this contribution, we introduce the notion of interactionally embedded, affordance-driven gestalt perception as a framework that can explain how this rapid processing of multimodal signals is achieved as efficiently as it is. We discuss empirical evidence showing how basic principles of gestalt perception can explain some aspects of unimodal phenomena such as verbal language processing and visual scene perception but require additional features to explain multimodal human communication. We propose a framework in which high-level gestalt predictions are continuously updated by incoming sensory input, such as unfolding speech and visual signals. We outline the constituent processes that shape high-level gestalt perception and their role in perceiving relevance and prägnanz. Finally, we provide testable predictions that arise from this multimodal interactionally embedded gestalt-perception framework. This review and framework therefore provide a theoretically motivated account of how we may understand the highly complex, multimodal behaviors inherent in natural social interaction.
Affiliation(s)
- James P. Trujillo
- Donders Institute for Brain, Cognition, and Behaviour, Nijmegen, the Netherlands
- Max Planck Institute for Psycholinguistics, Nijmegen, the Netherlands
- Judith Holler
- Donders Institute for Brain, Cognition, and Behaviour, Nijmegen, the Netherlands
- Max Planck Institute for Psycholinguistics, Nijmegen, the Netherlands
17
Landsiedel J, Koldewyn K. Auditory dyadic interactions through the "eye" of the social brain: How visual is the posterior STS interaction region? IMAGING NEUROSCIENCE (CAMBRIDGE, MASS.) 2023; 1:1-20. [PMID: 37719835 PMCID: PMC10503480 DOI: 10.1162/imag_a_00003] [Received: 05/16/2023] [Accepted: 05/17/2023] [Indexed: 09/19/2023]
Abstract
Human interactions contain potent social cues that meet not only the eye but also the ear. Although research has identified a region in the posterior superior temporal sulcus as being particularly sensitive to visually presented social interactions (SI-pSTS), its response to auditory interactions has not been tested. Here, we used fMRI to explore brain response to auditory interactions, with a focus on temporal regions known to be important in auditory processing and social interaction perception. In Experiment 1, monolingual participants listened to two-speaker conversations (intact or sentence-scrambled) and one-speaker narrations in both a known and an unknown language. Speaker number and conversational coherence were explored in separately localised regions-of-interest (ROI). In Experiment 2, bilingual participants were scanned to explore the role of language comprehension. Combining univariate and multivariate analyses, we found initial evidence for a heteromodal response to social interactions in SI-pSTS. Specifically, right SI-pSTS preferred auditory interactions over control stimuli and represented information about both speaker number and interactive coherence. Bilateral temporal voice areas (TVA) showed a similar, but less specific, profile. Exploratory analyses identified another auditory-interaction sensitive area in anterior STS. Indeed, direct comparison suggests modality specific tuning, with SI-pSTS preferring visual information while aSTS prefers auditory information. Altogether, these results suggest that right SI-pSTS is a heteromodal region that represents information about social interactions in both visual and auditory domains. Future work is needed to clarify the roles of TVA and aSTS in auditory interaction perception and further probe right SI-pSTS interaction-selectivity using non-semantic prosodic cues.
Affiliation(s)
- Julia Landsiedel
- Department of Psychology, School of Human and Behavioural Sciences, Bangor University, Bangor, United Kingdom
- Kami Koldewyn
- Department of Psychology, School of Human and Behavioural Sciences, Bangor University, Bangor, United Kingdom
18
Wolf A, Tripanpitak K, Umeda S, Otake-Matsuura M. Eye-tracking paradigms for the assessment of mild cognitive impairment: a systematic review. Front Psychol 2023; 14:1197567. [PMID: 37546488 PMCID: PMC10399700 DOI: 10.3389/fpsyg.2023.1197567] [Received: 03/31/2023] [Accepted: 06/19/2023] [Indexed: 08/08/2023]
Abstract
Mild cognitive impairment (MCI), representing the 'transitional zone' between normal cognition and dementia, has become a novel topic in clinical research. Although early detection is crucial, it remains logistically challenging at the same time. While traditional pen-and-paper tests require in-depth training to ensure standardized administration and accurate interpretation of findings, significant technological advancements are leading to the development of procedures for the early detection of Alzheimer's disease (AD) and facilitating the diagnostic process. Some of the diagnostic protocols, however, show significant limitations that hamper their widespread adoption. Concerns about the social and economic implications of the increasing incidence of AD underline the need for reliable, non-invasive, cost-effective, and timely cognitive scoring methodologies. For instance, modern clinical studies report significant oculomotor impairments among patients with MCI, who perform poorly in visual paired-comparison tasks by ascribing less attentional resources to novel stimuli. To accelerate the Global Action Plan on the Public Health Response to Dementia 2017-2025, this work provides an overview of research on saccadic and exploratory eye-movement deficits among older adults with MCI. The review protocol was drafted based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Electronic databases were systematically searched to identify peer-reviewed articles published between 2017 and 2022 that examined visual processing in older adults with MCI and reported gaze parameters as potential biomarkers. Moreover, following the contemporary trend for remote healthcare technologies, we reviewed studies that implemented non-commercial eye-tracking instrumentation in order to detect information processing impairments among the MCI population. 
Based on the gathered literature, eye-tracking-based paradigms may ameliorate the screening limitations of traditional cognitive assessments and contribute to early AD detection. However, in order to translate the findings pertaining to abnormal gaze behavior into clinical applications, it is imperative to conduct longitudinal investigations in both laboratory-based and ecologically valid settings.
Affiliation(s)
- Alexandra Wolf
- Cognitive Behavioral Assistive Technology (CBAT), Goal-Oriented Technology Group, RIKEN Center for Advanced Intelligence Project (AIP), Tokyo, Japan
- Department of Neuropsychiatry, Graduate School of Medical Sciences, Kyushu University, Fukuoka, Japan
- Kornkanok Tripanpitak
- Cognitive Behavioral Assistive Technology (CBAT), Goal-Oriented Technology Group, RIKEN Center for Advanced Intelligence Project (AIP), Tokyo, Japan
- Satoshi Umeda
- Department of Psychology, Keio University, Tokyo, Japan
- Mihoko Otake-Matsuura
- Cognitive Behavioral Assistive Technology (CBAT), Goal-Oriented Technology Group, RIKEN Center for Advanced Intelligence Project (AIP), Tokyo, Japan
19
Morillo-Mendez L, Stower R, Sleat A, Schreiter T, Leite I, Mozos OM, Schrooten MGS. Can the robot "see" what I see? Robot gaze drives attention depending on mental state attribution. Front Psychol 2023; 14:1215771. [PMID: 37519379 PMCID: PMC10374202 DOI: 10.3389/fpsyg.2023.1215771] [Received: 05/02/2023] [Accepted: 06/27/2023] [Indexed: 08/01/2023]
Abstract
Mentalizing, where humans infer the mental states of others, facilitates understanding and interaction in social situations. Humans also tend to adopt mentalizing strategies when interacting with robotic agents. There is an ongoing debate about how inferred mental states affect gaze following, a key component of joint attention. Although the gaze from a robot induces gaze following, the impact of mental state attribution on robotic gaze following remains unclear. To address this question, we asked forty-nine young adults to perform a gaze cueing task during which mental state attribution was manipulated as follows. Participants sat facing a robot that turned its head to the screen at its left or right. Their task was to respond to targets that appeared either at the screen the robot gazed at or at the other screen. At the baseline, the robot was positioned so that participants would perceive it as being able to see the screens. We expected faster response times to targets at the screen the robot gazed at than targets at the non-gazed screen (i.e., gaze cueing effect). In the experimental condition, the robot's line of sight was occluded by a physical barrier such that participants would perceive it as unable to see the screens. Our results revealed gaze cueing effects in both conditions although the effect was reduced in the occluded condition compared to the baseline. These results add to the expanding fields of social cognition and human-robot interaction by suggesting that mentalizing has an impact on robotic gaze following.
Affiliation(s)
- Rebecca Stower
- Division of Robotics, Perception and Learning, KTH, Stockholm, Sweden
- Alex Sleat
- Division of Robotics, Perception and Learning, KTH, Stockholm, Sweden
- Tim Schreiter
- Centre for Applied Autonomous Sensor Systems, Örebro University, Örebro, Sweden
- Iolanda Leite
- Division of Robotics, Perception and Learning, KTH, Stockholm, Sweden
20
Saggar M, Bruno JL, Hall SS. Brief intensive social gaze training reorganizes functional brain connectivity in boys with fragile X syndrome. Cereb Cortex 2023; 33:5218-5227. [PMID: 36376964 PMCID: PMC10151883 DOI: 10.1093/cercor/bhac411] [Received: 06/20/2022] [Revised: 09/19/2022] [Accepted: 09/20/2022] [Indexed: 11/16/2022]
Abstract
Boys with fragile X syndrome (FXS), the leading known genetic cause of autism spectrum disorder (ASD), demonstrate significant impairments in social gaze and associated weaknesses in communication, social interaction, and other areas of adaptive functioning. Little is known, however, concerning the impact of behavioral treatments for these behaviors on functional brain connectivity in this population. As part of a larger study, boys with FXS (mean age 13.23 ± 2.31 years) and comparison boys with ASD (mean age 12.15 ± 2.76 years) received resting-state functional magnetic resonance imaging scans prior to and following social gaze training administered by a trained behavior therapist in our laboratory. Network-agnostic connectome-based predictive modeling of pretreatment resting-state functional connectivity data revealed a set of positive (FXS > ASD) and negative (FXS < ASD) edges that differentiated the groups significantly and consistently across all folds of cross-validation. Following administration of the brief training, the FXS and ASD groups demonstrated reorganization of connectivity differences. The divergence in the spatial pattern of reorganization response, based on functional connectivity differences pretreatment, suggests a unique pattern of response to treatment in the FXS and ASD groups. These results provide further support for implementing targeted behavioral treatments to ameliorate syndrome-specific behavioral features in FXS.
Affiliation(s)
- Manish Saggar
- Division of Interdisciplinary Brain Sciences, Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, Stanford, CA 94304, United States
- Jennifer L Bruno
- Division of Interdisciplinary Brain Sciences, Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, Stanford, CA 94304, United States
- Scott S Hall
- Division of Interdisciplinary Brain Sciences, Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, Stanford, CA 94304, United States
21
Freeth M, Morgan EJ. I see you, you see me: the impact of social presence on social interaction processes in autistic and non-autistic people. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210479. [PMID: 36871584 PMCID: PMC9985964 DOI: 10.1098/rstb.2021.0479] [Received: 05/31/2022] [Accepted: 12/23/2022] [Indexed: 03/07/2023]
Abstract
Environments that require social interaction are complex, challenging and sometimes experienced as overwhelming by autistic people. However, all too often theories relating to social interaction processes are created, and interventions are proposed, on the basis of data collected from studies that neither involve genuine social encounters nor consider the perception of social presence to be a potentially influential factor. In this review, we begin by considering why face-to-face interaction research is important in this field. We then discuss how the perception of social agency and social presence can influence conclusions about social interaction processes. We then outline some insights gained from face-to-face interaction research conducted with both autistic and non-autistic people. We finish by considering the impact of social presence on cognitive processes more broadly, including theory of mind. Overall, we demonstrate that the choice of stimuli in studies assessing social interaction processes has the potential to substantially alter conclusions drawn. Ecological validity matters and social presence, in particular, is a critical factor that fundamentally impacts social interaction processes in both autistic and non-autistic people. This article is part of a discussion meeting issue 'Face2face: advancing the science of social interaction'.
Affiliation(s)
- Megan Freeth
- Department of Psychology, The University of Sheffield, Sheffield S1 2LT, UK
- Emma J. Morgan
- Department of Psychology, The University of Sheffield, Sheffield S1 2LT, UK
22
Bloch C, Tepest R, Jording M, Vogeley K, Falter-Wagner CM. Intrapersonal synchrony analysis reveals a weaker temporal coherence between gaze and gestures in adults with autism spectrum disorder. Sci Rep 2022; 12:20417. [PMID: 36437262 PMCID: PMC9701674 DOI: 10.1038/s41598-022-24605-8] [Received: 01/17/2022] [Accepted: 11/17/2022] [Indexed: 11/29/2022]
Abstract
The temporal encoding of nonverbal signals within individuals, referred to as intrapersonal synchrony (IaPS), is an implicit process and essential feature of human communication. Based on existing evidence, IaPS is thought to be a marker of nonverbal behavior characteristics in autism spectrum disorders (ASD), but there is a lack of empirical evidence. The aim of this study was to quantify IaPS in adults during an experimentally controlled real-life interaction task. A sample of adults with a confirmed ASD diagnosis and a matched sample of typically-developed adults were tested (N = 48). Participants were required to indicate the appearance of a target invisible to their interaction partner nonverbally through gaze and pointing gestures. Special eye-tracking software allowed automated extraction of temporal delays between nonverbal signals and their intrapersonal variability with millisecond temporal resolution as indices for IaPS. Likelihood ratio tests of multilevel models showed enlarged delays between nonverbal signals in ASD. Larger delays were associated with greater intrapersonal variability in delays. The results provide a quantitative constraint on nonverbal temporality in typically-developed adults and suggest weaker temporal coherence between nonverbal signals in adults with ASD. The results provide a potential diagnostic marker and inspire predictive coding theories about the role of IaPS in interpersonal synchronization processes.
Affiliation(s)
- Carola Bloch
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Clinic, Ludwig-Maximilians-University, Nussbaumstraße 7, 80336, Munich, Germany
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany
- Ralf Tepest
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany
- Mathis Jording
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
- Kai Vogeley
- Department of Psychiatry and Psychotherapy, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany
- Cognitive Neuroscience, Institute of Neuroscience and Medicine (INM-3), Forschungszentrum Jülich, Jülich, Germany
- Christine M Falter-Wagner
- Department of Psychiatry and Psychotherapy, Medical Faculty, LMU Clinic, Ludwig-Maximilians-University, Nussbaumstraße 7, 80336, Munich, Germany
23
Nicklas A, Rückel LM, Noël B, Varga M, Kleinert J, Boss M, Klatt S. Gaze behavior in social interactions between beach volleyball players—An exploratory approach. Front Psychol 2022; 13:945389. [DOI: 10.3389/fpsyg.2022.945389] [Received: 05/16/2022] [Accepted: 09/13/2022] [Indexed: 11/13/2022]
Abstract
Previous research has indicated that social interactions and gaze behavior analyses in a group setting could be essential tools in accomplishing group objectives. However, only a few studies have examined the impact of social interactions on group dynamics in team sports and their influence on team performance. This study aimed to investigate the effects of game performance pressure on gaze behavior within social interactions between beach volleyball players during game-like situations. Therefore, 18 expert beach volleyball players completed a high and a low game performance pressure condition while wearing an eye tracking system. The results indicate that higher game performance pressure leads to more and longer fixations on teammates’ faces. A greater need for communication without misunderstandings could explain this adaptation: looking at the teammate’s face longer and more often could improve the uptake of verbal and non-verbal information. Further, players showed inter-individual strategies for coping with high game performance pressure in their gaze behavior, for example, increasing the number and duration of fixations on the teammate’s face. This study thereby opens a new avenue for research on social interaction and how it is influenced in and through sport.
24
Morillo-Mendez L, Schrooten MGS, Loutfi A, Mozos OM. Age-Related Differences in the Perception of Robotic Referential Gaze in Human-Robot Interaction. Int J Soc Robot 2022:1-13. [PMID: 36185773 PMCID: PMC9510350 DOI: 10.1007/s12369-022-00926-6] [Accepted: 09/08/2022] [Indexed: 11/12/2022]
Abstract
There is an increased interest in using social robots to assist older adults during their daily life activities. As social robots are designed to interact with older users, it becomes relevant to study these interactions under the lens of social cognition. Gaze following, the social ability to infer where other people are looking, deteriorates with older age. Therefore, the referential gaze from robots might not be an effective social cue to indicate spatial locations to older users. In this study, we explored the performance of older adults, middle-aged adults, and younger controls in a task assisted by the referential gaze of a Pepper robot. We examined age-related differences in task performance, and in self-reported social perception of the robot. Our main findings show that referential gaze from a robot benefited task performance, although the magnitude of this facilitation was lower for older participants. Moreover, perceived anthropomorphism of the robot varied less as a result of its referential gaze in older adults. This research supports that social robots, even if limited in their gazing capabilities, can be effectively perceived as social entities. Additionally, this research suggests that robotic social cues, usually validated with young participants, might be less optimal signs for older adults. Supplementary Information The online version contains supplementary material available at 10.1007/s12369-022-00926-6.
Affiliation(s)
- Lucas Morillo-Mendez
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
- Amy Loutfi
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
- Oscar Martinez Mozos
- Centre for Applied Autonomous Sensor Systems, Örebro University, Fakultetsgatan 1, Örebro, 702 81 Sweden
25
Artiran S, Ravisankar R, Luo S, Chukoskie L, Cosman P. Measuring Social Modulation of Gaze in Autism Spectrum Condition With Virtual Reality Interviews. IEEE Trans Neural Syst Rehabil Eng 2022; 30:2373-2384. [PMID: 35969548 DOI: 10.1109/tnsre.2022.3198933] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Gaze behavior in dyadic conversations can indicate active listening and attention. However, gaze behavior that differs from the engagement expected in neurotypical social interaction may be interpreted as disinterest or inattention, which can be problematic in both personal and professional situations. Neurodivergent individuals, such as those with autism spectrum conditions, often exhibit broad social communication differences, including in gaze behavior. This project aims to support situational social gaze practice through a virtual reality (VR) mock job interview using the HTC Vive Pro Eye VR headset. We show how gaze behavior varies in the mock job interview between neurodivergent and neurotypical participants. We also investigate the social modulation of gaze behavior based on conversational role (speaking and listening). Our three main contributions are: (i) a system for fully automatic analysis of social modulation of gaze behavior using a portable VR headset with a novel realistic mock job interview, (ii) a signal processing pipeline, employing Kalman filtering and spatial-temporal density-based clustering techniques, that can improve the accuracy of the headset's built-in eye tracker, and (iii) the first investigation of social modulation of gaze behavior among neurotypical and neurodivergent individuals in immersive VR.
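The abstract names Kalman filtering as one stage of its gaze-cleaning pipeline but gives no implementation details. As a rough illustration only, the sketch below applies a one-dimensional constant-position Kalman filter to simulated noisy gaze samples; the function name, noise parameters, and data are invented for this example and are not taken from the paper.

```python
import random

def kalman_smooth_gaze(samples, process_var=1e-4, meas_var=4.0):
    """Smooth a 1-D stream of noisy gaze coordinates with a
    constant-position Kalman filter; returns the filtered estimates."""
    x, p = samples[0], 1.0            # initial state estimate and its variance
    out = []
    for z in samples:
        p += process_var              # predict step: position assumed static
        k = p / (p + meas_var)        # Kalman gain
        x += k * (z - x)              # correct with the new measurement z
        p *= 1.0 - k
        out.append(x)
    return out

random.seed(1)
true_pos = 5.0                                        # a steady fixation location
noisy = [true_pos + random.gauss(0, 2.0) for _ in range(200)]
smoothed = kalman_smooth_gaze(noisy)
```

After convergence, the filtered trace hugs the true fixation location far more tightly than the raw tracker samples, which is the property such a pipeline exploits before clustering fixations.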
26
McCrackin SD, Provencher S, Mendell E, Ristic J. Transparent masks reduce the negative impact of opaque masks on understanding emotional states but not on sharing them. Cogn Res Princ Implic 2022; 7:59. [PMID: 35796906 PMCID: PMC9261140 DOI: 10.1186/s41235-022-00411-8] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/15/2021] [Accepted: 06/20/2022] [Indexed: 12/04/2022] Open
Abstract
While face masks provide necessary protection against disease spread, they occlude the lower face parts (chin, mouth, nose) and consequently impair the ability to accurately perceive facial emotions. Here we examined how wearing face masks impacted making inferences about emotional states of others (i.e., affective theory of mind; Experiment 1) and sharing of emotions with others (i.e., affective empathy; Experiment 2). We also investigated whether wearing transparent masks ameliorated the occlusion impact of opaque masks. Participants viewed emotional faces presented within matching positive (happy), negative (sad), or neutral contexts. The faces wore opaque masks, transparent masks, or no masks. In Experiment 1, participants rated the protagonists’ emotional valence and intensity. In Experiment 2, they indicated their empathy for the protagonist and the valence of their emotion. Wearing opaque masks impacted both affective theory of mind and affective empathy ratings. Compared to no masks, wearing opaque masks resulted in assumptions that the protagonist was feeling less intense and more neutral emotions. Wearing opaque masks also reduced positive empathy for the protagonist and resulted in more neutral shared valence ratings. Wearing transparent masks restored the affective theory of mind ratings but did not restore empathy ratings. Thus, wearing face masks impairs nonverbal social communication, with transparent masks able to restore some of the negative effects brought about by opaque masks. Implications for the theoretical understanding of socioemotional processing as well as for educational and professional settings are discussed.
Affiliation(s)
- Sarah D McCrackin
- Department of Psychology, McGill University, 2001 McGill College Avenue, Montreal, QC, H3A 1G1, Canada.
- Sabrina Provencher
- Department of Psychology, McGill University, 2001 McGill College Avenue, Montreal, QC, H3A 1G1, Canada
- Ethan Mendell
- Department of Psychology, McGill University, 2001 McGill College Avenue, Montreal, QC, H3A 1G1, Canada
- Jelena Ristic
- Department of Psychology, McGill University, 2001 McGill College Avenue, Montreal, QC, H3A 1G1, Canada.
27
Bowsher-Murray C, Gerson S, von dem Hagen E, Jones CRG. The Components of Interpersonal Synchrony in the Typical Population and in Autism: A Conceptual Analysis. Front Psychol 2022; 13:897015. [PMID: 35734455 PMCID: PMC9208202 DOI: 10.3389/fpsyg.2022.897015] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2022] [Accepted: 05/16/2022] [Indexed: 01/18/2023] Open
Abstract
Interpersonal synchrony – the tendency for social partners to temporally co-ordinate their behaviour when interacting – is a ubiquitous feature of social interactions. Synchronous interactions play a key role in development, and promote social bonding and a range of pro-social behavioural outcomes across the lifespan. The process of achieving and maintaining interpersonal synchrony is highly complex, with inputs required from across perceptual, temporal, motor, and socio-cognitive domains. In this conceptual analysis, we synthesise evidence from across these domains to establish the key components underpinning successful non-verbal interpersonal synchrony, how such processes interact, and factors that may moderate their operation. We also consider emerging evidence that interpersonal synchrony is reduced in autistic populations. We use our account of the components contributing to interpersonal synchrony in the typical population to identify potential points of divergence in interpersonal synchrony in autism. The relationship between interpersonal synchrony and broader aspects of social communication in autism is also considered, together with implications for future research.
Affiliation(s)
- Claire Bowsher-Murray
- Wales Autism Research Centre, School of Psychology, Cardiff University, Cardiff, United Kingdom
- Cardiff University Centre for Human Developmental Science, School of Psychology, Cardiff University, Cardiff, United Kingdom
- Sarah Gerson
- Cardiff University Centre for Human Developmental Science, School of Psychology, Cardiff University, Cardiff, United Kingdom
- Elisabeth von dem Hagen
- Wales Autism Research Centre, School of Psychology, Cardiff University, Cardiff, United Kingdom
- Cardiff University Centre for Human Developmental Science, School of Psychology, Cardiff University, Cardiff, United Kingdom
- Cardiff University Brain Imaging Research Centre, School of Psychology, Cardiff University, Cardiff, United Kingdom
- Catherine R. G. Jones
- Wales Autism Research Centre, School of Psychology, Cardiff University, Cardiff, United Kingdom
- Cardiff University Centre for Human Developmental Science, School of Psychology, Cardiff University, Cardiff, United Kingdom
28
Betriana F, Tanioka R, Yokotani T, Matsumoto K, Zhao Y, Osaka K, Miyagawa M, Kai Y, Schoenhofer S, Locsin RC, Tanioka T. Characteristics of interactive communication between Pepper robot, patients with schizophrenia, and healthy persons. BELITUNG NURSING JOURNAL 2022; 8:176-184. [PMID: 37521889 PMCID: PMC10386810 DOI: 10.33546/bnj.1998] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2021] [Revised: 01/17/2022] [Accepted: 03/12/2022] [Indexed: 08/01/2023] Open
Abstract
Background Expressing enjoyment when conversing with healthcare robots is an opportunity to enhance the value of humanoid robots with interactive capabilities. In clinical practice, verbal dysfunctions are common in patients with schizophrenia. Thus, interactive communication characteristics may vary between the Pepper robot, persons with schizophrenia, and healthy persons. Objective Two case studies aimed to describe the characteristics of interactive communications, 1) between Pepper as a healthcare robot and two patients with schizophrenia, and 2) between Pepper as a healthcare robot and two healthy persons. Case Report The "Intentional Observational Clinical Research Design" was used to collect data. Using audio-video technology, the conversational interactions of the four participants with the Pepper healthcare robot were recorded. Their interactions were observed, with significant events noted. Afterwards, the four participants were interviewed regarding their experience and impressions of interacting with the Pepper healthcare robot. Audio-video recordings were analyzed following the analysis and interpretation protocol, and the interview data were transcribed, analyzed, and interpreted. Discussion There were similarities and differences in the interactive communication characteristics between the Pepper robot and the two participants with schizophrenia and between Pepper and the two healthy participants. The similarity was the experience of human enjoyment while interacting with the Pepper robot. This enjoyment was enhanced by the expectation that the Pepper robot could entertain and possessed interactive, two-way conversational abilities. However, the healthy participants' impressions of the Pepper robot differed from those of the participants with schizophrenia: healthy participants understood Pepper to be an automaton, with responses to questions often constrained and, on many occasions, displaying inaccurate gaze. Conclusion The Pepper robot showed capabilities for effective communication in expressing enjoyment. The accuracy and appropriateness of gaze remained a critical characteristic regardless of the situation or occasion, in interactions with persons with schizophrenia as well as with healthy persons. For the effective future use of healthcare robots with multiple users, improvements in the appropriateness of gaze, response time during conversation, and entertainment functions are critical.
Affiliation(s)
- Feni Betriana
- Graduate School of Health Sciences, Tokushima University, Tokushima, Japan
- Ryuichi Tanioka
- Graduate School of Health Sciences, Tokushima University, Tokushima, Japan
- Tomoya Yokotani
- Graduate School of Health Sciences, Tokushima University, Tokushima, Japan
- Kazuyuki Matsumoto
- Graduate School of Technology, Industrial and Social Sciences, Tokushima University, Tokushima, Japan
- Yueren Zhao
- Department of Psychiatry, School of Medicine, Fujita Health University, Aichi, Japan
- Kyoko Osaka
- Department of Clinical Nursing, Kochi Medical School, Kochi University, Kochi, Japan
- Misao Miyagawa
- Department of Nursing, Faculty of Health and Welfare, Tokushima Bunri University, Tokushima, Japan
- Yoshihiro Kai
- Department of Mechanical Engineering, Tokai University, Kanagawa, Japan
- Savina Schoenhofer
- Anne Boykin Institute, Florida Atlantic University, Boca Raton, FL 33431–0991, USA
- Rozzano C. Locsin
- Tokushima University, Tokushima, Japan
- Florida Atlantic University, Boca Raton, FL 33431, USA
- Tetsuya Tanioka
- Department of Nursing Outcome Management, Institute of Biomedical Sciences, Tokushima University, Tokushima, Japan
29
Lim K, Rapisarda A, Keefe RSE, Lee J. Social skills, negative symptoms and real-world functioning in individuals at ultra-high risk of psychosis. Asian J Psychiatr 2022; 69:102996. [PMID: 35026654 DOI: 10.1016/j.ajp.2021.102996] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/24/2021] [Revised: 11/28/2021] [Accepted: 12/27/2021] [Indexed: 11/02/2022]
Abstract
BACKGROUND Impairment in real-world social functioning is observed in individuals at Ultra-High Risk (UHR) of psychosis. Both social skills and negative symptoms appear to influence real-world functioning. This study aims to examine the psychometric properties of a social skills measure, the High-Risk Social Challenge task (HiSoC), and to evaluate the relationship between social skills, negative symptoms, and real-world functioning in UHR individuals. METHODS HiSoC data were analysed in 87 UHR individuals and 358 healthy controls. Exploratory factor analysis (EFA) was used to evaluate the factor structure of the HiSoC task. Convergent and divergent validity were assessed. Negative symptoms were assessed on the Positive and Negative Syndrome Scale (PANSS) and real-world functioning was indexed by the Global Assessment of Functioning (GAF). Commonality analysis was used to partition the unique and shared variance of HiSoC and negative symptoms with real-world functioning. RESULTS EFA yielded a three-factor structure of the HiSoC consisting of Affect, Odd behaviour and language, and Social-interpersonal. The HiSoC task discriminated between UHR individuals and healthy controls (p < 0.001, Cohen's d = 0.437-0.598). Commonality analysis revealed that the unique variance of the social amotivation subdomain of negative symptoms was the strongest predictor of GAF (p < 0.001, R2 = 0.480). Shared variance of 3.7% between HiSoC Social-interpersonal and social amotivation was observed in relation to functioning. CONCLUSION The HiSoC is a psychometrically valid task that is sensitive in identifying social skill deficits in UHR individuals. While social skills are related to functioning, experiential negative symptoms appear to be an important target for improving real-world functional outcomes.
Affiliation(s)
- Keane Lim
- Research Division, Institute of Mental Health, Singapore
- Attilio Rapisarda
- Research Division, Institute of Mental Health, Singapore; Neuroscience and Behavioural Disorders, Duke-NUS Medical School, Singapore
- Richard S E Keefe
- Department of Psychiatry and Behavioral Sciences, Duke University Medical Center, Durham, NC, United States
- Jimmy Lee
- Research Division, Institute of Mental Health, Singapore; Department of Psychosis, Institute of Mental Health, Singapore; Neuroscience and Mental Health, Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore.
30
Vehlen A, Standard W, Domes G. How to choose the size of facial areas of interest in interactive eye tracking. PLoS One 2022; 17:e0263594. [PMID: 35120188 PMCID: PMC8815978 DOI: 10.1371/journal.pone.0263594] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2021] [Accepted: 01/21/2022] [Indexed: 11/18/2022] Open
Abstract
Advances in eye tracking technology have enabled the development of interactive experimental setups to study social attention. Since these setups differ substantially from the eye tracker manufacturer's test conditions, validation is essential with regard to the quality of gaze data and other factors potentially threatening the validity of this signal. In this study, we evaluated the impact of accuracy and areas of interest (AOIs) size on the classification of simulated gaze (fixation) data. We defined AOIs of different sizes using the Limited-Radius Voronoi-Tessellation (LRVT) method, and simulated gaze data for facial target points with varying accuracy. As hypothesized, we found that accuracy and AOI size had strong effects on gaze classification. In addition, these effects were not independent and differed in falsely classified gaze inside AOIs (Type I errors; false alarms) and falsely classified gaze outside the predefined AOIs (Type II errors; misses). Our results indicate that smaller AOIs generally minimize false classifications as long as accuracy is good enough. For studies with lower accuracy, Type II errors can still be compensated to some extent by using larger AOIs, but at the cost of more probable Type I errors. Proper estimation of accuracy is therefore essential for making informed decisions regarding the size of AOIs in eye tracking research.
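A minimal sketch of the AOI-classification idea the abstract evaluates: with the Limited-Radius Voronoi-Tessellation approach, a gaze sample is assigned to the nearest AOI seed point only if it falls within the chosen radius, otherwise it is left unclassified. The landmark labels and pixel coordinates below are hypothetical.

```python
import math

# Hypothetical facial landmarks (screen coordinates in pixels); the real
# LRVT method seeds the Voronoi cells with detected facial feature points.
landmarks = {"left_eye": (430, 310), "right_eye": (530, 310),
             "nose": (480, 370), "mouth": (480, 440)}

def classify_gaze(point, aois, radius):
    """Limited-radius Voronoi lookup: return the label of the nearest AOI
    seed if the gaze point lies within `radius` of it, else None
    (the gaze sample falls outside every AOI)."""
    label, d = min(((name, math.dist(point, seed)) for name, seed in aois.items()),
                   key=lambda t: t[1])
    return label if d <= radius else None
```

The radius directly trades the two error types from the study: a small radius rejects inaccurate samples near a cell (more misses, fewer false alarms), while a large radius captures them (fewer misses, more false alarms).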
Affiliation(s)
- Antonia Vehlen
- Department of Psychology, Biological and Clinical Psychology, University of Trier, Trier, Germany
- William Standard
- Department of Psychology, Biological and Clinical Psychology, University of Trier, Trier, Germany
- Gregor Domes
- Department of Psychology, Biological and Clinical Psychology, University of Trier, Trier, Germany
31
Fairhurst MT, McGlone F, Croy I. Affective touch: a communication channel for social exchange. Curr Opin Behav Sci 2022. [DOI: 10.1016/j.cobeha.2021.07.007] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/30/2022]
32
Wahn B, Schmitz L, Kingstone A, Böckler-Raettig A. When eyes beat lips: speaker gaze affects audiovisual integration in the McGurk illusion. PSYCHOLOGICAL RESEARCH 2021; 86:1930-1943. [PMID: 34854983 PMCID: PMC9363401 DOI: 10.1007/s00426-021-01618-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2020] [Accepted: 11/10/2021] [Indexed: 11/26/2022]
Abstract
Eye contact is a dynamic social signal that captures attention and plays a critical role in human communication. In particular, direct gaze often accompanies communicative acts in an ostensive function: a speaker directs her gaze towards the addressee to highlight the fact that this message is being intentionally communicated to her. The addressee, in turn, integrates the speaker’s auditory and visual speech signals (i.e., her vocal sounds and lip movements) into a unitary percept. It is an open question whether the speaker’s gaze affects how the addressee integrates the speaker’s multisensory speech signals. We investigated this question using the classic McGurk illusion, an illusory percept created by presenting mismatching auditory (vocal sounds) and visual information (speaker’s lip movements). Specifically, we manipulated whether the speaker (a) moved his eyelids up/down (i.e., open/closed his eyes) prior to speaking or did not show any eye motion, and (b) spoke with open or closed eyes. When the speaker’s eyes moved (i.e., opened or closed) before an utterance, and when the speaker spoke with closed eyes, the McGurk illusion was weakened (i.e., addressees reported significantly fewer illusory percepts). In line with previous research, this suggests that motion (opening or closing), as well as the closed state of the speaker’s eyes, captured addressees’ attention, thereby reducing the influence of the speaker’s lip movements on the addressees’ audiovisual integration process. Our findings reaffirm the power of speaker gaze to guide attention, showing that its dynamics can modulate low-level processes such as the integration of multisensory speech signals.
Affiliation(s)
- Basil Wahn
- Department of Psychology, Leibniz Universität Hannover, Hannover, Germany.
- Laura Schmitz
- Institute of Sports Science, Leibniz Universität Hannover, Hannover, Germany
- Alan Kingstone
- Department of Psychology, University of British Columbia, Vancouver, BC, Canada
33
Gillespie-Smith K, Hendry G, Anduuru N, Laird T, Ballantyne C. Using social media to be 'social': Perceptions of social media benefits and risk by autistic young people, and parents. RESEARCH IN DEVELOPMENTAL DISABILITIES 2021; 118:104081. [PMID: 34507053 DOI: 10.1016/j.ridd.2021.104081] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/28/2020] [Revised: 08/06/2021] [Accepted: 08/20/2021] [Indexed: 06/13/2023]
Abstract
Autistic individuals are reported to struggle with aspects of social interaction. Past research has shown that social media use can help to facilitate social functioning; however, perceptions of the risks and benefits of engaging on social media platforms remain unclear. The current study aimed to explore perceptions of social media participation, in terms of online risk and online relationships, among both autistic young people and parents. Eight autistic young people and six parents of autistic young people took part in semi-structured interviews, with the resultant data transcribed and analysed using Braun and Clarke's (2006) inductive thematic analysis. Two themes were identified in relation to the impact social media has on autistic young people's relationships (Socialisation; Communication) and two themes were identified in relation to the perceived barriers and risks of engaging online (Abusive interactions; Talking to strangers). These findings show that social media interaction is of particular value to young autistic people, affording them easier social interactions than they would have in 'real life'. The findings also show that the autistic young people were aware of risks online and considered ways to manage this risk. Future research is needed to understand whether similar interactions and risks take place across all platforms and whether online communication is successful between matched or mixed autistic and non-autistic groups.
Affiliation(s)
- Karri Gillespie-Smith
- Department of Clinical Psychology, School of Health in Social Science, University of Edinburgh, Edinburgh, UK.
- Gillian Hendry
- Division of Psychology, School of Education and Social Sciences, University of West of Scotland, Paisley, UK
- Nicole Anduuru
- Division of Psychology, School of Education and Social Sciences, University of West of Scotland, Paisley, UK
- Tracey Laird
- Division of Psychology, School of Education and Social Sciences, University of West of Scotland, Paisley, UK
- Carrie Ballantyne
- Division of Psychology, School of Education and Social Sciences, University of West of Scotland, Paisley, UK
34
Caruana N, Inkley C, Nalepka P, Kaplan DM, Richardson MJ. Gaze facilitates responsivity during hand coordinated joint attention. Sci Rep 2021; 11:21037. [PMID: 34702900 PMCID: PMC8548595 DOI: 10.1038/s41598-021-00476-3] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/05/2021] [Accepted: 10/13/2021] [Indexed: 11/18/2022] Open
Abstract
The coordination of attention between individuals is a fundamental part of everyday human social interaction. Previous work has focused on the role of gaze information for guiding responses during joint attention episodes. However, in many contexts, hand gestures such as pointing provide another valuable source of information about the locus of attention. The current study developed a novel virtual reality paradigm to investigate the extent to which initiator gaze information is used by responders to guide joint attention responses in the presence of more visually salient and spatially precise pointing gestures. Dyads were instructed to use pointing gestures to complete a cooperative joint attention task in a virtual environment. Eye and hand tracking enabled real-time interaction and provided objective measures of gaze and pointing behaviours. Initiators displayed gaze behaviours that were spatially congruent with the subsequent pointing gestures. Responders overtly attended to the initiator’s gaze during the joint attention episode. However, both these initiator and responder behaviours were highly variable across individuals. Critically, when responders did overtly attend to their partner’s face, their saccadic reaction times were faster when the initiator’s gaze was also congruent with the pointing gesture, and thus predictive of the joint attention location. These results indicate that humans attend to and process gaze information to facilitate joint attention responsivity, even in contexts where gaze information is implicit to the task and joint attention is explicitly cued by more spatially precise and visually salient pointing gestures.
Affiliation(s)
- Nathan Caruana
- Department of Cognitive Science, Macquarie University, 16 University Ave, Sydney, NSW, 2109, Australia
- Perception in Action Research Centre, Macquarie University, Sydney, Australia
- Christine Inkley
- Department of Cognitive Science, Macquarie University, 16 University Ave, Sydney, NSW, 2109, Australia
- Patrick Nalepka
- Perception in Action Research Centre, Macquarie University, Sydney, Australia
- Department of Psychology, Macquarie University, Sydney, Australia
- Centre for Elite Performance, Expertise and Training, Macquarie University, Sydney, Australia
- David M Kaplan
- Department of Cognitive Science, Macquarie University, 16 University Ave, Sydney, NSW, 2109, Australia
- Perception in Action Research Centre, Macquarie University, Sydney, Australia
- Centre for Elite Performance, Expertise and Training, Macquarie University, Sydney, Australia
- Michael J Richardson
- Perception in Action Research Centre, Macquarie University, Sydney, Australia
- Department of Psychology, Macquarie University, Sydney, Australia
- Centre for Elite Performance, Expertise and Training, Macquarie University, Sydney, Australia
35
Kesner L, Adámek P, Grygarová D. How Neuroimaging Can Aid the Interpretation of Art. Front Hum Neurosci 2021; 15:702473. [PMID: 34594192 PMCID: PMC8476868 DOI: 10.3389/fnhum.2021.702473] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/29/2021] [Accepted: 08/10/2021] [Indexed: 11/24/2022] Open
Abstract
Cognitive neuroscience of art continues to be criticized for failing to provide interesting results about art itself. In particular, results of brain imaging experiments have not yet been utilized in interpretation of particular works of art. Here we revisit a recent study in which we explored the neuronal and behavioral response to painted portraits with a direct versus an averted gaze. We then demonstrate how fMRI results can be related to the art historical interpretation of a specific painting. The evidentiary status of neuroimaging data is not different from any other extra-pictorial facts that art historians uncover in their research and relate to their account of the significance of a work of art. They are not explanatory in a strong sense, yet they provide supportive evidence for the art writer’s inference about the intended meaning of a given work. We thus argue that brain imaging can assume an important role in the interpretation of particular art works.
Affiliation(s)
- Ladislav Kesner
- National Institute of Mental Health, Klecany, Czechia
- Faculty of Arts, Masaryk University, Brno, Czechia
- Petr Adámek
- National Institute of Mental Health, Klecany, Czechia
- Third Faculty of Medicine, Charles University, Prague, Czechia
36
McCrackin SD, Itier RJ. I can see it in your eyes: Perceived gaze direction impacts ERP and behavioural measures of affective theory of mind. Cortex 2021; 143:205-222. [PMID: 34455372 DOI: 10.1016/j.cortex.2021.05.024] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/01/2020] [Revised: 04/12/2021] [Accepted: 05/21/2021] [Indexed: 10/20/2022]
Abstract
Looking at someone's eyes is thought to be important for affective theory of mind (aTOM), our ability to infer their emotional state. However, it is unknown whether an individual's gaze direction influences our aTOM judgements and what the time course of this influence might be. We presented participants with sentences describing individuals in positive, negative or neutral scenarios, followed by direct or averted gaze neutral face pictures of those individuals. Participants made aTOM judgements about each person's mental state, including their affective valence and arousal, and we investigated whether the face's gaze direction impacted those judgements. Participants rated gazers as feeling more positive when they displayed direct gaze as opposed to averted gaze, and as feeling more aroused during negative contexts when gaze was averted as opposed to direct. Event-related potentials associated with face perception and affective processing were examined using mass-univariate analyses to track the time course of this eye-gaze and affective processing interaction at a neural level. Both positive and negative trials were differentiated from neutral trials at many stages of processing. This included the early N200 and EPN components, believed to reflect automatic activation of emotion-related areas and attentional selection, respectively. It also included the later P300 and LPP components, thought to reflect elaborative cognitive appraisal of emotional content. Critically, sentence valence and gaze direction interacted over these later components, which may reflect the incorporation of eye gaze into the cognitive evaluation of another's emotional state. The results suggest that gaze perception directly impacts aTOM processes, and that altered eye-gaze processing in clinical populations may contribute to associated aTOM impairments.
Affiliation(s)
- Roxane J Itier
- Department of Psychology, University of Waterloo, Waterloo, Canada.
37
Shawahna R, Jaber M, Yahya N, Jawadeh F, Rawajbeh S. Are medical students in Palestine adequately trained to care for individuals with autism spectrum disorders? A multicenter cross-sectional study of their familiarity, knowledge, confidence, and willingness to learn. BMC MEDICAL EDUCATION 2021; 21:424. [PMID: 34376162 PMCID: PMC8356397 DOI: 10.1186/s12909-021-02865-8] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/28/2021] [Accepted: 07/26/2021] [Indexed: 06/13/2023]
Abstract
BACKGROUND Medical students are the future workforce of physicians in primary, secondary, tertiary, and highly specialized care centers. The present study was undertaken to assess familiarity, knowledge, confidence, of medical students with regard to autism spectrum disorders (ASDs). METHODS This multicenter study was conducted in a cross-sectional design among medical students in the 3 main universities in Palestine. In addition to the sociodemographic and academic details, the questionnaire measured familiarity (8-items), knowledge (12-items), confidence and willingness to learn (5-items) with regard to ASDs. RESULTS The questionnaire was completed by309 medical students (response rate = 77.3 %). The median familiarity, knowledge, and confidence scores were 50 % (42.5 %, 57.5 %), 50 % (41.7 %, 66.7 %), and 60.0 % (54.0 %, 68.0 %), respectively. There was a positive moderate correlation between familiarity and knowledge scores (Spearman's rho = 0.29, p-value < 0.001) and familiarity and confidence scores (Spearman's rho = 0.34, p-value < 0.001). Medical students who have received a course on autism were 3.08-fold (95 % C.I. of 1.78-5.31) more likely to score ≥ 50 % on the familiarity items compared to those who did not receive a course. The medical students who were in their clinical academic stage, who received a course on ASDs, and those who interacted with individuals with ASDs were 2.36-fold (95 % C.I. of 1.34-4.18), 2.66-fold (95 % C.I. of 1.52-4.65), and 2.59-fold (95 % C.I. of 1.44-4.63) more likely to score ≥ 50 % on the knowledge items. Medical students who reported high satisfaction with their social life were 2.84-fold (95 % C.I. of 1.15-7.00) more likely to score ≥ 50 % on the confidence items. CONCLUSIONS The present study identified considerable awareness and knowledge gaps among medical students with regard to ASDs. Medical students in this study reported low confidence in their ability to provide healthcare services to individuals with ASDs. 
Appropriately designed educational interventions might improve the familiarity, knowledge, and confidence of medical students. More studies are still needed to investigate whether such interventions can improve healthcare services for individuals with ASDs.
Affiliation(s)
- Ramzi Shawahna
- Department of Physiology, Pharmacology and Toxicology, Faculty of Medicine and Health Sciences, An-Najah National University, P.O. Box 7, New Campus, Building: 19, Office: 1340, Nablus, Palestine.
- An-Najah BioSciences Unit, Centre for Poisons Control, Chemical and Biological Analyses, An-Najah National University, Nablus, Palestine.
- Mohammad Jaber
- Department of Medicine, Faculty of Medicine and Health Sciences, An-Najah National University, Nablus, Palestine
- An-Najah National University Hospital, An-Najah National University, Nablus, Palestine
- Nourhan Yahya
- Department of Medicine, Faculty of Medicine and Health Sciences, An-Najah National University, Nablus, Palestine
- Firdaous Jawadeh
- Department of Medicine, Faculty of Medicine and Health Sciences, An-Najah National University, Nablus, Palestine
- Shahd Rawajbeh
- Department of Medicine, Faculty of Medicine and Health Sciences, An-Najah National University, Nablus, Palestine
|
38
|
Me looking at you, looking at me: The stare-in-the-crowd effect and autism spectrum disorder. J Psychiatr Res 2021; 140:101-109. [PMID: 34102517 DOI: 10.1016/j.jpsychires.2021.05.050] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/03/2020] [Revised: 03/31/2021] [Accepted: 05/21/2021] [Indexed: 11/20/2022]
Abstract
INTRODUCTION The stare-in-the-crowd (SITC) effect describes the ability to detect self-directed gaze in a crowd. Given the importance of gaze detection in initiating and maintaining social interactions, there is a need to better characterize the SITC effect. METHODS Autistic and neurotypical young adults were presented with four SITC conditions. Eye-tracking outcomes and arousal were compared by diagnosis and condition using repeated-measures analysis of variance. Hierarchical regression was used to explore behavioral measures. RESULTS A significant interaction of diagnosis and condition was found for eye-tracking outcomes. Overall, autistic participants exhibited less looking than neurotypical participants. Interest area dwell time, fixation count, and second fixation duration were significantly higher for conditions with shifting gaze, as well as conditions with more self-directed gaze across participants. Two hierarchical regression models of gaze behaviors with advanced theory of mind as a predictor were significant. DISCUSSION Autistic individuals respond to various gaze conditions in similar patterns to neurotypical individuals, but to a lesser extent. These findings offer important targets for social interventions.
|
39
|
Murata A, Nomura K, Watanabe J, Kumano S. Interpersonal physiological synchrony is associated with first person and third person subjective assessments of excitement during cooperative joint tasks. Sci Rep 2021; 11:12543. [PMID: 34131193 PMCID: PMC8206359 DOI: 10.1038/s41598-021-91831-x] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/11/2020] [Accepted: 05/28/2021] [Indexed: 02/05/2023] Open
Abstract
Interpersonal physiological synchrony has been shown to play important roles in social activities. While most studies have shed light on the effects of physiological synchrony on recognition of the group state, such as cohesion or togetherness, the effect of physiological synchrony on the recognition of emotional experience has not been adequately researched. In this study, we examined how physiological synchrony is associated with first- and third-person emotion recognition during a joint task. Two participants played a cooperative block-stacking game (Jenga), alternating their roles as player and adviser, while their heart rates were recorded. The participants evaluated their own emotional experience for each turn. Bystanders watched the game to evaluate the players' emotions. Results showed that the players' subjective excitement increased not only with their own heart rate, but also with increased heart rate synchrony with their adviser. Heart rate synchrony between player and adviser was also related to increased intensity in perceived excitement from the bystanders. Given that both first- and third-person emotion recognition can have cumulative impacts on a group, the relationship between physiological synchrony and emotion recognition observed in the present study will help deepen understanding of the psychophysiological mechanisms underlying larger group phenomena such as crowd excitement.
Affiliation(s)
- Aiko Murata
- NTT Communication Science Laboratories, NTT Corporation, Atsugi, Japan
- Keishi Nomura
- NTT Communication Science Laboratories, NTT Corporation, Atsugi, Japan; Graduate School of Education, The University of Tokyo, Tokyo, Japan
- Junji Watanabe
- NTT Communication Science Laboratories, NTT Corporation, Atsugi, Japan
- Shiro Kumano
- NTT Communication Science Laboratories, NTT Corporation, Atsugi, Japan
|
40
|
Haensel JX, Smith TJ, Senju A. Cultural differences in mutual gaze during face-to-face interactions: A dual head-mounted eye-tracking study. VISUAL COGNITION 2021. [DOI: 10.1080/13506285.2021.1928354] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/15/2022]
Affiliation(s)
- Jennifer X. Haensel
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
- Tim J. Smith
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
- Atsushi Senju
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
|
41
|
Aseeri S, Interrante V. The Influence of Avatar Representation on Interpersonal Communication in Virtual Social Environments. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2021; 27:2608-2617. [PMID: 33750710 DOI: 10.1109/tvcg.2021.3067783] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Current avatar representations used in immersive VR applications lack features that may be important for supporting natural behaviors and effective communication among individuals. This study investigates the impact of the visual and nonverbal cues afforded by three different types of avatar representations in the context of several cooperative tasks. The avatar types we compared are No_Avatar (HMD and controllers only), Scanned_Avatar (wearing an HMD), and Real_Avatar (video-see-through). The subjective and objective measures we used to assess the quality of interpersonal communication include surveys of social presence, interpersonal trust, communication satisfaction, and attention to behavioral cues, plus two behavioral measures: duration of mutual gaze and number of unique words spoken. We found that participants reported higher levels of trustworthiness in the Real_Avatar condition compared to the Scanned_Avatar and No_Avatar conditions. They also reported a greater level of attentional focus on facial expressions compared to the No_Avatar condition and spent more time, for some tasks, attempting to engage in mutual gaze behavior compared to the Scanned_Avatar and No_Avatar conditions. In both the Real_Avatar and Scanned_Avatar conditions, participants reported higher levels of co-presence compared with the No_Avatar condition. In the Scanned_Avatar condition, compared with the Real_Avatar and No_Avatar conditions, participants reported higher levels of attention to body posture. Overall, our exit survey revealed that a majority of participants (66.67%) reported a preference for the Real_Avatar, compared with 25.00% for the Scanned_Avatar and 8.33% for the No_Avatar. These findings provide novel insight into how a user's experience in a social VR scenario is affected by the type of avatar representation provided.
|
42
|
Ramamoorthy N, Jamieson O, Imaan N, Plaisted-Grant K, Davis G. Enhanced detection of gaze toward an object: Sociocognitive influences on visual search. Psychon Bull Rev 2021; 28:494-502. [PMID: 33174087 PMCID: PMC8062376 DOI: 10.3758/s13423-020-01841-5] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 10/26/2020] [Indexed: 11/17/2022]
Abstract
Another person's gaze direction is a rich source of social information, especially eyes gazing toward prominent or relevant objects. To guide attention to these important stimuli, visual search mechanisms may incorporate sophisticated coding of eye-gaze and its spatial relationship to other objects. Alternatively, any guidance might reflect the action of simple perceptual 'templates' tuned to visual features of socially relevant objects, or intrinsic salience of direct-gazing eyes for human vision. Previous findings that direct gaze (toward oneself) is prioritised over averted gaze do not distinguish between these accounts. To resolve this issue, we compared search for eyes gazing toward a prominent object versus gazing away, finding more efficient search for eyes 'gazing toward' the object. This effect was most clearly seen in target-present trials when gaze was task-relevant. Visual search mechanisms appear to specify gazer-object relations, a computational building-block of theory of mind.
Affiliation(s)
- Oliver Jamieson
- Department of Psychology, University of Cambridge, Cambridge, UK
- Nahiyan Imaan
- Department of Psychology, University of Cambridge, Cambridge, UK
- Greg Davis
- Department of Psychology, University of Cambridge, Cambridge, UK
|
43
|
Casanova M, Clavreul A, Soulard G, Delion M, Aubin G, Ter Minassian A, Seguier R, Menei P. Immersive Virtual Reality and Ocular Tracking for Brain Mapping During Awake Surgery: Prospective Evaluation Study. J Med Internet Res 2021; 23:e24373. [PMID: 33759794 PMCID: PMC8074984 DOI: 10.2196/24373] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2020] [Revised: 01/26/2021] [Accepted: 02/16/2021] [Indexed: 01/14/2023] Open
Abstract
Background Language mapping during awake brain surgery is currently a standard procedure. However, mapping is rarely performed for other cognitive functions that are important for social interaction, such as visuospatial cognition and nonverbal language, including facial expressions and eye gaze. The main reason for this omission is the lack of tasks that are fully compatible with the restrictive environment of an operating room and awake brain surgery procedures. Objective This study aims to evaluate the feasibility and safety of a virtual reality headset (VRH) equipped with an eye-tracking device that is able to promote an immersive visuospatial and social virtual reality (VR) experience for patients undergoing awake craniotomy. Methods We recruited 15 patients with brain tumors near language and/or motor areas. Language mapping was performed with a naming task, DO 80, presented on a computer tablet and then in 2D and 3D via the VRH. Patients were also immersed in a visuospatial and social VR experience. Results None of the patients experienced VR sickness; 2 patients had an intraoperative focal seizure without consequence, and there was no reason to attribute these seizures to VRH use. The patients were able to perform the VR tasks. Eye tracking was functional, enabling the medical team to analyze the patients’ attention and exploration of the visual field of the VRH directly. Conclusions We found that it is possible and safe to immerse the patient in an interactive virtual environment during awake brain surgery, paving the way for new VR-based brain mapping procedures. Trial Registration ClinicalTrials.gov NCT03010943; https://clinicaltrials.gov/ct2/show/NCT03010943.
Affiliation(s)
- Morgane Casanova
- Équipe Facial Analysis Synthesis & Tracking, Institute of Electronics and Digital Technologies, CentraleSupélec, Rennes, France
- Anne Clavreul
- Département de Neurochirurgie, Centre hospitalier universitaire d'Angers, Angers, France; Centre de Recherche en Cancérologie et Immunologie Nantes Angers, Université d'Angers, Centre hospitalier universitaire d'Angers, Angers, France
- Gwénaëlle Soulard
- Département de Neurochirurgie, Centre hospitalier universitaire d'Angers, Angers, France; Centre de Recherche en Cancérologie et Immunologie Nantes Angers, Université d'Angers, Centre hospitalier universitaire d'Angers, Angers, France
- Matthieu Delion
- Département de Neurochirurgie, Centre hospitalier universitaire d'Angers, Angers, France
- Ghislaine Aubin
- Département de Neurochirurgie, Centre hospitalier universitaire d'Angers, Angers, France
- Aram Ter Minassian
- Département d'Anesthésie-Réanimation, Centre hospitalier universitaire d'Angers, Angers, France
- Renaud Seguier
- Équipe Facial Analysis Synthesis & Tracking, Institute of Electronics and Digital Technologies, CentraleSupélec, Rennes, France
- Philippe Menei
- Département de Neurochirurgie, Centre hospitalier universitaire d'Angers, Angers, France; Centre de Recherche en Cancérologie et Immunologie Nantes Angers, Université d'Angers, Centre hospitalier universitaire d'Angers, Angers, France
|
44
|
The Effectiveness of Mirroring- and Rhythm-Based Interventions for Children with Autism Spectrum Disorder: a Systematic Review. REVIEW JOURNAL OF AUTISM AND DEVELOPMENTAL DISORDERS 2021. [DOI: 10.1007/s40489-021-00236-z] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/02/2023]
|
45
|
Dissociable effects of averted "gaze" on the priming of bodily representations and motor actions. Acta Psychol (Amst) 2021; 212:103225. [PMID: 33260014 DOI: 10.1016/j.actpsy.2020.103225] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2020] [Revised: 11/12/2020] [Accepted: 11/17/2020] [Indexed: 01/18/2023] Open
Abstract
Gaze direction is an important stimulus that signals key details about social (dis)engagement and objects in our physical environment. Here, we explore how gaze direction influences the perceiver's processing of bodily information. Specifically, we examined how averted versus direct gaze modifies the operation of effector-centered representations (i.e., specific fingers) versus movement-centered representations (i.e., finger actions). Study 1 used a stimulus-response compatibility paradigm that tested the priming of a relevant effector or relevant movement, after observing videos of direct or averted gaze. We found a selective priming of relevant effectors, but only after averted gaze videos. Study 2 found similar priming effects with symbolic direction cues (averted arrows). Study 3 found that averted gaze cues do not influence generic spatial compatibility effects, and thus, are specific to body representations. In sum, this research suggests that both human and symbolic averted cues selectively prime relevant body-part representations, highlighting the dynamic interplay between our bodies, minds, and environments.
|
46
|
Pavic K, Oker A, Chetouani M, Chaby L. Age-related changes in gaze behaviour during social interaction: An eye-tracking study with an embodied conversational agent. Q J Exp Psychol (Hove) 2020; 74:1128-1139. [PMID: 33283649 DOI: 10.1177/1747021820982165] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/30/2023]
Abstract
Previous research has highlighted age-related differences in social perception, in particular emotional expression processing. To date, such studies have largely focused on approaches that use static emotional stimuli that the participant has to identify passively without the possibility of any interaction. In this study, we propose an interactive virtual environment to better address age-related variations in social and emotional perception. A group of 22 young (18-30 years) and 20 older (60-80 years) adults were engaged in a face-to-face conversation with an embodied conversational agent. Participants were invited to interact naturally with the agent and to identify his facial expression. Their gaze behaviour was captured by an eye-tracking device throughout the interaction. We also explored whether the Big Five personality traits (particularly extraversion) and anxiety modulated gaze during the social interaction. Findings suggested that age-related differences in gaze behaviour were only apparent when decoding social signals (i.e., listening to a partner's question, identifying facial expressions) and not when communicating social information (i.e., when speaking). Furthermore, higher extraversion levels consistently led to a shorter amount of time gazing towards the eyes, whereas higher anxiety levels led to slight modulations of gaze only when participants were listening to questions. Face-to-face conversation with virtual agents can provide a more naturalistic framework for the assessment of online socio-emotional interaction in older adults, which is not easily observable in classical offline paradigms. This study provides novel and important insights into the specific circumstances in which older adults may experience difficulties in social interactions.
Affiliation(s)
- Katarina Pavic
- Institut de psychologie, Université de Paris, Boulogne-Billancourt, France; Université de Paris, VAC, Boulogne-Billancourt, France
- Ali Oker
- Laboratoire Cognition Santé Société (EA 6291), Université de Reims Champagne-Ardenne, Reims, France
- Mohamed Chetouani
- Institut des systèmes intelligents et de robotique (ISIR), Sorbonne Université, CNRS UMR7222, Paris, France
- Laurence Chaby
- Institut de psychologie, Université de Paris, Boulogne-Billancourt, France; Institut des systèmes intelligents et de robotique (ISIR), Sorbonne Université, CNRS UMR7222, Paris, France
|
47
|
Barzy M, Ferguson HJ, Williams DM. Perspective influences eye movements during real-life conversation: Mentalising about self versus others in autism. AUTISM : THE INTERNATIONAL JOURNAL OF RESEARCH AND PRACTICE 2020; 24:2153-2165. [PMID: 32643399 PMCID: PMC7539613 DOI: 10.1177/1362361320936820] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/19/2022]
Abstract
LAY ABSTRACT Previous lab-based studies suggest that autistic individuals are less attentive to social aspects of their environment. In our study, we recorded the eye movements of autistic and typically developing adults while they engaged in a real-life social interaction with a partner. Results showed that autistic adults were less likely than typically developing adults to look at the experimenter's face, and instead were more likely to look at the background. Moreover, the perspective that was adopted in the conversation (talking about self versus others) modulated the patterns of eye movements in autistic and non-autistic adults. Overall, people spent less time looking at their conversation partner's eyes and face and more time looking at the background when talking about an unfamiliar other compared to when talking about themselves. This pattern was magnified among autistic adults. We conclude that allocating attention to social information during conversation is cognitively effortful, but that this effort can be mitigated when people talk about a topic that is familiar to them.
|
48
|
Krol KM, Grossmann T. Impression Formation in the Human Infant Brain. Cereb Cortex Commun 2020; 1:tgaa070. [PMID: 33134930 PMCID: PMC7592636 DOI: 10.1093/texcom/tgaa070] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/06/2020] [Revised: 08/26/2020] [Accepted: 09/19/2020] [Indexed: 01/12/2023] Open
Abstract
Forming an impression of another person is an essential aspect of human social cognition linked to medial prefrontal cortex (mPFC) function in adults. The current study examined the neurodevelopmental origins of impression formation by testing the hypothesis that infants rely on processes localized in mPFC when forming impressions about individuals who appear friendly or threatening. Infants’ brain responses were measured using functional near-infrared spectroscopy while watching 4 different face identities displaying either smiles or frowns directed toward or away from them (N = 77). This was followed by a looking preference test for these face identities (now displaying a neutral expression) using eyetracking. Our results show that infants’ mPFC responses distinguish between smiling and frowning faces when directed at them and that these responses predicted their subsequent person preferences. This suggests that the mPFC is involved in impression formation in human infants, attesting to the early ontogenetic emergence of brain systems supporting person perception and adaptive behavior.
Affiliation(s)
- Kathleen M Krol
- Department of Psychology, University of Virginia, Charlottesville, VA 22903, USA
- Tobias Grossmann
- Department of Psychology, University of Virginia, Charlottesville, VA 22903, USA
|
49
|
Catalano LT, Green MF, Wynn JK, Lee J. People with schizophrenia do not show the normal benefits of social versus nonsocial attentional cues. Neuropsychology 2020; 34:620-628. [PMID: 32338943 PMCID: PMC8513804 DOI: 10.1037/neu0000642] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022] Open
Abstract
OBJECTIVE Schizophrenia is associated with impairments in social motivation. Social attention has been proposed as an underlying mechanism for social motivation. However, studies in schizophrenia have rarely examined social attention, and none of these studies examined the effects with rapidly presented stimuli. METHOD The current study examined whether individuals with schizophrenia have reduced social attention and whether reduced social attention was related to social motivation deficits (measured with the Clinical Assessment Interview for Negative Symptoms) and decreased social functioning (Role Functioning Scale). Thirty-seven outpatients with schizophrenia and 29 healthy participants completed a gaze cueing task with directional social cues (eye gaze) and nonsocial cues (arrows) at varying stimulus onset asynchronies. RESULTS As predicted, schizophrenia participants had reduced social attention relative to nonsocial attention, compared with healthy participants. Healthy participants were quicker to respond to social cues than nonsocial cues, but schizophrenia participants did not exhibit this same pattern. Schizophrenia participants showed higher accuracy when targets appeared in the same location as a directional cue (i.e., congruency) for nonsocial, but not social, cues. Contrary to expectations, reduced social attention was not significantly correlated with clinically rated social motivation deficits or decreased social functioning in the schizophrenia group. CONCLUSION These findings provide evidence for social attention deficits in schizophrenia, but without a clear mapping of its influence on social motivation. (PsycInfo Database Record (c) 2020 APA, all rights reserved).
Affiliation(s)
- Lauren T Catalano
- Desert Pacific Mental Illness Research, Education, and Clinical Center
- Michael F Green
- Desert Pacific Mental Illness Research, Education, and Clinical Center
- Jonathan K Wynn
- Desert Pacific Mental Illness Research, Education, and Clinical Center
- Junghee Lee
- Desert Pacific Mental Illness Research, Education, and Clinical Center
|
50
|
Cañigueral R, Ward JA, Hamilton AFDC. Effects of being watched on eye gaze and facial displays of typical and autistic individuals during conversation. AUTISM : THE INTERNATIONAL JOURNAL OF RESEARCH AND PRACTICE 2020; 25:210-226. [PMID: 32854524 PMCID: PMC7812513 DOI: 10.1177/1362361320951691] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/30/2022]
Abstract
Communication with others relies on coordinated exchanges of social signals, such as eye gaze and facial displays. However, this can only happen when partners are able to see each other. Although previous studies report that autistic individuals have difficulties in planning eye gaze and making facial displays during conversation, evidence from real-life dyadic tasks is scarce and mixed. Across two studies, we investigate how the eye gaze and facial displays of typical and high-functioning autistic individuals are modulated by the belief in being seen and the potential to show true gaze direction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate under three social contexts: pre-recorded video, video-call and face-to-face. Typical participants gazed less to the confederate and produced more facial displays when they were being watched and when they were speaking. Contrary to our hypotheses, eye gaze and facial motion patterns in autistic participants were overall similar to the typical group. This suggests that high-functioning autistic participants are able to use eye gaze and facial displays as social signals. Future studies will need to investigate to what extent this reflects spontaneous behaviour or the use of compensation strategies.
Affiliation(s)
- Jamie A Ward
- University College London, UK; Goldsmiths, University of London, UK
|