1
Wang Y, Cao R, Chakravarthula PN, Yu H, Wang S. Atypical neural encoding of faces in individuals with autism spectrum disorder. Cereb Cortex 2024; 34:172-186. PMID: 38696606; PMCID: PMC11065108; DOI: 10.1093/cercor/bhae060.
Abstract
Individuals with autism spectrum disorder (ASD) experience pervasive difficulties in processing social information from faces. However, the behavioral and neural mechanisms underlying social trait judgments of faces in ASD remain largely unclear. Here, we comprehensively addressed this question by employing functional neuroimaging and parametrically generated faces that vary in facial trustworthiness and dominance. Behaviorally, participants with ASD exhibited reduced specificity but increased inter-rater variability in social trait judgments. Neurally, participants with ASD showed hypo-activation across broad face-processing areas. Multivariate analysis based on trial-by-trial face responses could discriminate participant groups in the majority of the face-processing areas. Encoding social traits in ASD engaged vastly different face-processing areas compared to controls, and encoding different social traits engaged different brain areas. Interestingly, the idiosyncratic brain areas encoding social traits in ASD were still flexible and context-dependent, similar to neurotypicals. Additionally, participants with ASD also showed an altered encoding of facial saliency features in the eyes and mouth. Together, our results provide a comprehensive understanding of the neural mechanisms underlying social trait judgments in ASD.
Affiliations
- Yue Wang: Department of Radiology, Washington University in St. Louis, 4525 Scott Ave, St. Louis, MO 63110, United States
- Runnan Cao: Department of Radiology, Washington University in St. Louis, 4525 Scott Ave, St. Louis, MO 63110, United States
- Puneeth N Chakravarthula: Department of Radiology, Washington University in St. Louis, 4525 Scott Ave, St. Louis, MO 63110, United States
- Hongbo Yu: Department of Psychological & Brain Sciences, University of California Santa Barbara, Santa Barbara, CA 93106, United States
- Shuo Wang: Department of Radiology, Washington University in St. Louis, 4525 Scott Ave, St. Louis, MO 63110, United States
2
Berlijn AM, Hildebrandt LK, Gamer M. Idiosyncratic viewing patterns of social scenes reflect individual preferences. J Vis 2022; 22:10. PMID: 36583910; PMCID: PMC9807181; DOI: 10.1167/jov.22.13.10.
Abstract
In general, humans preferentially look at conspecifics in naturalistic images. However, such group-based effects might conceal systematic individual differences in the preference for social information. Here, we investigated to what degree fixations on social features occur consistently within observers and whether this preference generalizes to other measures of social prioritization, both in the laboratory and in the real world. Participants carried out a free-viewing task and a relevance-taps task that required them to actively select image regions crucial for understanding a given scene, and they were asked to freely take photographs outside the laboratory that were later classified with regard to their social content. We observed stable individual differences in the fixation and active selection of human heads and faces that were correlated across tasks and partly predicted the social content of the self-taken photographs. No such relationship was observed for human bodies, indicating that different social elements need to be dissociated. These findings suggest that idiosyncrasies in the visual exploration and interpretation of social features exist and predict real-world behavior. Future studies should further characterize these preferences and elucidate how they shape the perception and interpretation of social contexts in healthy participants and in patients with mental disorders that affect social functioning.
Affiliations
- Adam M. Berlijn: Department of Experimental Psychology, Heinrich-Heine-University Düsseldorf, Düsseldorf, Germany; Institute of Clinical Neuroscience and Medical Psychology, Medical Faculty, University Hospital Düsseldorf, Heinrich-Heine University Düsseldorf, Düsseldorf, Germany; Institute of Neuroscience and Medicine (INM-1), Research Centre Jülich, Jülich, Germany; Department of Psychology, Julius-Maximilians-University Würzburg, Würzburg, Germany
- Lea K. Hildebrandt: Department of Psychology, Julius-Maximilians-University Würzburg, Würzburg, Germany
- Matthias Gamer: Department of Psychology, Julius-Maximilians-University Würzburg, Würzburg, Germany
3
Ruan M, Webster PJ, Li X, Wang S. Deep Neural Network Reveals the World of Autism From a First-Person Perspective. Autism Res 2020; 14:333-342. PMID: 32869953; DOI: 10.1002/aur.2376.
Abstract
People with autism spectrum disorder (ASD) show atypical attention to social stimuli and aberrant gaze when viewing images of the physical world. However, it is unknown how they perceive the world from a first-person perspective. In this study, we used machine learning to classify photos taken in three different categories (people, indoors, and outdoors) as either having been taken by individuals with ASD or by peers without ASD. Our classifier effectively discriminated photos from all three categories, but was particularly successful at classifying photos of people with >80% accuracy. Importantly, visualization of our model revealed critical features that led to successful discrimination and showed that our model adopted a strategy similar to that of ASD experts. Furthermore, for the first time we showed that photos taken by individuals with ASD contained less salient objects, especially in the central visual field. Notably, our model outperformed classification of these photos by ASD experts. Together, we demonstrate an effective and novel method that is capable of discerning photos taken by individuals with ASD and revealing aberrant visual attention in ASD from a unique first-person perspective. Our method may in turn provide an objective measure for evaluations of individuals with ASD. LAY SUMMARY: People with autism spectrum disorder (ASD) demonstrate atypical visual attention to social stimuli. However, it remains largely unclear how they perceive the world from a first-person perspective. In this study, we employed a deep learning approach to analyze a unique dataset of photos taken by people with and without ASD. Our computer modeling was not only able to discern which photos were taken by individuals with ASD, outperforming ASD experts, but importantly, it revealed critical features that led to successful discrimination, revealing aspects of atypical visual attention in ASD from their first-person perspective.
Affiliations
- Mindi Ruan: Lane Department of Computer Science and Electrical Engineering, West Virginia University, Morgantown, West Virginia, USA
- Paula J Webster: Department of Chemical and Biomedical Engineering and Rockefeller Neuroscience Institute, West Virginia University, Morgantown, West Virginia, USA
- Xin Li: Lane Department of Computer Science and Electrical Engineering, West Virginia University, Morgantown, West Virginia, USA
- Shuo Wang: Department of Chemical and Biomedical Engineering and Rockefeller Neuroscience Institute, West Virginia University, Morgantown, West Virginia, USA
4
Kojovic N, Ben Hadid L, Franchini M, Schaer M. Sensory Processing Issues and Their Association with Social Difficulties in Children with Autism Spectrum Disorders. J Clin Med 2019; 8:E1508. PMID: 31547076; PMCID: PMC6833094; DOI: 10.3390/jcm8101508.
Abstract
Sensory processing issues have been frequently reported in individuals with Autism Spectrum Disorders (ASD), but their relationship with social and overall adaptive functioning has not been extensively characterized to date. Here, we investigate how sensory processing atypicalities relate to deficits in social skills, impaired social cognition, and general adaptive functioning in a group of preschoolers with ASD. Sixty-four children with ASD aged 3 to 6 were included in this study, along with 36 age-matched typically developing (TD) peers. Parent-reported measures of sensory processing, social difficulties, and overall adaptive functioning were collected for all children. We also obtained precise measures of social attention deployment using a custom-designed eye-tracking task depicting naturalistic social scenes. Within the group of children with ASD, higher intensities of sensory issues were associated with more prominent social difficulties and lower adaptive functioning. We also found that children with ASD who had more sensory issues showed visual exploration patterns of social scenes that deviated strongly from those seen in the TD group. The association of sensory processing atypicalities with "higher-order" functional domains such as social and adaptive functioning in children with ASD stresses the importance of further research on sensory symptoms in autism.
Affiliations
- Nada Kojovic: University of Geneva, 1211 Geneva, Switzerland
- Marie Schaer: University of Geneva, 1211 Geneva, Switzerland
5
Kumazaki H, Muramatsu T, Yoshikawa Y, Matsumoto Y, Miyao M, Ishiguro H, Mimura M, Minabe Y, Kikuchi M. How the Realism of Robot Is Needed for Individuals With Autism Spectrum Disorders in an Interview Setting. Front Psychiatry 2019; 10:486. PMID: 31354547; PMCID: PMC6637027; DOI: 10.3389/fpsyt.2019.00486.
Abstract
The preliminary efficacy of interview training using an android robot, whose appearance and movements resemble those of an actual human, for treating social and communication difficulties in individuals with autism spectrum disorders (ASD) has been demonstrated. Patient preferences regarding the appearance of robots are crucial for incentivizing them to undergo robot-assisted therapy. However, very little is known about how the realism of an android robot relates to the motivation of individuals with ASD in an interview setting. In this study, twenty-three individuals with ASD (age, 17-25 years) underwent an interview with both a human interviewer and an android robot. After the interview, the participants were evaluated in terms of their motivation to practice an interview with the android robot and their impression of the android robot's humanness. As expected, participants exhibited higher motivation to undergo interview training with the android robot than with the human interviewer. Higher motivation to undergo an interview with the android robot was negatively correlated with the participants' impressions of the extent to which the android robot exhibited humanness. This study brings us one step closer to understanding how such an android robot should be designed and implemented to provide sufficiently realistic interview training of therapeutic value.
Affiliations
- Hirokazu Kumazaki: Department of Clinical Research on Social Recognition and Memory, Research Center for Child Mental Development, Kanazawa University, Ishikawa, Japan; Department of Preventive Intervention for Psychiatric Disorders, National Institute of Mental Health, National Center of Neurology and Psychiatry, Tokyo, Japan
- Taro Muramatsu: Department of Neuropsychiatry, Keio University School of Medicine, Tokyo, Japan
- Yuichiro Yoshikawa: Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, Toyonaka, Japan; JST ERATO ISHIGURO Symbiotic Human-Robot Interaction, Toyonaka, Japan
- Yoshio Matsumoto: Service Robotics Research Group, Intelligent Systems Institute, National Institute of Advanced Industrial Science and Technology, Ibaraki, Japan
- Masutomo Miyao: Department of Psychosocial Medicine, National Center for Child Health and Development, Tokyo, Japan
- Hiroshi Ishiguro: Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, Toyonaka, Japan; JST ERATO ISHIGURO Symbiotic Human-Robot Interaction, Toyonaka, Japan
- Masaru Mimura: Department of Neuropsychiatry, Keio University School of Medicine, Tokyo, Japan
- Yoshio Minabe: Department of Clinical Research on Social Recognition and Memory, Research Center for Child Mental Development, Kanazawa University, Ishikawa, Japan
- Mitsuru Kikuchi: Department of Clinical Research on Social Recognition and Memory, Research Center for Child Mental Development, Kanazawa University, Ishikawa, Japan
6
Yu H, Duan Y, Zhou X. Guilt in the eyes: Eye movement and physiological evidence for guilt-induced social avoidance. J Exp Soc Psychol 2017. DOI: 10.1016/j.jesp.2017.03.007.