1
Domenici V, Collignon O, Lettieri G. Affect in the dark: Navigating the complex landscape of social cognition in blindness. Prog Brain Res 2025; 292:175-202. [PMID: 40409920] [DOI: 10.1016/bs.pbr.2025.02.002]
Abstract
Research on the consequences of blindness has primarily focused on how visual experience influences basic sensory abilities, largely overlooking the intricate world of social cognition. However, social cognition abilities are crucial as they enable individuals to navigate complex interactions, understand others' perspectives, regulate emotions, and establish meaningful connections, all essential for successful adaptation and integration into society. Emotional and social signals are frequently conveyed through nonverbal visual cues, so understanding the foundational role vision plays in shaping everyday affective experiences is fundamental. Here, we aim to summarize existing research on social cognition in individuals with blindness. In doing so, we strive to offer a comprehensive overview of social processing under sensory deprivation while pinpointing areas that remain largely unexplored. By identifying gaps in current knowledge, this review paves the way for future investigations into how visual experience shapes the development of emotional and social cognition in the mind and the brain.
Affiliation(s)
- Veronica Domenici
- Affective Physiology and Interoception Group (API), MoMiLab, IMT School for Advanced Studies Lucca, Lucca, Italy; Social and Affective Neuroscience Group (SANe), MoMiLab, IMT School for Advanced Studies Lucca, Lucca, Italy; University of Camerino, Camerino, Italy
- Olivier Collignon
- Crossmodal Perception and Plasticity Laboratory, Institute of Research in Psychology (IPSY) and Institute of Neuroscience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium; School of Health Sciences, HES-SO Valais-Wallis, The Sense Innovation and Research Center, Lausanne, Switzerland
- Giada Lettieri
- Affective Physiology and Interoception Group (API), MoMiLab, IMT School for Advanced Studies Lucca, Lucca, Italy; Crossmodal Perception and Plasticity Laboratory, Institute of Research in Psychology (IPSY) and Institute of Neuroscience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium.
2
Ben-David BM, Chebat DR, Icht M. "Love looks not with the eyes": supranormal processing of emotional speech in individuals with late-blindness versus preserved processing in individuals with congenital-blindness. Cogn Emot 2024; 38:1354-1367. [PMID: 38785380] [DOI: 10.1080/02699931.2024.2357656]
Abstract
Processing of emotional speech in the absence of visual information relies on two auditory channels: semantics and prosody. No study to date has investigated how blindness impacts this process. Two theories, Perceptual Deficit and Sensory Compensation, yield different expectations about the role of visual experience (or the lack thereof) in processing emotional speech. To test the effect of vision and early visual experience on the processing of emotional speech, we compared individuals with congenital blindness (CB, n = 17), individuals with late blindness (LB, n = 15), and sighted controls (SC, n = 21) on identification and selective attention of semantic and prosodic spoken emotions. Results showed that individuals with blindness performed at least as well as SC, supporting Sensory Compensation and the role of cortical reorganisation. Individuals with LB outperformed individuals with CB, in accordance with Perceptual Deficit, supporting the role of early visual experience. The LB advantage was moderated by executive functions (working memory); namely, the advantage was erased for individuals with CB who showed higher levels of executive functions. Results suggest that vision is not necessary for the processing of emotional speech, but early visual experience could improve it. The findings support a combination of the two aforementioned theories and reject a dichotomous view of the deficiencies/enhancements of blindness.
Affiliation(s)
- Boaz M Ben-David
- Communication, Aging, and Neuropsychology Lab (CANlab), Baruch Ivcher School of Psychology, Reichman University (IDC), Herzliya, Israel
- Department of Speech-Language Pathology, University of Toronto, Toronto, Canada
- KITE, Toronto Rehabilitation Institute, University Health Networks (UHN), Toronto, Canada
- Daniel-Robert Chebat
- Visual and Cognitive Neuroscience Laboratory (VCN Lab), The Department of Psychology, Ariel University, Ariel, Israel
- Navigation and Accessibility Research Center (NARCA), Ariel University, Ariel, Israel
- Michal Icht
- Department of Communication Disorders, Ariel University, Ariel, Israel
3
Pisanski K, Reby D, Oleszkiewicz A. Humans need auditory experience to produce typical volitional nonverbal vocalizations. Commun Psychol 2024; 2:65. [PMID: 39242947] [PMCID: PMC11332021] [DOI: 10.1038/s44271-024-00104-6]
Abstract
Human nonverbal vocalizations such as screams and cries often reflect their evolved functions. Although the universality of these putatively primordial vocal signals and their phylogenetic roots in animal calls suggest a strong reflexive foundation, many of the emotional vocalizations that we humans produce are under our voluntary control. This suggests that, like speech, volitional vocalizations may require auditory input to develop typically. Here, we acoustically analyzed hundreds of volitional vocalizations produced by profoundly deaf adults and typically hearing controls. We show that deaf adults produce unconventional and homogeneous vocalizations of aggression and pain that are unusually high-pitched, unarticulated, and with extremely few harsh-sounding nonlinear phenomena compared to controls. In contrast, fear vocalizations of deaf adults are relatively acoustically typical. In four lab experiments involving a range of perception tasks with 444 participants, listeners were less accurate in identifying the intended emotions of vocalizations produced by deaf vocalizers than by controls, perceived their vocalizations as less authentic, and reliably detected deafness. Vocalizations of congenitally deaf adults with zero auditory experience were most atypical, suggesting additive effects of auditory deprivation. Vocal learning in humans may thus be required not only for speech, but also to acquire the full repertoire of volitional non-linguistic vocalizations.
Affiliation(s)
- Katarzyna Pisanski
- ENES Bioacoustics Research Laboratory, CRNL Center for Research in Neuroscience in Lyon, University of Saint-Étienne, 42023, Saint-Étienne, France.
- CNRS French National Centre for Scientific Research, DDL Dynamics of Language Lab, University of Lyon 2, 69007, Lyon, France.
- Institute of Psychology, University of Wrocław, 50-527, Wrocław, Poland.
- David Reby
- ENES Bioacoustics Research Laboratory, CRNL Center for Research in Neuroscience in Lyon, University of Saint-Étienne, 42023, Saint-Étienne, France
- Institut Universitaire de France, Paris, France
- Anna Oleszkiewicz
- Institute of Psychology, University of Wrocław, 50-527, Wrocław, Poland.
- Department of Otorhinolaryngology, Smell and Taste Clinic, Carl Gustav Carus Medical School, Technische Universitaet Dresden, 01307, Dresden, Germany.
4
Chennaz L, Mascle C, Baltenneck N, Baudouin JY, Picard D, Gentaz E, Valente D. Recognition of facial expressions of emotions in tactile drawings by blind children, children with low vision and sighted children. Acta Psychol (Amst) 2024; 247:104330. [PMID: 38852319] [DOI: 10.1016/j.actpsy.2024.104330]
Abstract
In the context of blindness, studies on the recognition of facial expressions of emotions by touch are essential to characterize compensatory tactile abilities and to create adapted tools for learning about emotions. This study is the first to examine the effect of visual experience on the recognition of tactile drawings of facial expressions of emotions by children with different visual experiences. To this end, we compared recognition rates for tactile drawings of emotions between blind children, children with low vision, and sighted children aged 6-12 years. Results revealed no effect of visual experience on recognition rates. However, an effect of emotion and an interaction between emotion and visual experience were found. Indeed, while all children had a low average recognition rate, the drawings of fear, anger, and disgust were particularly poorly recognized. Moreover, sighted children were significantly better at recognizing the drawings of surprise and sadness than the blind children, who showed high recognition rates only for joy. The results of this study support the importance of developing emotion tools that can be understood by children with different visual experiences.
Affiliation(s)
- Lola Chennaz
- Laboratory of Sensory-motor Affective and Social Development (SMAS), Faculty of Psychology and Educational Sciences (FAPSE), University of Geneva, Switzerland.
- Carolane Mascle
- Inter-university Laboratory for Education and Communication Sciences (LISEC), University of Strasbourg, France.
- Nicolas Baltenneck
- Laboratory of Development, Individual, Process, Disability, Education (UR DIPHE), University Lumière Lyon 2, France.
- Jean-Yves Baudouin
- Laboratory of Development, Individual, Process, Disability, Education (UR DIPHE), University Lumière Lyon 2, France.
- Edouard Gentaz
- Laboratory of Sensory-motor Affective and Social Development (SMAS), Faculty of Psychology and Educational Sciences (FAPSE), University of Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Switzerland.
- Dannyelle Valente
- Laboratory of Sensory-motor Affective and Social Development (SMAS), Faculty of Psychology and Educational Sciences (FAPSE), University of Geneva, Switzerland; Laboratory of Development, Individual, Process, Disability, Education (UR DIPHE), University Lumière Lyon 2, France; Swiss Center for Affective Sciences, University of Geneva, Switzerland.
5
Haase CM. Emotion Regulation in Couples Across Adulthood. Annu Rev Dev Psychol 2023; 5:399-421. [PMID: 38939362] [PMCID: PMC11210602] [DOI: 10.1146/annurev-devpsych-120621-043836]
Abstract
Intimate relationships are hotbeds of emotion. This article presents key findings and current directions in research on couples' emotion regulation across adulthood as a critical context in which older adults not only maintain functioning but may also outshine younger adults. First, I introduce key concepts, defining qualities (i.e., dynamic, coregulatory, bidirectional, bivalent), and measures (i.e., self-report versus performance-based) of couples' emotion regulation. Second, I highlight a socioemotional turn in our understanding of adult development with the advent of socioemotional selectivity theory. Third, I offer a life-span developmental perspective on emotion regulation in couples (i.e., across infancy, adolescence and young adulthood, midlife, and late life). Finally, I present the idea that emotion regulation may shift from "me to us" across adulthood and discuss how emotion regulation in couples may become more important, better, and increasingly consequential (e.g., for relationship outcomes, well-being, and health) with age. Ideas for future research are then discussed.
Affiliation(s)
- Claudia M Haase
- School of Education and Social Policy and (by courtesy) Department of Psychology, Northwestern University, Evanston, Illinois, USA
6
Giraud M, Marelli M, Nava E. Embodied language of emotions: Predicting human intuitions with linguistic distributions in blind and sighted individuals. Heliyon 2023; 9:e17864. [PMID: 37539291] [PMCID: PMC10395297] [DOI: 10.1016/j.heliyon.2023.e17864]
Abstract
Recent constructionist theories have suggested that language and sensory experience play a crucial role not only in how individuals categorise emotions but also in how they experience and shape them, helping to acquire abstract concepts that are used to make sense of bodily perceptions associated with specific emotions. Here, we aimed to investigate the role of sensory experience in conceptualising bodily felt emotions by asking 126 Italian blind participants to freely recall in which part of the body they commonly feel specific emotions (N = 15). Participants varied concerning visual experience in terms of blindness onset (i.e., congenital vs late) and degree of visual experience (i.e., total vs partial sensory loss). Using an Italian semantic model to estimate to what extent discrete emotions are associated with body parts in language experience, we found that all participants' reports correlated with the model predictions. Interestingly, blind - and especially congenitally blind - participants' responses were more strongly correlated with the model, suggesting that language might be one of the possible compensative mechanisms for the lack of visual feedback in constructing bodily felt emotions. Our findings present theoretical implications for the study of emotions, as well as potential real-world applications for blind individuals, by revealing, on the one hand, that vision plays an essential role in the construction of felt emotions and the way we talk about our related bodily (emotional) experiences. On the other hand, evidence that blind individuals rely more strongly on linguistic cues suggests that vision is a strong cue to acquire emotional information from the surrounding world, influencing how we experience emotions. 
While our findings do not suggest that blind individuals experience emotions in an atypical or dysfunctional way, they nonetheless support the view that promoting the use of non-visual emotional signs and body language from early on might help blind children develop good emotional awareness as well as good emotion regulation abilities.
7
Determination of “Neutral”–“Pain”, “Neutral”–“Pleasure”, and “Pleasure”–“Pain” Affective State Distances by Using AI Image Analysis of Facial Expressions. Technologies 2022. [DOI: 10.3390/technologies10040075]
Abstract
(1) Background: In addition to verbalizations, facial expressions advertise one’s affective state. There is an ongoing debate concerning the communicative value of the facial expressions of pain and of pleasure, and to what extent humans can distinguish between these. We introduce a novel method of analysis by replacing human ratings with outputs from image analysis software. (2) Methods: We use image analysis software to extract feature vectors of the facial expressions neutral, pain, and pleasure displayed by 20 actresses. We dimension-reduced these feature vectors, used singular value decomposition to eliminate noise, and then used hierarchical agglomerative clustering to detect patterns. (3) Results: The vector norms for pain–pleasure were rarely less than the distances pain–neutral and pleasure–neutral. The pain–pleasure distances were Weibull-distributed and noise contributed 10% to the signal. The noise-free distances clustered in four clusters and two isolates. (4) Conclusions: AI methods of image recognition are superior to human abilities in distinguishing between facial expressions of pain and pleasure. Statistical methods and hierarchical clustering offer possible explanations as to why humans fail. The reliability of commercial software, which attempts to identify facial expressions of affective states, can be improved by using the results of our analyses.
8
Kim HN. The frequency of facial muscles engaged in expressing emotions in people with visual disabilities via cloud-based video communication. Theor Issues Ergon Sci 2022. [DOI: 10.1080/1463922x.2022.2081374]
Affiliation(s)
- Hyung Nam Kim
- Department of Industrial and Systems Engineering, North Carolina A&T State University, Greensboro, NC, USA
9
Chennaz L, Valente D, Baltenneck N, Baudouin JY, Gentaz E. Emotion regulation in blind and visually impaired children aged 3 to 12 years assessed by a parental questionnaire. Acta Psychol (Amst) 2022; 225:103553. [PMID: 35279432] [DOI: 10.1016/j.actpsy.2022.103553]
Abstract
Emotion regulation develops from the earliest years of a child's life and mostly through visual information. Considering the importance of emotion regulation in daily life, it is important to study the effect of visual experience on the development of this ability. This study is the first to examine the effects of visual experience and age on emotion regulation by comparing groups of children with different visual status and age. For this purpose, after testing the reliability and consistency of the French version of the Emotion Regulation Checklist (ERC-vf) with 245 parents of blind, visually impaired and sighted children aged 3-5, 6-8 or 9-12 years, we analysed the effects of visual status and age on emotion regulation (ER) composite scores. The first result confirmed that the ERC-vf can be reliably used with populations of blind and visually impaired children. The second result revealed an effect of visual status on ER composite scores: blind children and visually impaired children each had significantly lower composite scores than sighted children. Moreover, neither the effect of age nor the interaction between age and visual status on ER composite scores was significant. The ER subscale results suggest, however, that age may have a variable effect for blind and visually impaired children, as with age blind children's scores become lower while those of visually impaired children become equal to sighted children's. The results of our study may help children's entourages better adapt their interactions in a context of visual impairment.
10
Kim HN, Sutharson SJ. Individual differences in spontaneous facial expressions in people with visual impairment and blindness. Br J Vis Impair 2022. [DOI: 10.1177/02646196211070927]
Abstract
People convey their spontaneous and voluntary emotions via facial expressions, which play a critical role in social interactions. However, less is known about the mechanisms of spontaneous emotion expression, especially in adults with visual impairment and blindness. Nineteen adults with visual impairment and blindness participated in interviews in which spontaneous facial expressions were observed and analyzed via the Facial Action Coding System (FACS). We found a set of Action Units primarily engaged in expressing spontaneous emotions, which were likely to be affected by participants’ different characteristics. The results of this study could serve as evidence that adults with visual impairment and blindness show individual differences in spontaneous facial expressions of emotions.
11
Pregnancy, Motherhood and Partner Support in Visually Impaired Women: A Qualitative Study. Int J Environ Res Public Health 2022; 19:4308. [PMID: 35409989] [PMCID: PMC8998677] [DOI: 10.3390/ijerph19074308]
Abstract
Background: This qualitative study aimed to explore the experiences of women with vision impairments regarding the meaning of motherhood and their mothering-related issues and priorities. Methods: In-depth individual, semi-structured interviews were conducted between July and December 2020 with a group of visually impaired mothers residing in Italy. The interviews explored experiences related to pregnancy, childbirth, and motherhood; support received from partners, family, and friends; ways of interacting and communicating with the child; and the participants’ sense of personal self-efficacy and self-awareness. Results: Fifteen women participated in this study, ten with a congenital visual impairment and five with an acquired disability. The mean age of the sample was 49 years. The qualitative content analysis of the transcripts of the interviews pointed out four main themes or categories: (1) pregnancy and motherhood experiences, (2) family and social support, (3) relationship and communication with the child, and (4) self-efficacy and self-awareness. Conclusions: This study underlined that mothers with visual impairments show a strong desire to be recognized and accepted as women and mothers by their social environment. Adequate social and family support is associated with a better sense of personal self-efficacy and greater confidence in one’s skills as a mother.
12
Saurav S, Saini AK, Saini R, Singh S. Deep learning inspired intelligent embedded system for haptic rendering of facial emotions to the blind. Neural Comput Appl 2022. [DOI: 10.1007/s00521-021-06613-3]
13
Abstract
Imitation is one of the core building blocks of human social cognition, supporting capacities as diverse as empathy, social learning, and knowledge acquisition [1]. Newborns' ability to match others' motor acts, while quite limited initially, drastically improves during the first months of development [2]. Of notable importance to human sociality is our tendency to rapidly mimic facial expressions of emotion. Facial mimicry develops around six months of age [3], but because of its late emergence, the factors supporting its development are relatively unknown. One possibility is that the development of facial mimicry depends on seeing emotional imitative behavior in others [4]. Alternatively, the drive to imitate facial expressions of emotion may be independent of visual learning and be supported by modality-general processes. Here we report evidence for the latter, by showing that congenitally blind participants facially imitate smiles heard in speech, despite having never seen a facial expression.
Affiliation(s)
- Pablo Arias
- STMS Lab (IRCAM/CNRS/Sorbonne Université), 1 Place Igor Stravinsky, 75004 Paris, France; Lund University Cognitive Science, Lund University, Box 192, 221 00 Lund, Sweden.
- Caren Bellmann
- Institut National des Jeunes Aveugles, 56 Bd des Invalides, 75007 Paris, France
- Jean-Julien Aucouturier
- STMS Lab (IRCAM/CNRS/Sorbonne Université), 1 Place Igor Stravinsky, 75004 Paris, France; FEMTO-ST Institute (CNRS/Université Bourgogne Franche Comté), 15B Av. des Montboucons, 25000 Besançon, France
14
Abstract
Few questions in science are as controversial as human nature. At stake is whether our basic concepts and emotions are all learned from experience, or whether some are innate. Here, I demonstrate that reasoning about innateness is biased by the basic workings of the human mind. Psychological science suggests that newborns possess core concepts of "object" and "number." Laypeople, however, believe that newborns are devoid of such notions but that they can recognize emotions. Moreover, people presume that concepts are learned, whereas emotions (along with sensations and actions) are innate. I trace these beliefs to two tacit psychological principles: intuitive dualism and essentialism. Essentialism guides tacit reasoning about biological inheritance and suggests that innate traits reside in the body; per intuitive dualism, however, the mind seems ethereal, distinct from the body. It thus follows that, in our intuitive psychology, concepts (which people falsely consider as disembodied) must be learned, whereas emotions, sensations, and actions (which are considered embodied) are likely innate; these predictions are in line with the experimental results. These conclusions do not speak to the question of whether concepts and emotions are innate, but they suggest caution in its scientific evaluation.
Affiliation(s)
- Iris Berent
- Department of Psychology, Northeastern University, Boston, MA 02115
15
Chen X, Liu Z, Lu MH, Yao X. The recognition of emotional prosody in students with blindness: Effects of early visual experience and age development. Br J Dev Psychol 2021; 40:112-129. [PMID: 34467548] [DOI: 10.1111/bjdp.12394]
Abstract
This study examined the role of early visual experience and age in the recognition of emotional prosody among students with visual impairments in China. A total of 75 primary and junior high school students participated in the study. The ability of participants to recognize the prosody of four basic emotions (sadness, anger, happiness, and neutrality) was explored. The findings were as follows. (1) Early visual experience had a significant effect on the recognition of emotional prosody. The accuracy rate of students with congenital blindness was lower than that of students with adventitious blindness, and the performance of students with congenital blindness was lower than that of sighted students. The students with congenital blindness exhibited the slowest recognition speeds. (2) Age had a significant effect on the emotional prosody recognition accuracy of the sighted students, but it had no effect on the students with blindness.
Affiliation(s)
- Xiaomeng Chen
- Special Education Department, School of Education, South China Normal University, Guangzhou, China
- Zehui Liu
- Yunxiang School of Baiyun District, Guangzhou, China
- Ming-Hui Lu
- Special Education Department, School of Education, Guangzhou University, China
- Xiaoxue Yao
- Special Education Department, School of Education, South China Normal University, Guangzhou, China
16
Facial expressions can be categorized along the upper-lower facial axis, from a perceptual perspective. Atten Percept Psychophys 2021; 83:2159-2173. [PMID: 33759116] [DOI: 10.3758/s13414-021-02281-6]
Abstract
A critical question, fundamental for building models of emotion, is how to categorize emotions. Previous studies have typically taken one of two approaches: (a) they focused on the pre-perceptual visual cues, how salient facial features or configurations were displayed; or (b) they focused on the post-perceptual affective experiences, how emotions affected behavior. In this study, we attempted to group emotions at a peri-perceptual processing level: it is well known that humans perceive different facial expressions differently; therefore, can we classify facial expressions into distinct categories in terms of their perceptual similarities? Here, using a novel non-lexical paradigm, we assessed the perceptual dissimilarities between 20 facial expressions using reaction times. Multidimensional-scaling analysis revealed that facial expressions were organized predominantly along the upper-lower face axis. Cluster analysis of behavioral data delineated three superordinate categories, and eye-tracking measurements validated these clustering results. Interestingly, these superordinate categories can be conceptualized according to how facial displays interact with acoustic communications: One group comprises expressions that have salient mouth features. They likely link to species-specific vocalization, for example, crying, laughing. The second group comprises visual displays with diagnostic features in both the mouth and the eye regions. They are not directly articulable but can be expressed prosodically, for example, sad, angry. Expressions in the third group are also whole-face expressions but are completely independent of vocalization, likely being blends of two or more elementary expressions. We propose a theoretical framework to interpret the tripartite division in which distinct expression subsets are interpreted as successive phases in an evolutionary chain.
17
Arioli M, Ricciardi E, Cattaneo Z. Social cognition in the blind brain: A coordinate-based meta-analysis. Hum Brain Mapp 2020; 42:1243-1256. [PMID: 33320395] [PMCID: PMC7927293] [DOI: 10.1002/hbm.25289]
Abstract
Social cognition skills are typically acquired on the basis of visual information (e.g., the observation of gaze, facial expressions, gestures). In light of this, a critical issue is whether and how the lack of visual experience affects the neurocognitive mechanisms underlying social skills. This issue has been largely neglected in the literature on blindness, even though difficulties in social interactions may be particularly salient in the lives of blind individuals (especially children). Here we provide a meta-analysis of neuroimaging studies reporting brain activations associated with the representation of the self and others in early blind individuals and in sighted controls. Our results indicate that early blindness does not critically impact the development of the "social brain," with social tasks performed on the basis of auditory or tactile information driving consistent activations in nodes of the action observation network, which is typically active during the actual observation of others in sighted individuals. Interestingly, though, activations along this network appeared more left-lateralized in blind than in sighted participants. These results may have important implications for the development of specific training programs to improve social skills in blind children and young adults.
Affiliation(s)
- Maria Arioli
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Zaira Cattaneo
- Department of Psychology, University of Milano-Bicocca, Milan, Italy.,IRCCS Mondino Foundation, Pavia, Italy
18
Malsert J, Tran K, Tran TAT, Ha-Vinh T, Gentaz E, Leuchter RHV. Cross-Cultural and Environmental Influences on Facial Emotional Discrimination Sensitivity in 9-Year-Old Children from Swiss and Vietnamese Schools. SWISS JOURNAL OF PSYCHOLOGY 2020. [DOI: 10.1024/1421-0185/a000240] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
Abstract
The Other Race Effect (ORE), i.e., recognition facilitation for own-race faces, is a well-established phenomenon with broad evidence in adults and infants. Nevertheless, the ORE in older children is poorly understood, and even less so for emotional face processing. This research sampled 87 nine-year-old children from Vietnamese and Swiss schools. In two separate studies, we evaluated the children’s ability to perceive the disappearance of emotions in Asian and Caucasian faces in an offset task. The first study evaluated an “emotional ORE” in Vietnamese-Asian, Swiss-Caucasian, and Swiss-Multicultural children. Offset times showed an emotional ORE in Vietnamese-Asian children living in an ethnically homogeneous environment, whereas the ethnically mixed environment of the Swiss children seems to have balanced their performance across face types. The second study compared socioemotionally trained and untrained Vietnamese-Asian children. Vietnamese children showed a strong emotional ORE and tended to increase their sensitivity to emotion offset after training. Moreover, an effect of emotion consistent with previous observations in adults could suggest a cultural sensitivity to signs of disapproval. Taken together, the results suggest that 9-year-old children can present an emotional ORE, but that a heterogeneous environment or emotional training can strengthen face-processing abilities without reducing skills for their own group.
Affiliation(s)
- Jennifer Malsert
- SensoriMotor, Affective, and Social Development Lab, University of Geneva, Geneva, Switzerland
- Swiss Center for Affective Sciences, Campus Biotech, Geneva, Switzerland
- Khanh Tran
- Eurasia Foundation and Association for Special Education in Vietnam, Ho Chi Minh City, Vietnam
- Tu Anh Thi Tran
- University of Education, Hue University, Thua Thien Hue, Vietnam
- Tho Ha-Vinh
- Eurasia Foundation and Association for Special Education in Vietnam, Ho Chi Minh City, Vietnam
- Edouard Gentaz
- SensoriMotor, Affective, and Social Development Lab, University of Geneva, Geneva, Switzerland
- Swiss Center for Affective Sciences, Campus Biotech, Geneva, Switzerland
- Russia Ha-Vinh Leuchter
- Division of Development and Growth, Department of Pediatrics, University of Geneva, Geneva, Switzerland
19
Fazzi E, Micheletti S, Galli J, Rossi A, Gitti F, Molinaro A. Autism in Children With Cerebral and Peripheral Visual Impairment: Fact or Artifact? Semin Pediatr Neurol 2019; 31:57-67. [PMID: 31548026 DOI: 10.1016/j.spen.2019.05.008] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Abstract
The aim of this study was to evaluate the occurrence and clinical characteristics of autism spectrum disorder in visually impaired children. In total, 273 participants, 214 with cerebral causes of vision impairment and 59 with peripheral causes, were assessed using multiple assessment methods adapted for individuals with vision loss. We found that autism spectrum disorder was more prevalent in the visually impaired children than in the general population, and that its prevalence varied according to the type of visual disorder (2.8% for cerebral and 8.4% for peripheral visual impairment). In subjects with cerebral visual impairment, the presence of autistic symptoms was consistent with the diagnosis of autism spectrum disorder. In children with peripheral visual impairment, certain symptoms related to visual loss overlapped with the clinical features of autism spectrum disorder, making clinical diagnosis more challenging. The development of assessment tools that take into account the type and level of visual impairment, and validation testing in a larger population sample, are needed to confirm these initial findings regarding the diagnosis of autism spectrum disorder in visually impaired children.
Affiliation(s)
- Elisa Fazzi
- Unit of Child Neurology and Psychiatry, ASST Spedali Civili, Brescia, Italy; Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy.
- Serena Micheletti
- Unit of Child Neurology and Psychiatry, ASST Spedali Civili, Brescia, Italy
- Jessica Galli
- Unit of Child Neurology and Psychiatry, ASST Spedali Civili, Brescia, Italy; Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy
- Andrea Rossi
- Unit of Child Neurology and Psychiatry, ASST Spedali Civili, Brescia, Italy
- Filippo Gitti
- Unit of Child Neurology and Psychiatry, ASST Spedali Civili, Brescia, Italy
- Anna Molinaro
- Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy
20
Barrett LF, Adolphs R, Marsella S, Martinez A, Pollak SD. Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements. Psychol Sci Public Interest 2019; 20:1-68. [PMID: 31313636 PMCID: PMC6640856 DOI: 10.1177/1529100619832930] [Citation(s) in RCA: 459] [Impact Index Per Article: 76.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/13/2022]
Abstract
It is commonly assumed that a person's emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.
Affiliation(s)
- Lisa Feldman Barrett
- Northeastern University, Department of Psychology, Boston, MA
- Massachusetts General Hospital, Department of Psychiatry and the Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, MA
- Harvard Medical School, Department of Psychiatry, Boston MA
- Ralph Adolphs
- California Institute of Technology, Departments of Psychology, Neuroscience, and Biology,Pasadena, CA
- Stacy Marsella
- Northeastern University, Department of Psychology, Boston, MA
- Northeastern University, College of Computer and Information Science, Boston, MA
- University of Glasgow, Glasgow, Scotland
- Aleix Martinez
- The Ohio State University, Department of Electrical and Computer Engineering, and Center for Cognitive and Brain Sciences, Columbus, OH
- Seth D. Pollak
- University of Wisconsin - Madison, Department of Psychology, Madison, WI
21
Martins AT, Faísca L, Vieira H, Gonçalves G. Emotional Recognition and Empathy both in Deaf and Blind Adults. JOURNAL OF DEAF STUDIES AND DEAF EDUCATION 2019; 24:119-127. [PMID: 30668877 DOI: 10.1093/deafed/eny046] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/18/2018] [Revised: 12/05/2018] [Accepted: 12/10/2018] [Indexed: 06/09/2023]
Abstract
Studies addressing the recognition of emotions in blind or deaf participants have been carried out only with children and adolescents. Because of these age limits, such studies do not clarify the long-term effects of vision and hearing disabilities on emotion recognition in adults. We assessed the ability to recognize basic emotions in 15 deaf adults (aged 32.4 ± 8.1 years) and 15 blind adults (aged 48.3 ± 10.5 years). Auditory and visual stimuli expressing six basic emotional states were presented to participants (Florida Affect Battery). Participants also performed an empathy test. Deaf participants showed difficulties in emotion recognition tasks compared to typically hearing participants; however, differences were statistically reliable only for the Facial Emotion Discrimination and Naming tasks (specifically, naming expressions of fear). Deaf participants also revealed lower levels of cognitive empathy. Blind participants' performance was lower than the controls' only when the task required evaluating emotional prosody while ignoring the semantic content of the sentence. Overall, although deaf and blind participants performed reasonably well on tasks requiring recognition of basic emotions, sensory loss may hinder their social perception skills when processing subtle emotions or when simultaneous prosodic and semantic information must be extracted.
Affiliation(s)
- Ana Teresa Martins
- University of Algarve, Centre for Biomedical Research (CBMR) and Centre for Spatial and Organizational Dynamics (CIEO)
- Luís Faísca
- University of Algarve, Centre for Biomedical Research (CBMR) and Centre for Spatial and Organizational Dynamics (CIEO)
- Helena Vieira
- University of Algarve, Centre for Biomedical Research (CBMR) and Centre for Spatial and Organizational Dynamics (CIEO)
- Gabriela Gonçalves
- University of Algarve, Centre for Biomedical Research (CBMR) and Centre for Spatial and Organizational Dynamics (CIEO)
22
Kastrup V, Valente D. How to Make the Body Speak? Visual Disability, Verbalism and Embodied Speech. PSICOLOGIA: CIÊNCIA E PROFISSÃO 2018. [DOI: 10.1590/1982-3703000052018] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Verbalism represents a controversial issue in the field of visual disability. It is frequently claimed that blind people use words and expressions that are not based on direct sensory experience, and verbalism is sometimes considered a pathology or something specific to blind people. Taking the work of three blind researchers – Pierre Villey, Joana Belarmino and Bertrand Verine – as a guideline, this paper emphasizes two main points: 1) the use of words with visual references constitutes a strategy of inclusion in a social environment dominated by vision; 2) it is important to develop new affirmative actions that stimulate embodied and multisensory discourse, favoring experiences of belonging and sharing between the blind and the sighted beyond the hegemony of vision.
Affiliation(s)
- Dannyelle Valente
- University of Lumière Lyon 2, France; University of Geneva, Switzerland