1. Krendl AC. Disentangling the role of executive function and episodic memory in older adults' performance on dynamic theory of mind tasks. Aging Neuropsychol Cogn 2025:1-22. [PMID: 40084971] [DOI: 10.1080/13825585.2025.2476586]
Abstract
Theory of mind is a core social cognitive ability that declines over the lifespan. Prior work examining the mechanisms underlying older adults' theory of mind deficits has yielded heterogeneous results. One reason for this may be a general reliance on static, rather than dynamic, stimuli. Because dynamic measures may best capture everyday theory of mind engagement, the current study examined whether executive function and/or episodic memory - the primary mechanisms examined in prior work - predicted older adults' static and dynamic theory of mind performance. In Study 1, 153 older adults completed traditional static measures of theory of mind (false belief task, Reading the Mind in the Eyes) and a dynamic measure that captured multiple domains of theory of mind (e.g. inferring beliefs, understanding emotions). They also completed comprehensive measures of executive function and episodic memory. Episodic memory, but not executive function, predicted theory of mind performance across tasks. In Study 2, 124 different older adults completed two novel dynamic tasks along with the same cognitive measures as in Study 1. The first dynamic task was similar to that of Study 1 but relatively unfamiliar. In the second task, older adults made continuous (i.e. dynamic) awkwardness ratings while watching a video, a design that reduces ceiling effects, a frequent limitation of theory of mind research. Replicating Study 1, episodic memory, but not executive function, predicted older adults' performance on both tasks. Together, these findings suggest that episodic memory ability predicts older adults' static and dynamic theory of mind performance.
Affiliation(s)
- Anne C Krendl
- Department of Psychological & Brain Sciences, Indiana University, Bloomington, USA
2. Takemoto A, Iwamoto M, Yaegashi H, Yun S, Takashima R. Virtual avatar communication task eliciting pseudo-social isolation and detecting social isolation using non-verbal signal monitoring in older adults. Front Psychol 2025; 16:1507178. [PMID: 40160551] [PMCID: PMC11951265] [DOI: 10.3389/fpsyg.2025.1507178]
Abstract
Social isolation and loneliness are two of the main causes of mental health problems and suicide, not only in younger adults but also in older adults. Identifying an effective method to detect social isolation is therefore important in the field of human-machine interaction. However, to the best of our knowledge, no effective pseudo-social isolation task has been developed for evaluating social isolation detection systems in older adults. This study had two aims: (1) to develop a virtual avatar conversation cyberball task that evokes pseudo-social isolation in older adults, and (2) to identify non-verbal indicators of social isolation in older adults. To achieve these objectives, 22 older men were recruited as participants. They were asked to communicate with two virtual avatars on a monitor and then to answer follow-up questions assessing their level of social isolation and their emotions; meanwhile, facial expressions and gaze patterns were recorded by a camera and an eye tracker. The developed virtual avatar conversation cyberball task successfully induced pseudo-social isolation in older adults, and this isolation was detectable from the intensity of inner/outer eyebrow and eyelid movements and from blink frequency.
Affiliation(s)
- Ayumi Takemoto
- Institute of Development, Aging and Cancer, Tohoku University, Sendai, Miyagi, Japan
- Bioinformatics Laboratory, Riga Stradins University, Riga, Latvia
- Miyuki Iwamoto
- Department of Social System Studies, Doshisha Women's College of Liberal Arts, Kyoto, Japan
- Graduate School of Science and Technology, Kyoto Institute of Technology, Kyoto, Japan
- Haruto Yaegashi
- Faculty of Education, Tohoku University, Sendai, Miyagi, Japan
- Shan Yun
- Faculty of Health Science, Hokkaido University, Sapporo, Hokkaido, Japan
- Risa Takashima
- Faculty of Health Science, Hokkaido University, Sapporo, Hokkaido, Japan
3. Meinhardt-Injac B, Altvater-Mackensen N, Mohs A, Goulet-Pelletier JC, Boutet I. Emotion Processing in Late Adulthood: The Effect of Emotional Valence and Face Age on Behavior and Scanning Patterns. Behav Sci (Basel) 2025; 15:302. [PMID: 40150197] [PMCID: PMC11939290] [DOI: 10.3390/bs15030302]
Abstract
Age-related differences in emotion recognition are well documented in older adults aged 65 and above, with stimulus valence and the age of the model being key influencing factors. This study examined these variables across three experiments using a novel set of images depicting younger and older models expressing positive and negative emotions (e.g., happy vs. sad; interested vs. bored). Experiment 1 focused on valence-arousal dimensions, Experiment 2 on emotion recognition accuracy, and Experiment 3 on visual fixation patterns. Age-related differences were found in emotion recognition. No significant age-related differences in gaze behavior were found; both age groups looked more at the eye region. The positivity effect (older adults' tendency to prioritize positive over negative information) did not consistently manifest in recognition performance or scanning patterns. However, older adults evaluated positive and negative emotions differently than younger adults, rating negative facial expressions as less negative and positive emotions as more arousing. Finally, both older and younger adults rated emotions portrayed by younger models as more intense and more positive than those portrayed by older models. We conclude that the positivity effect and own-age bias may be more complex and nuanced than previously thought.
Affiliation(s)
- Bozana Meinhardt-Injac
- Department of Psychology, Catholic University of Applied Sciences Berlin (KHSB), 10318 Berlin, Germany
- Alexandra Mohs
- School of Humanities, University of Mannheim, 68161 Mannheim, Germany
- Isabelle Boutet
- School of Psychology, University of Ottawa, Ottawa, ON K1N 6N5, Canada
4. Yong MH, Waqas M, Ruffman T. Effects of age on behavioural and eye gaze on Theory of Mind using movie for social cognition. Q J Exp Psychol (Hove) 2024; 77:2476-2487. [PMID: 38356176] [PMCID: PMC11607846] [DOI: 10.1177/17470218241235811]
Abstract
Evidence has shown that older adults have lower accuracy on Theory of Mind (ToM) tasks than young adults, but it remains unclear whether older adults' difficulty decoding mental states stems from not looking at the critical areas, particularly in the ageing Asian population. Most ToM studies use static images or short vignettes to measure ToM, but these stimuli are dissimilar to everyday social interactions. We investigated this question using a dynamic task that measured both accuracy and error types, and examined the links between accuracy, error types, and eye gaze fixation on critical areas (e.g., eyes, mouth, body). A total of 82 participants (38 older, 44 young adults) completed the Movie for the Assessment of Social Cognition (MASC) task on an eye tracker. Results showed that older adults had lower overall accuracy, with more errors in the ipo-ToM (under-mentalising) and no-ToM (lack of mentalisation) conditions, compared with young adults. We analysed the eye gaze data using principal components analysis and found that increasing age and looking less at the face were related to lower MASC accuracy. Our findings suggest that ageing deficits in ToM are linked to a visual attention deficit specific to the perception of socially relevant nonverbal cues.
5. Ma J, Liu X, Li Y. A Comparative Study Recognizing the Expression of Information Between Elderly Individuals and Young Individuals. Psychol Res Behav Manag 2024; 17:3111-3120. [PMID: 39253353] [PMCID: PMC11382663] [DOI: 10.2147/prbm.s471196]
Abstract
Background: Studies have shown that elderly individuals have significantly worse facial expression recognition scores than young adults. Some have suggested that this difference is due to perceptual degradation, while others attribute it to decreased attention of elderly individuals to the most informative regions of the face. Methods: To resolve this controversy, this study recruited 85 participants and used a behavioral task and eye-tracking techniques (EyeLink 1000 Plus eye tracker). It adopted the "study-recognition" paradigm in a 3 (facial expression: positive, neutral, negative) × 2 (age group: young, old) × 3 (facial area of interest: eyes, nose, mouth) mixed experimental design to explore whether there is perceptual degradation in older people's attention to facial expressions and to investigate differences in diagnostic areas between young and older people. Results: The behavioral results revealed that young participants had significantly higher facial expression recognition scores than older participants; moreover, the eye-tracking results revealed that young people generally fixated on faces significantly more than elderly people, demonstrating perceptual degradation in the elderly. When examining facial expressions, young people primarily looked at the eyes, followed by the nose and then the mouth, whereas elderly participants primarily focused on the eyes, followed by the mouth and then the nose. Conclusion: The findings confirm that young participants recognize facial expressions better than elderly participants, which may be related more to perceptual degradation than to decreased attention to informative areas of the face. When recognizing faces, elderly people should increase gaze duration toward diagnostic facial areas (such as the eyes) to compensate for the decline in recognition performance caused by perceptual aging.
Affiliation(s)
- Jialin Ma
- Faculty of Education, Henan University, Kaifeng, Henan Province, People's Republic of China
- Xiaojing Liu
- Faculty of Education, Henan University, Kaifeng, Henan Province, People's Republic of China
- Yongxin Li
- Faculty of Education, Henan University, Kaifeng, Henan Province, People's Republic of China
6. Hamilton LJ, Krendl AC. Evidence for the role of affective theory of mind in face-name associative memory. Aging Neuropsychol Cogn 2024; 31:417-437. [PMID: 36999681] [PMCID: PMC10544671] [DOI: 10.1080/13825585.2023.2194607]
Abstract
Poor face-name recall has been associated with age-related impairments in cognitive functioning, namely declines in episodic memory and executive control. However, the role of social cognitive function - the ability to remember, process, and store information about others - has been largely overlooked in this work. Extensive work has shown that social and nonsocial cognitive processes rely on unique, albeit overlapping, mechanisms. In the current study, we explored whether social cognitive functioning - specifically the ability to infer other people's mental states (i.e., theory of mind) - facilitates better face-name learning. To do this, a sample of 289 older and young adults completed a face-name learning paradigm, standard assessments of episodic memory and executive control, and two theory of mind measures, one static and one dynamic. In addition to expected age differences, several key effects emerged. Age-related differences in recognition were explained by episodic memory, not social cognition. However, age effects in recall were explained by both episodic memory and social cognition, specifically affective theory of mind in the dynamic task. Altogether, we contend that face-name recall can be supported by social cognitive functioning, namely understanding emotions. While acknowledging the influence of task characteristics (i.e., lures, target ages), we interpret these findings in light of existing accounts of age differences in face-name associative memory.
Affiliation(s)
- Lucas J Hamilton
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, USA
- Anne C Krendl
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, USA
7. González-Gualda LM, Vicente-Querol MA, García AS, Molina JP, Latorre JM, Fernández-Sotos P, Fernández-Caballero A. An exploratory study of the effect of age and gender on face scanning during affect recognition in immersive virtual reality. Sci Rep 2024; 14:5553. [PMID: 38448515] [PMCID: PMC10918108] [DOI: 10.1038/s41598-024-55774-3]
Abstract
A person with impaired emotion recognition is not able to correctly identify facial expressions represented by other individuals. The aim of the present study was to assess eye gaze and facial emotion recognition in a healthy population using dynamic avatars in immersive virtual reality (IVR). For the first time, viewing of each area of interest of the face in IVR was studied by gender and age. This work in healthy people was conducted to assess the future usefulness of IVR in patients with deficits in the recognition of facial expressions. Seventy-four healthy volunteers participated in the study. The materials used were a laptop computer, a game controller, and a head-mounted display. Dynamic virtual faces randomly representing the six basic emotions plus a neutral expression were used as stimuli. After the virtual human represented an emotion, a response panel was displayed with the seven possible options. Besides storing hits and misses, the software internally divided the faces into different areas of interest (AOIs) and recorded how long participants looked at each AOI. Regarding overall accuracy, hits decreased from the youngest to the middle-aged and older adults. All three groups spent the highest percentage of time looking at the eyes, with younger adults showing the highest percentage, and attention to the face compared to the background decreased with age. Hits for women and men were remarkably similar, with no statistically significant differences between them. In general, men paid more attention to the eyes than women, whereas women paid more attention to the forehead and mouth. In contrast to previous work, our study indicates that there are no differences between men and women in facial emotion recognition. In line with previous work, the percentage of face viewing time was higher for younger than for older adults; however, contrary to earlier studies, older adults looked more at the eyes than at the mouth. Consistent with other studies, the eyes were the AOI with the highest percentage of viewing time. For men, the most viewed AOI was the eyes for all emotions in both hits and misses. Women looked more at the eyes for all emotions except joy, fear, and anger on hits; on misses, they looked more at the eyes for almost all emotions except surprise and fear.
Affiliation(s)
- Luz M González-Gualda
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- Miguel A Vicente-Querol
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Arturo S García
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- José P Molina
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- José M Latorre
- Departamento de Psicología, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- Patricia Fernández-Sotos
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
- Antonio Fernández-Caballero
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
8. Effects of aging on face processing: An ERP study of the own-age bias with neutral and emotional faces. Cortex 2023; 161:13-25. [PMID: 36878097] [DOI: 10.1016/j.cortex.2023.01.007]
Abstract
Older adults systematically show an enhanced N170 amplitude during the visualization of facial expressions of emotion. The present study aimed to replicate this finding, further investigating whether this effect is specific to facial stimuli, present in other neural correlates of face processing, and modulated by own-age faces. To this purpose, younger (n = 25; Mage = 28.36), middle-aged (n = 23; Mage = 48.74), and older adults (n = 25; Mage = 67.36) performed two face/emotion identification tasks during an EEG recording. The results showed that groups did not differ in P100 amplitude, but older adults had increased N170 amplitude for both facial and non-facial stimuli. The event-related potentials analysed were not modulated by an own-age bias, but older faces elicited a larger N170 in the Emotion Identification Task for all groups. This increased amplitude may reflect a higher ambiguity of older faces due to age-related changes in their physical features, which may require more neural resources to decode. Regarding the P250, older faces elicited smaller amplitudes than younger faces, which may reflect reduced processing of the emotional content of older faces. This interpretation is consistent with the lower accuracy obtained for this category of stimuli across groups. These results have important social implications and suggest that aging may hamper the neural processing of facial expressions of emotion, especially for own-age peers.
9. Ruffman T, Kong Q, Lim HM, Du K, Tiainen E. Recognition of facial emotions across the lifespan: 8-year-olds resemble older adults. Br J Dev Psychol 2023; 41:128-139. [PMID: 36773033] [DOI: 10.1111/bjdp.12442]
Abstract
On standard emotion recognition tasks with relatively long or unlimited stimulus durations, recognition improves as children grow older, whereas older adults are worse than young adults. Crucially, it was unknown (a) how older adults compare to age groups below young adulthood and (b) whether children can recognize emotions at shorter durations, which are likely common in real life. We compared emotion recognition in 5-year-olds, 8-year-olds, young adults and older adults at very brief durations (50 ms, 250 ms) as well as standard unlimited durations. Eight-year-olds were better than 5-year-olds, young adults were better than all groups, and there was a striking similarity between 8-year-olds and older adults, providing the first clear indication that older adults' recognition abilities are equivalent to those of an 8-year-old at all durations. Emotion recognition was above chance on all emotions and durations among the three older age groups and on most stimuli for 5-year-olds.
Affiliation(s)
- Ted Ruffman
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Qiuyi Kong
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Hui Mei Lim
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Kangning Du
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Emilia Tiainen
- Department of Psychology, University of Otago, Dunedin, New Zealand
10. Plank IS, Christiansen LN, Kunas SL, Dziobek I, Bermpohl F. Mothers need more information to recognise associated emotions in child facial expressions. Cogn Emot 2022; 36:1299-1312. [PMID: 35930357] [DOI: 10.1080/02699931.2022.2105819]
Abstract
Parenting requires mothers to read social cues and understand their children. It is particularly important that they recognise their child's emotions so as to react appropriately, for example, with compassion to sadness or compersion to happiness. Despite this importance, it is unclear how motherhood affects women's ability to recognise emotions associated with facial expressions in children. Using videos of an emotionally neutral face continually and gradually taking on a facial expression associated with an emotion, we quantified the amount of information needed to match the emotion with the facial expression. Mothers needed more information than non-mothers to match the emotions with the facial expressions. Both mothers and non-mothers performed equally on a control task identifying animals instead of emotions, and both groups needed less information when recognising the emotions associated with facial expressions in adolescents than in pre-schoolers. These results indicate that mothers need more information to correctly recognise typically associated emotions in child facial expressions, but not for similar tasks not involving emotions. A possible explanation is that child facial expressions associated with emotions have a greater emotional impact on mothers than on non-mothers, leading to task interference but possibly also to increased compassion and compersion.
Affiliation(s)
- Irene S Plank
- Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Department of Psychiatry and Psychotherapy, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Einstein Center for Neurosciences, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Lina-Nel Christiansen
- Center for Chronically Sick Children, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Stefanie L Kunas
- Department of Psychiatry and Psychotherapy, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Isabel Dziobek
- Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Einstein Center for Neurosciences, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Felix Bermpohl
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Department of Psychiatry and Psychotherapy, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
- Einstein Center for Neurosciences, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Berlin, Germany
11. Slessor G, Insch P, Donaldson I, Sciaponaite V, Adamowicz M, Phillips LH. Adult Age Differences in Using Information From the Eyes and Mouth to Make Decisions About Others' Emotions. J Gerontol B Psychol Sci Soc Sci 2022; 77:2241-2251. [PMID: 35948271] [PMCID: PMC9799183] [DOI: 10.1093/geronb/gbac097]
Abstract
Objectives: Older adults are often less accurate than younger counterparts at identifying emotions such as anger, sadness, and fear from faces. They also look less at the eyes and more at the mouth during emotion perception. The current studies advance understanding of the nature of these age effects on emotional processing. Methods: Younger and older participants identified emotions from pictures of eyes or mouths (Experiment 1) and incongruent mouth-eyes emotion combinations (Experiment 2). In Experiment 3, participants categorized emotions from pictures in which face masks covered the mouth region. Results: Older adults were worse than young adults at identifying anger and sadness from the eyes but better at identifying the same emotions from the mouth region (Experiment 1), and they were more likely than young adults to use information from the mouth to classify anger, fear, and disgust (Experiment 2). In Experiment 3, face masks impaired perception of anger, sadness, and fear more for older than for younger adults. Discussion: These studies indicate that older people are more able than young adults to interpret emotional information from the mouth, are more biased to use information from the mouth, and have more difficulty with emotion perception when the mouth is covered with a face mask. This has implications for social communication in different age groups.
Affiliation(s)
- Pauline Insch
- School of Psychology, University of Aberdeen, Aberdeen, UK
- Isla Donaldson
- School of Psychology, University of Aberdeen, Aberdeen, UK
- Louise H Phillips
- School of Psychology, University of Aberdeen, Aberdeen AB24 3FX, UK
12. Low ACY, Oh VYS, Tong EMW, Scarf D, Ruffman T. Older adults have difficulty decoding emotions from the eyes, whereas Easterners have difficulty decoding emotion from the mouth. Sci Rep 2022; 12:7408. [PMID: 35524152] [PMCID: PMC9076610] [DOI: 10.1038/s41598-022-11381-8]
Abstract
Older adults and Easterners have worse emotion recognition (than young adults and Westerners, respectively), but the question of why remains unanswered. Older adults look less at eyes, whereas Easterners look less at mouths, raising the possibility that compelling older adults to look at eyes, and Easterners to look at mouths, might improve recognition. We tested this by comparing emotion recognition in 108 young adults and 109 older adults from New Zealand and Singapore with (a) the eyes on their own, (b) the mouth on its own, or (c) the full face. Older adults were worse than young adults on 4/6 emotions with the Eyes Only stimuli, but only 1/6 emotions with the Mouth Only stimuli. In contrast, Easterners were worse than Westerners on 6/6 emotions for Mouth Only and Full Face stimuli, but were equal on all six emotions for Eyes Only stimuli. These results are a substantial step forward because they point to the precise difficulty for older adults and Easterners: older adults have more consistent difficulty identifying individual emotions from the eyes than from the mouth, likely due to declining brain functioning, whereas Easterners have more consistent difficulty identifying emotions from the mouth than from the eyes, likely due to inexperience inferring mouth information.
Affiliation(s)
- Anna C Y Low
- Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
- Vincent Y S Oh
- Department of Psychology, National University of Singapore, Block AS4, Level 2, 9 Arts Link, Singapore, 117570, Singapore
- Eddie M W Tong
- Department of Psychology, National University of Singapore, Block AS4, Level 2, 9 Arts Link, Singapore, 117570, Singapore
- Damian Scarf
- Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
- Ted Ruffman
- Department of Psychology, University of Otago, P.O. Box 56, Dunedin, 9054, New Zealand
13. Ito A, Yoshida K, Aoki R, Fujii T, Kawasaki I, Hayashi A, Ueno A, Sakai S, Mugikura S, Takahashi S, Mori E. The Role of the Ventromedial Prefrontal Cortex in Preferential Decisions for Own- and Other-Age Faces. Front Psychol 2022; 13:822234. [PMID: 35360573] [PMCID: PMC8962742] [DOI: 10.3389/fpsyg.2022.822234]
Abstract
Own-age bias is a well-known bias reflecting the effects of age, and its role has been demonstrated particularly in face recognition. However, it remains unclear whether an own-age bias exists in facial impression formation. In the present study, we used three datasets from two published and one unpublished functional magnetic resonance imaging (fMRI) studies that employed the same pleasantness rating task during fMRI scanning and a preferential choice task after the fMRI to investigate whether healthy young and older participants show own-age effects in face preference. Specifically, we employed a drift-diffusion model to examine whether an own-age bias exists in the processes of preferential choice. The behavioral results showed higher rating scores and a higher drift rate for young faces than for older faces, regardless of participant age: we identified a young-age effect, but not an own-age effect. Neuroimaging results from an aggregation analysis of the three datasets suggest that the ventromedial prefrontal cortex (vmPFC) may be associated with evidence accumulation for own-age faces, although no clear evidence was provided. Importantly, we found no age-related decline in the responsiveness of the vmPFC to the subjective pleasantness of faces. In both young and older participants, the vmPFC contributed to the parametric representation of the subjective value of faces and showed functional coupling with the ventral visual area, which reflects face preference. These results suggest that the preferential choice of faces is less susceptible to own-age bias across the lifespan.
Affiliation(s)
- Ayahito Ito, Research Institute for Future Design, Kochi University of Technology, Kochi, Japan
- Kazuki Yoshida, Faculty of Health Sciences, Hokkaido University, Sapporo, Japan
- Ryuta Aoki, Graduate School of Humanities, Tokyo Metropolitan University, Tokyo, Japan
- Toshikatsu Fujii, Kansei Fukushi Research Institute, Tohoku Fukushi University, Sendai, Japan
- Iori Kawasaki, Department of Behavioral Neurology and Cognitive Neuroscience, Graduate School of Medicine, Tohoku University, Sendai, Japan
- Akiko Hayashi, Department of Behavioral Neurology and Cognitive Neuroscience, Graduate School of Medicine, Tohoku University, Sendai, Japan
- Aya Ueno, Department of Behavioral Neurology and Cognitive Neuroscience, Graduate School of Medicine, Tohoku University, Sendai, Japan
- Shinya Sakai, Faculty of Health Sciences, Hokkaido University, Sapporo, Japan
- Shunji Mugikura, Division of Image Statistics, Tohoku Medical Megabank Organization, Sendai, Japan; Department of Diagnostic Radiology, Graduate School of Medicine, Tohoku University, Sendai, Japan
- Shoki Takahashi, Department of Diagnostic Radiology, Graduate School of Medicine, Tohoku University, Sendai, Japan
- Etsuro Mori, Department of Behavioral Neurology and Cognitive Neuroscience, Graduate School of Medicine, Tohoku University, Sendai, Japan
14
Age and gender effects on the human’s ability to decode posed and naturalistic emotional faces. Pattern Anal Appl 2022. [DOI: 10.1007/s10044-021-01049-w]
15
The age-related positivity effect in cognition: A review of key findings across different cognitive domains. PSYCHOLOGY OF LEARNING AND MOTIVATION 2022. [DOI: 10.1016/bs.plm.2022.08.004]
16
Pollux PM. Age-of-actor effects in body expression recognition of children. Acta Psychol (Amst) 2021; 220:103421. [PMID: 34564027 DOI: 10.1016/j.actpsy.2021.103421]
Abstract
Investigations of developmental trajectories for emotion recognition suggest that both face and body expression recognition increase rapidly in early childhood and reach adult levels of performance near the age of ten. So far, little is known about whether children's ability to recognise body expressions is influenced by the age of the person they are observing. This question is investigated here by presenting 119 children and 42 young adults with videos of children, young adults and older adults expressing emotions with their whole body. The results revealed an own-age advantage for children, reflected in adult-level accuracy for videos of children for most expressions but reduced accuracy for videos of older adults. Children's recognition of older adults' expressions was not correlated with children's estimated amount of contact with older adults. Support for potential influences of social biases on performance measures was minimal. The own-age advantage was explained in terms of children's reduced familiarity with body expressions of older adults, due to aging-related changes in the kinematic characteristics of movements and potentially due to stronger embodiment of other children's bodily movements.
17
Chuang YC, Chiu MJ, Chen TF, Chang YL, Lai YM, Cheng TW, Hua MS. An Exploration of the Own-Age Effect on Facial Emotion Recognition in Normal Elderly People and Individuals with the Preclinical and Demented Alzheimer's Disease. J Alzheimers Dis 2021; 80:259-269. [PMID: 33522998 DOI: 10.3233/jad-200916]
Abstract
BACKGROUND The issue of whether an own-age effect on facial emotion recognition exists in the elderly remains equivocal. Moreover, the literature on this issue in pathological aging is scarce. OBJECTIVE Our study was thus to explore this issue in both healthy older people and patients with Alzheimer's disease (AD). METHODS In study 1, 27 older and 31 younger healthy adults were recruited; in study 2, 27 healthy older adults and 80 patients (including subjective cognitive decline (SCD), mild cognitive impairment (MCI), and Alzheimer's disease (AD) groups) were recruited. Participants received the Taiwan Facial Emotion Recognition Task (FER Task) and a clinical neuropsychological assessment. RESULTS No significant differences on the FER test were found among our groups, except for sadness recognition, in which our MCI and AD patients' scores were markedly lower than those of their healthy counterparts. The own-age effect was not significantly evident in healthy younger and older adults, except for recognizing neutral photos. Our patients with MCI and AD tended to show the effect, particularly for sadness recognition, in which the effect was significantly evident in terms of error features (mislabeling sadness as anger in younger-face photos and as neutral in older-face photos). CONCLUSION Our results showed no marked own-age effect on facial emotion recognition in the healthy elderly (including SCD). However, this did not appear to be the case for MCI and AD patients, especially in recognizing sadness items, suggesting that including an FER task, particularly items of low-intensity emotion, in clinical neuropsychological assessment might contribute to the early detection of individuals with AD-related pathology.
Affiliation(s)
- Yu-Chen Chuang, Department of Psychology, College of Science, National Taiwan University, Taiwan
- Ming-Jang Chiu, Department of Neurology, National Taiwan University Hospital, College of Medicine, National Taiwan University, Taipei, Taiwan; Graduate Institute of Brain and Mind Sciences, College of Medicine, National Taiwan University, Taiwan; Graduate Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, Taiwan
- Ta-Fu Chen, Department of Neurology, National Taiwan University Hospital, College of Medicine, National Taiwan University, Taipei, Taiwan
- Yu-Ling Chang, Department of Psychology, College of Science, National Taiwan University, Taiwan; Department of Neurology, National Taiwan University Hospital, College of Medicine, National Taiwan University, Taipei, Taiwan; Neurobiology and Cognitive Science Center, National Taiwan University, Taipei, Taiwan
- Ya-Mei Lai, Department of Neurology, National Taiwan University Hospital, College of Medicine, National Taiwan University, Taipei, Taiwan
- Ting-Wen Cheng, Department of Neurology, National Taiwan University Hospital, College of Medicine, National Taiwan University, Taipei, Taiwan; Graduate Institute of Brain and Mind Sciences, College of Medicine, National Taiwan University, Taiwan
- Mau-Sun Hua, Department of Psychology, College of Science, National Taiwan University, Taiwan; Department of Neurology, National Taiwan University Hospital, College of Medicine, National Taiwan University, Taipei, Taiwan; Department of Psychology, Asia University, Taichung, Taiwan
| |
18
Development and validation of film stimuli to assess empathy in the work context. Behav Res Methods 2021; 54:75-93. [PMID: 34100203 PMCID: PMC8863710 DOI: 10.3758/s13428-021-01594-6]
Abstract
A growing body of research suggests that empathy predicts important work outcomes, yet limitations in existing measures to assess empathy have been noted. Extending past work on the assessment of empathy, this study introduces a newly developed set of emotion-eliciting film clips that can be used to assess both cognitive (emotion perception) and affective (emotional congruence and sympathy) facets of empathy in vivo. Using the relived emotions paradigm, film protagonists were instructed to think aloud about an autobiographical, emotional event from working life and relive their emotions while being videotaped. Subsequently, protagonists were asked to provide self-reports of the intensity of their emotions during retelling their event. In a first study with 128 employees, who watched the film clips and rated their own as well as the protagonists’ emotions, we found that the film clips are effective in eliciting moderate levels of emotions as well as sympathy in the test taker and can be used to calculate reliable convergence scores of emotion perception and emotional congruence. Using a selected subset of six film clips, a second two-wave study with 99 employees revealed that all facet-specific measures of empathy had moderate-to-high internal consistencies and test–retest reliabilities, and correlated in expected ways with other self-report and test-based empathy tests, cognition, and demographic variables. With these films, we expand the choice of testing materials for empathy in organizational research to cover a larger array of research questions.
19
Abo Foul Y, Eitan R, Mortillaro M, Aviezer H. Perceiving dynamic emotions expressed simultaneously in the face and body minimizes perceptual differences between young and older adults. J Gerontol B Psychol Sci Soc Sci 2021; 77:84-93. [PMID: 33842959 DOI: 10.1093/geronb/gbab064]
Abstract
OBJECTIVES It is commonly argued that older adults show difficulties in standardized tasks of emotional expression perception, yet most previous works relied on classic sets of static, decontextualized, and stereotypical facial expressions. In real-life, facial expressions are dynamic and embedded in a rich context, two key factors that may aid emotion perception. Specifically, body language provides important affective cues that may disambiguate facial movements. METHOD We compared emotion perception of dynamic faces, bodies, and their combination, in a sample of older (age 60-83, n=126) and young (age 18-30, n=124) adults. We used the Geneva Multimodal Emotion Portrayals (GEMEP) set, which includes a full view of expressers' faces and bodies, displaying a diverse range of positive and negative emotions, portrayed dynamically and holistically in a non-stereotypical, unconstrained manner. Critically, we digitally manipulated the dynamic cue such that perceivers viewed isolated faces (without bodies), isolated bodies (without faces), or faces with bodies. RESULTS Older adults showed better perception of positive and negative dynamic facial expressions, while young adults showed better perception of positive isolated dynamic bodily expressions. Importantly, emotion perception of faces with bodies was comparable across ages. DISCUSSION Dynamic emotion perception in young and older adults may be more similar than previously assumed, especially when the task is more realistic and ecological. Our results emphasize the importance of contextualized and ecological tasks in emotion perception across ages.
Affiliation(s)
- Yasmin Abo Foul, Department of Psychology, The Hebrew University of Jerusalem; Department of Psychiatry, Hadassah-Hebrew University Medical Center, Jerusalem
- Renana Eitan, Department of Psychiatry, Hadassah-Hebrew University Medical Center, Jerusalem; Neuropsychiatry Unit, Jerusalem Mental Health Center, The Hebrew University of Jerusalem; Department of Psychiatry, Brigham and Women's Hospital, Harvard Medical School, Boston
- Hillel Aviezer, Department of Psychology, The Hebrew University of Jerusalem
20
Stopyn RJN, Hadjistavropoulos T, Loucks J. An Eye Tracking Investigation of Pain Decoding Based on Older and Younger Adults' Facial Expressions. JOURNAL OF NONVERBAL BEHAVIOR 2021; 45:31-52. [PMID: 33678933 PMCID: PMC7900079 DOI: 10.1007/s10919-020-00344-0]
Abstract
Nonverbal pain cues, such as facial expressions, are useful in the systematic assessment of pain in people with dementia who have severe limitations in their ability to communicate. Nonetheless, the extent to which observers rely on specific pain-related facial responses (e.g., eye movements, frowning) when judging pain remains unclear. Observers viewed three types of videos of patients expressing pain (younger patients, older patients without dementia, older patients with dementia) while wearing an eye tracker device that recorded their viewing behaviors. They provided pain ratings for each patient in the videos. These observers assigned higher pain ratings to older adults compared to younger adults, and the highest pain ratings to patients with dementia. Pain ratings assigned to younger adults showed greater correspondence to objectively coded facial reactions compared to older adults. The correspondence of observer ratings was not affected by the cognitive status of target patients, as there were no differences between the ratings assigned to older adults with and without dementia. Observers' percentage of total dwell time (the amount of time that an observer glances or fixates within a defined visual area of interest) across specific facial areas did not predict the correspondence of observers' pain ratings to objective coding of facial responses. Our results demonstrate that patient characteristics such as age and cognitive status impact the pain decoding process by observers when viewing facial expressions of pain in others.
Affiliation(s)
- Rhonda J N Stopyn, Department of Psychology, University of Regina, Regina, SK S4S 0A2, Canada
- Jeff Loucks, Department of Psychology, University of Regina, Regina, SK S4S 0A2, Canada
21
Vivas AB, Chrysochoou E, Marful A, Bajo T. Emotional devaluation in ignoring and forgetting as a function of adolescent development. Cognition 2021; 211:104615. [PMID: 33588185 DOI: 10.1016/j.cognition.2021.104615]
Abstract
We know that emotion and cognition interact to guide goal-directed behavior. Accordingly, it has recently been shown that distracting stimuli (Raymond, Fenske, & Tavassoli, 2003) and instructed to-be-forgotten items (Vivas, Marful, Panagiotidou, & Bajo, 2016) are emotionally devaluated. The devaluation-by-inhibition hypothesis (Raymond, Fenske, & Tavassoli, 2003) is the main theoretical explanation of these effects. However, we know little about how the cognition-emotion interplay is further modulated by development and, particularly, by changes in inhibitory control and affective processing within the adolescence period. In the present study we combined a selective attention task with faces, and a selective memory (directed forgetting paradigm) task with words, with a pleasantness evaluation task to address this question in three age groups: younger adolescents, older adolescents, and young adults. Younger adolescents exhibited worse accuracy in the attention task, lower overall recognition of words in the memory task, and a smaller directed forgetting effect in the latter, relative to the two older groups. That is, they showed less efficient inhibitory control in attention and memory selection. Despite this, all groups showed similar devaluation effects for the distractor faces and the to-be-forgotten words. Our findings do not fully support an inhibition account of such effects. Yet, they support the robustness of the forgetting devaluation effect, replicating the findings of Vivas, Marful, Panagiotidou, and Bajo (2016) with a Greek version of the task and in a larger sample of participants.
Affiliation(s)
- Ana B Vivas, The University of Sheffield International Faculty, CITY College, Greece
- Alejandra Marful, Mind, Brain, and Behavior Research Center, Department of Experimental Psychology, University of Granada, Spain
- Teresa Bajo, Mind, Brain, and Behavior Research Center, Department of Experimental Psychology, University of Granada, Spain
22
Rajan Menon K, Malu B, Sinha C. Development and Validation of Emotion Recognition Software in the Indian Population. PSYCHOLOGICAL STUDIES 2020. [DOI: 10.1007/s12646-020-00574-8]
23
Durbin KA, Rastegar S, Knight BG. Effects of age and mood on emotional face processing differ depending on the intensity of the facial expression. NEUROPSYCHOLOGY, DEVELOPMENT, AND COGNITION. SECTION B, AGING, NEUROPSYCHOLOGY AND COGNITION 2020; 27:902-917. [PMID: 31809671 PMCID: PMC7274884 DOI: 10.1080/13825585.2019.1700900]
Abstract
Research suggests that mood can moderate age differences in recognizing facial emotion. In this study, we examined how an anxious versus calm mood state affected younger and older adults' processing of emotional faces. Older adults had greater difficulty identifying negative emotions, particularly when emotions were displayed at a low intensity level. However, an anxious mood did not affect age differences in emotional face recognition. In contrast, age, emotional intensity, and current mood state all affected the perceived intensity of emotion. The effects of age and mood on perceived emotional intensity were only observed for low intensity facial expressions. When induced into an anxious mood, younger adults perceived threatening emotions (i.e., fear, anger) as more emotionally intense, whereas older adults perceived anger and happiness to be more intense. These findings emphasize the need to consider both internal and external factors when investigating the effects of age on emotional face processing.
Affiliation(s)
- Sarah Rastegar, Department of Psychology, University of Southern California
- Bob G. Knight, Department of Psychology, University of Southern California; School of Psychology and Counseling, University of Southern Queensland
24
Ferreira BLC, Fabrício DDM, Chagas MHN. Are facial emotion recognition tasks adequate for assessing social cognition in older people? A review of the literature. Arch Gerontol Geriatr 2020; 92:104277. [PMID: 33091714 DOI: 10.1016/j.archger.2020.104277]
Abstract
OBJECTIVE Facial emotion recognition (FER) is a component of social cognition and important to interpersonal relations; tasks have therefore been developed to assess this skill in different populations. Among older people, even healthy individuals perform below the correct-response rate commonly used to validate such tasks. We performed a systematic review to analyze studies addressing the performance of healthy older adults on FER tasks relative to the 70% correct-response rate commonly used for the creation of stimulus banks. MATERIAL AND METHODS Searches were conducted up to May 2019 in the Pubmed, PsycInfo, Web of Science, and Scopus databases using the keywords ("faces" OR "facial") AND ("recognition" OR "expression" OR "emotional") AND ("elderly" OR "older adults"). RESULTS Twenty-seven articles were included in the present review. In 16 studies (59.2%), older people had correct-response rates on FER lower than 70% for at least one of the emotions evaluated. Among the studies that evaluated each emotion specifically, 62.5% found correct-response rates lower than 70% for fear, 50% for surprise, 50% for sadness, 37.5% for anger, 21.4% for disgust, and 5.9% for happiness. Moreover, the studies that evaluated the intensity level of the emotions demonstrated a lower rate of correct responses when the intensity of the facial expression was low. CONCLUSION These studies employ methods and facial stimuli that may not be adequate for measuring this skill in older people. Thus, it is important to create adequate tasks for assessing this skill in this population.
Affiliation(s)
- Bianca Letícia C Ferreira, Department of Neurosciences and Behavioral Sciences, Universidade de São Paulo, Ribeirão Preto, SP, Brazil; Research Group on Mental Health, Cognition and Aging, Federal University of São Carlos, São Carlos, SP, Brazil
- Daiene de Morais Fabrício, Research Group on Mental Health, Cognition and Aging, Federal University of São Carlos, São Carlos, SP, Brazil
- Marcos Hortes N Chagas, Department of Neurosciences and Behavioral Sciences, Universidade de São Paulo, Ribeirão Preto, SP, Brazil; Research Group on Mental Health, Cognition and Aging, Federal University of São Carlos, São Carlos, SP, Brazil; Bairral Institute of Psychiatry, Itapira, SP, Brazil
25
Szczypiński J, Alińska A, Waligóra M, Kopera M, Krasowska A, Michalska A, Suszek H, Jakubczyk A, Wypych M, Wojnar M, Marchewka A. Familiarity with children improves the ability to recognize children's mental states: an fMRI study using the Reading the Mind in the Eyes Task and the Nencki Children Eyes Test. Sci Rep 2020; 10:12964. [PMID: 32737383 PMCID: PMC7395771 DOI: 10.1038/s41598-020-69938-4]
Abstract
Theory of mind plays a fundamental role in human social interactions. People generally better understand the mental states of members of their own race, a predisposition called the own-race bias, which can be significantly reduced by experience. It is unknown whether the ability to understand mental states can be similarly influenced by an own-age bias, whether this bias can be reduced by experience and, finally, what the neuronal correlates of these processes are. We evaluated whether adults working with children (WC) have an advantage over adults not working with children (NWC) in understanding the mental states of youngsters. Participants performed fMRI tasks with Adult Mind (AM) and Child Mind (CM) conditions based on the Reading the Mind in the Eyes test and a newly developed Nencki Children Eyes test. WC had better accuracy in the CM condition than NWC. In NWC, own-age bias was associated with higher activation in the posterior superior temporal sulcus (pSTS) in AM than in CM. This effect was not observed in the WC group, which showed higher activation in the pSTS and inferior frontal gyri in CM than in AM. Therefore, activation in these regions is required for the experience-driven improvement in recognition of children's mental states.
Affiliation(s)
- Jan Szczypiński, Laboratory of Brain Imaging (LOBI), Nencki Institute of Experimental Biology, Polish Academy of Sciences, Pasteur 3, 02-093, Warsaw, Poland; Department of Psychiatry, Medical University of Warsaw, Warsaw, Poland
- Anna Alińska, Laboratory of Brain Imaging (LOBI), Nencki Institute of Experimental Biology, Polish Academy of Sciences, Pasteur 3, 02-093, Warsaw, Poland
- Marek Waligóra, Laboratory of Brain Imaging (LOBI), Nencki Institute of Experimental Biology, Polish Academy of Sciences, Pasteur 3, 02-093, Warsaw, Poland; Laboratory of Neuroinformatics, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- Maciej Kopera, Department of Psychiatry, Medical University of Warsaw, Warsaw, Poland
- Aneta Michalska, Department of Psychiatry, Medical University of Warsaw, Warsaw, Poland
- Hubert Suszek, Faculty of Psychology, University of Warsaw, Warsaw, Poland
- Andrzej Jakubczyk, Department of Psychiatry, Medical University of Warsaw, Warsaw, Poland
- Marek Wypych, Laboratory of Brain Imaging (LOBI), Nencki Institute of Experimental Biology, Polish Academy of Sciences, Pasteur 3, 02-093, Warsaw, Poland
- Marcin Wojnar, Department of Psychiatry, Medical University of Warsaw, Warsaw, Poland; Department of Psychiatry, University of Michigan, Ann Arbor, MI, USA
- Artur Marchewka, Laboratory of Brain Imaging (LOBI), Nencki Institute of Experimental Biology, Polish Academy of Sciences, Pasteur 3, 02-093, Warsaw, Poland
26
Laurita AC, DuPre E, Ebner NC, Turner GR, Spreng RN. Default network interactivity during mentalizing about known others is modulated by age and social closeness. Soc Cogn Affect Neurosci 2020; 15:537-549. [PMID: 32399555 PMCID: PMC7328027 DOI: 10.1093/scan/nsaa067]
Abstract
In young adults, mentalizing about known others engages the default network, with differential brain response modulated by social closeness. While the functional integrity of the default network changes with age, few studies have investigated how these changes impact the representation of known others, across levels of closeness. Young (N = 29, 16 females) and older (N = 27, 12 females) adults underwent functional magnetic resonance imaging (fMRI) scanning while making trait judgments for social others varying in closeness. Multivariate analyses (partial least squares) identified default network activation for trait judgments across both age cohorts. For young adults, romantic partner and self-judgments differed from other levels of social closeness and were associated with activity in default and salience networks. In contrast, default network interactivity was not modulated by social closeness for older adults. In two functional connectivity analyses, both age groups demonstrated connectivity between dorsal and ventral medial prefrontal cortex and other default network regions during trait judgments. However older, but not young, adults also showed increased functional coupling between medial and lateral prefrontal brain regions that did not vary by category of known other. Mentalizing about others engages default and frontal brain regions in older adulthood, and this coupling is poorly modulated by social closeness.
Affiliation(s)
- Anne C Laurita, Health Promotion & Prevention Services, University Health Services, Princeton University, Princeton, NJ 08544, USA
- Elizabeth DuPre, Department of Neurology and Neurosurgery, Montreal Neurological Institute, McGill University, Quebec H3A 2B4, Canada
- Natalie C Ebner, Department of Psychology, University of Florida, Gainesville, FL 32611, USA; Department of Aging and Geriatric Research, Institute on Aging, University of Florida, Gainesville, FL 32611, USA; Department of Clinical and Health Psychology, Center for Cognitive Aging and Memory, University of Florida, Gainesville, FL 32611, USA
- Gary R Turner, Department of Psychology, York University, Toronto, ON M3J 1P3, Canada
- R Nathan Spreng, Department of Neurology and Neurosurgery, Montreal Neurological Institute, McGill University, Quebec H3A 2B4, Canada; Department of Psychiatry, McGill University, Quebec H3A 2B4, Canada; Department of Psychology, McGill University, Quebec H3A 2B4, Canada; McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Quebec H3A 2B4, Canada
27
Strickland-Hughes CM, Dillon KE, West RL, Ebner NC. Own-age bias in face-name associations: Evidence from memory and visual attention in younger and older adults. Cognition 2020; 200:104253. [DOI: 10.1016/j.cognition.2020.104253]
28
Pehlivanoglu D, Myers E, Ebner NC. Tri-Phasic Model of Oxytocin (TRIO): A systematic conceptual review of oxytocin-related ERP research. Biol Psychol 2020; 154:107917. [PMID: 32512020 PMCID: PMC7556712 DOI: 10.1016/j.biopsycho.2020.107917]
Abstract
BACKGROUND The neuropeptide oxytocin (OT) has been shown to play a role in a variety of cognitive and social processes, and different hypotheses have been put forth to explain OT's effects on brain and behavior in humans. However, these previous explanatory accounts do not provide information about OT-related temporal modulation in the brain. OBJECTIVES This paper systematically reviewed intranasal OT administration studies employing event-related potentials (ERPs) and synthesized the existing evidence into a novel conceptual framework. METHODS Empirical studies, published until February 2020 and cited in major databases (EBSCOhost, PubMed, and Web of Science), were examined in accordance with PRISMA guidelines. To be included, studies had to: (i) employ intranasal administration of OT as the chemical modulator; (ii) measure ERPs; (iii) be peer-reviewed journal articles; (iv) be written in English; and (v) examine human participants. RESULTS The search criteria yielded 17 empirical studies. The systematic review resulted in conceptualization of the Tri-Phasic Model of Oxytocin (TRIO), which builds on three processing stages: (i) perception, (ii) selection, and (iii) evaluation. While OT increases attention irrespective of stimulus characteristics in the perception stage, in the selection and evaluation stages, OT acts as a filter to guide attention selectively towards social over non-social stimuli and modulates prosociality/approach motivation associated with social stimuli. CONCLUSIONS TRIO offers an empirically derived conceptual framework that can guide the study of OT-related modulation of attentional processes, starting very early in the processing stream. This novel account furthers theoretical understanding and informs empirical investigation into OT modulation in the brain.
Affiliation(s)
- Didem Pehlivanoglu, Department of Psychology, University of Florida, 945 Center Dr, Gainesville, FL 32603, United States
- Elisha Myers, University of Florida
- Natalie C Ebner, University of Florida
Collapse
29
Hauschild KM, Felsman P, Keifer CM, Lerner MD. Evidence of an Own-Age Bias in Facial Emotion Recognition for Adolescents With and Without Autism Spectrum Disorder. Front Psychiatry 2020; 11:428. [PMID: 32581859 PMCID: PMC7286307 DOI: 10.3389/fpsyt.2020.00428] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/05/2019] [Accepted: 04/27/2020] [Indexed: 12/15/2022] Open
Abstract
A common interpretation of the face-processing deficits associated with autism spectrum disorder (ASD) is that they arise from a failure to develop normative levels of perceptual expertise. One indicator of perceptual expertise for faces is the own-age bias, operationalized as a processing advantage for faces of one's own age, presumably due to more frequent contact and experience. This effect is especially evident in domains of face recognition memory but less commonly investigated in social-emotional expertise (e.g., facial emotion recognition; FER), where individuals with ASD have shown consistent deficits. In the present study, we investigated whether a FER task would elicit an own-age bias for individuals with and without ASD and explored how the magnitude of an own-age bias may differ as a function of ASD status and symptoms. Ninety-two adolescents (63 male) between the ages of 11 and 14 years completed the child- and adult-face subtests of a standardized FER task. Overall FER accuracy was found to differ by ASD severity, reflecting poorer performance for those with increased symptoms. Results also indicated that an own-age bias was evident, reflecting greater FER performance for child compared to adult faces, for all adolescents regardless of ASD status or symptoms. However, the strength of the observed own-age bias did not differ by ASD status or severity. Findings suggest that face processing abilities of adolescents with ASD may be influenced by experience with specific categories of stimuli, similar to their typically developing peers.
Affiliation(s)
- Kathryn M. Hauschild, Social Competence and Treatment Laboratory, Department of Psychology, Stony Brook University, Stony Brook, NY, United States
- Peter Felsman, Social Competence and Treatment Laboratory, Department of Psychology, Stony Brook University, Stony Brook, NY, United States; Alan Alda Center for Communicating Science, Stony Brook University, Stony Brook, NY, United States
- Cara M. Keifer, Social Competence and Treatment Laboratory, Department of Psychology, Stony Brook University, Stony Brook, NY, United States
- Matthew D. Lerner, Social Competence and Treatment Laboratory, Department of Psychology, Stony Brook University, Stony Brook, NY, United States; Department of Psychology, University of Virginia, Charlottesville, VA, United States
30
Recognizing a missing senior citizen in relation to experience with the elderly, demographic characteristics, and personality variables. CURRENT PSYCHOLOGY 2020. [DOI: 10.1007/s12144-019-00499-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
31
Rollins L, Olsen A, Evans M. Social categorization modulates own-age bias in face recognition and ERP correlates of face processing. Neuropsychologia 2020; 141:107417. [PMID: 32135182 DOI: 10.1016/j.neuropsychologia.2020.107417] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/18/2019] [Revised: 01/27/2020] [Accepted: 02/29/2020] [Indexed: 10/24/2022]
Abstract
The aim of the present study was to further understanding of how social categorization influences face recognition. According to the categorization-individuation model, face recognition can either be biased toward categorization or individuation. We hypothesized that the face recognition bias associated with a social category (e.g., the own-age bias) would be larger when faces were initially categorized according to that category. To examine this hypothesis, young adults (N = 63) completed a face recognition task after either making age or sex judgments while encoding child and adult faces. Young adults showed the own-age and own-sex biases in face recognition. Consistent with our hypothesis, the magnitude of the own-age bias in face recognition was larger when individuals made age, rather than sex, judgments at encoding. To probe the mechanisms underlying this effect, we examined ERP responses to child and adult faces across the social categorization conditions. Neither the P1 nor the N170 ERP components were modulated by the social categorization task or the social category membership of the face. However, the P2, which is associated with second-order configural processing, was larger to adult faces than child faces only in the age categorization condition. The N250, which is associated with individuation, was larger (i.e., more negative) to adult than child faces and during age categorization than sex categorization. These results are interpreted within the context of the categorization-individuation model and current research on biases in face recognition.
Affiliation(s)
- Leslie Rollins, Department of Psychology, Christopher Newport University, Newport News, VA, USA
- Aubrey Olsen, Department of Psychology, Christopher Newport University, Newport News, VA, USA
- Megan Evans, Department of Psychology, Christopher Newport University, Newport News, VA, USA
32
Yang T, Di Bernardi Luft C, Sun P, Bhattacharya J, Banissy MJ. Investigating Age-Related Neural Compensation During Emotion Perception Using Electroencephalography. Brain Sci 2020; 10:brainsci10020061. [PMID: 31979321 PMCID: PMC7071462 DOI: 10.3390/brainsci10020061] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/26/2019] [Accepted: 01/21/2020] [Indexed: 11/29/2022] Open
Abstract
Previous research suggests declines in emotion perception in older as compared to younger adults, but the underlying neural mechanisms remain unclear. Here, we address this by investigating how “face-age” and “face emotion intensity” affect both younger and older participants’ behavioural and neural responses using event-related potentials (ERPs). Sixteen young and fifteen older adults viewed and judged the emotion type of facial images with old or young face-age and with high- or low-emotion intensities while EEG was recorded. The ERP results revealed that young and older participants exhibited significant ERP differences in two neural clusters: the left frontal and centromedial regions (100–200 ms after stimulus onset) and the frontal region (250–900 ms) when perceiving neutral faces. Older participants also exhibited significantly larger ERPs within these two neural clusters during anger and happiness emotion perception tasks. However, while this pattern of activity supported neutral emotion processing, it was not sufficient to support the effective processing of facial expressions of anger and happiness, as older adults showed reductions in performance when perceiving these emotions. These age-related changes are consistent with theoretical models of age-related changes in neurocognitive abilities and may reflect a general age-related cognitive neural compensation in older adults, rather than a specific emotion-processing neural compensation.
Affiliation(s)
- Tao Yang, Department of Psychology, Tsinghua University, Beijing 100084, China; Department of Psychology, Goldsmiths, University of London, London SE14 6NW, UK (corresponding author)
- Pei Sun, Department of Psychology, Tsinghua University, Beijing 100084, China
- Joydeep Bhattacharya, Department of Psychology, Goldsmiths, University of London, London SE14 6NW, UK
- Michael J. Banissy, Department of Psychology, Goldsmiths, University of London, London SE14 6NW, UK
33
Chung KM, Kim S, Jung WH, Kim Y. Development and Validation of the Yonsei Face Database (YFace DB). Front Psychol 2019; 10:2626. [PMID: 31849755 PMCID: PMC6901828 DOI: 10.3389/fpsyg.2019.02626] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/04/2019] [Accepted: 11/07/2019] [Indexed: 12/13/2022] Open
Abstract
The purposes of this study were to develop the Yonsei Face Database (YFace DB), consisting of both static and dynamic face stimuli for six basic emotions (happiness, sadness, anger, surprise, fear, and disgust), and to test its validity. The database includes selected pictures (static stimuli) and film clips (dynamic stimuli) of 74 models (50% female) aged between 19 and 40. One thousand four hundred and eighty selected pictures and film clips were assessed for accuracy, intensity, and naturalness during the validation procedure by 221 undergraduate students. The overall accuracy of the pictures was 76%. Film clips had a higher accuracy of 83%; the highest accuracy was observed for happiness and the lowest for fear across all conditions (static with mouth open or closed, or dynamic). The accuracy was higher in film clips across all emotions but happiness and disgust, while the naturalness was higher in the pictures than in film clips except for sadness and anger. The intensity varied the most across conditions and emotions. Significant gender effects were found in perception accuracy for the gender of both models and raters. Male raters perceived surprise more accurately in static stimuli with mouth open and in dynamic stimuli, while female raters perceived fear more accurately in all conditions. Moreover, sadness and anger expressed in static stimuli with mouth open and fear expressed in dynamic stimuli were perceived more accurately when models were male. Disgust expressed in static stimuli with mouth open and dynamic stimuli, and fear expressed in static stimuli with mouth closed, were perceived more accurately when models were female. The YFace DB is the largest Asian face database by far and the first to include both static and dynamic facial expression stimuli, and the current study can provide researchers with a wealth of information about the validity of each stimulus through the validation procedure.
Affiliation(s)
- Kyong-Mee Chung, Department of Psychology, Yonsei University, Seoul, South Korea
- Soojin Kim, Department of Psychology, Yonsei University, Seoul, South Korea
- Woo Hyun Jung, Department of Psychology, Chungbuk National University, Cheongju, South Korea
- Yeunjoo Kim, Department of Psychology, University of California, Berkeley, Berkeley, CA, United States
34
Derya D, Kang J, Kwon DY, Wallraven C. Facial Expression Processing Is Not Affected by Parkinson's Disease, but by Age-Related Factors. Front Psychol 2019; 10:2458. [PMID: 31798486 PMCID: PMC6868040 DOI: 10.3389/fpsyg.2019.02458] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/05/2019] [Accepted: 10/17/2019] [Indexed: 11/20/2022] Open
Abstract
The question of whether facial expression processing may be impaired in Parkinson’s disease (PD) patients has so far yielded equivocal results: existing studies, however, have focused on testing expression processing in recognition tasks with static images of six standard, emotional facial expressions. Given that non-verbal communication contains both emotional and non-emotional, conversational expressions and that input to the brain is usually dynamic, here we address the question of potential facial expression processing differences in a novel format: we test a range of conversational and emotional, dynamic facial expressions in three groups – PD patients (n = 20), age- and education-matched older healthy controls (n = 20), and younger adult healthy controls (n = 20). This setup allows us to address both effects of PD and age-related differences. We employed a rating task for all groups in which 12 rating dimensions were used to assess evaluative processing of 27 expression videos from six different actors. We found that ratings overall were consistent across groups, with several rating dimensions (such as arousal or outgoingness) having a strong correlation with the expressions’ motion energy content as measured by optic flow analysis. Most importantly, we found that the PD group did not differ in any rating dimension from the older healthy control group (HCG), indicating highly similar evaluative processing. Both older groups, however, did show significant differences on several rating scales in comparison with the younger adult HCG. Looking more closely, older participants rated negative expressions as more positive than the younger participants did, but also as less natural, persuasive, empathic, and sincere. We interpret these findings in the context of the positivity effect and in-group processing advantages. Overall, our findings do not support strong processing deficits due to PD, but rather point to age-related differences in facial expression processing.
Affiliation(s)
- Dilara Derya, Department of Brain and Cognitive Engineering, Korea University, Seoul, South Korea
- June Kang, Department of Brain and Cognitive Engineering, Korea University, Seoul, South Korea
- Do-Young Kwon, Department of Neurology, Korea University Ansan Hospital, Korea University College of Medicine, Ansan-si, South Korea
- Christian Wallraven, Department of Brain and Cognitive Engineering, Korea University, Seoul, South Korea; Department of Artificial Intelligence, Korea University, Seoul, South Korea
35
Abstract
Recent applications of eye tracking for diagnosis, prognosis and follow-up of therapy in age-related neurological or psychological deficits have been reviewed. The review is focused on active aging, neurodegeneration and cognitive impairments. The potential impacts and current limitations of using characterizing features of eye movements and pupillary responses (oculometrics) as objective biomarkers in the context of aging are discussed. A closer look into the findings, especially with respect to cognitive impairments, suggests that eye tracking is an invaluable technique to study hidden aspects of aging that have not been revealed using any other noninvasive tool. Future research should involve a wider variety of oculometrics, in addition to saccadic metrics and pupillary responses, including nonlinear and combinatorial features as well as blink- and fixation-related metrics to develop biomarkers to trace age-related irregularities associated with cognitive and neural deficits.
Affiliation(s)
- Ramtin Z Marandi, Department of Health Science & Technology, Aalborg University, Aalborg E 9220, Denmark
- Parisa Gazerani, Department of Health Science & Technology, Aalborg University, Aalborg E 9220, Denmark
36
Abbruzzese L, Magnani N, Robertson IH, Mancuso M. Age and Gender Differences in Emotion Recognition. Front Psychol 2019; 10:2371. [PMID: 31708832 PMCID: PMC6819430 DOI: 10.3389/fpsyg.2019.02371] [Citation(s) in RCA: 63] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/19/2019] [Accepted: 10/04/2019] [Indexed: 12/19/2022] Open
Abstract
Background Existing literature suggests that age affects recognition of affective facial expressions. Eye-tracking studies have highlighted that age-related differences in recognition of emotions could be explained by different face exploration patterns due to attentional impairment. Gender also seems to play a role in recognition of emotions. Unfortunately, little is known about the differences in emotion perception abilities across the lifespan for men and women, even though females show greater ability from infancy. Objective The present study aimed to examine the role of age and gender in facial emotion recognition in relation to neuropsychological functions and face exploration strategies. We also aimed to explore the associations between emotion recognition and quality of life. Methods Sixty healthy people were consecutively enrolled in the study and divided into two groups: Younger Adults and Older Adults. Participants were assessed for emotion recognition, attention abilities, frontal functioning, memory functioning, and quality of life satisfaction. During execution of the emotion recognition test using the Pictures of Facial Affect (PoFA) and a modified version of the PoFA (M-PoFA), participants’ eye movements were recorded with an eye tracker. Results Significant differences between younger and older adults were detected for fear recognition when adjusted for cognitive functioning and eye-gaze fixation characteristics. Adjusted means of fear recognition were significantly higher in the younger group than in the older group. With regard to gender effects, older females recognized identical pairs of emotions better than older males. Considering the Satisfaction Profile (SAT-P), we detected negative correlations between some dimensions (physical functioning, sleep/feeding/free time) and emotion recognition (i.e., sadness and disgust). Conclusion The current study provided novel insights into the specific mechanisms that may explain differences in emotion recognition, examining how age and gender differences can be outlined by cognitive functioning and face exploration strategies.
Affiliation(s)
- Nadia Magnani, Adult Mental Health Service, NHS-USL Tuscany South-Est, Grosseto, Italy
- Ian H Robertson, Global Brain Health Institute, Trinity College Institute of Neuroscience, Trinity College Dublin, The University of Dublin, Dublin, Ireland
- Mauro Mancuso, Tuscany Rehabilitation Clinic, Montevarchi, Italy; Physical and Rehabilitative Medicine Unit, NHS-USL Tuscany South-Est, Grosseto, Italy
37
Kang J, Derva D, Kwon DY, Wallraven C. Voluntary and spontaneous facial mimicry toward other's emotional expression in patients with Parkinson's disease. PLoS One 2019; 14:e0214957. [PMID: 30973893 PMCID: PMC6459535 DOI: 10.1371/journal.pone.0214957] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/11/2018] [Accepted: 03/23/2019] [Indexed: 01/31/2023] Open
Abstract
A "masked face", that is, decreased facial expressiveness, is considered one of the cardinal symptoms among individuals with Parkinson's disease (PD). Both spontaneous and voluntary mimicry of others' emotional expressions are essential for social communication and for emotional sharing with others. Despite many studies showing impairments in facial movements in PD in general, it is still unclear whether voluntary mimicry, spontaneous mimicry, or both are affected, and how these impairments affect patients' quality of life. We investigated whether impairments in facial movement occur for spontaneous as well as for voluntary expressions by quantitatively comparing muscle activations using surface electromyography. Dynamic facial expressions of neutral, anger, joy, and sadness were presented while activity was recorded over the corrugator and zygomatic areas. In the spontaneous condition, participants were instructed to simply watch the clips, whereas in the voluntary condition they were instructed to actively mimic the stimuli. We found that PD patients showed decreased mimicry in both spontaneous and voluntary conditions compared to a matched control group, although movement patterns for each emotion were similar in the two groups. Moreover, the decrease in mimicry did not correlate with a health-related quality of life index (PDQ), but it did correlate with a more subjective measure of general quality of life (SWB). The correlation between facial mimicry and the subjective well-being index suggests that the 'masked face' symptom deteriorates patients' quality of life in a complex way, affecting social and psychological aspects, which in turn may be linked to the increased depression risk among individuals with PD.
Affiliation(s)
- June Kang, Department of Brain and Cognitive Engineering, Korea University, Seoul, South Korea; Empathy Research Institute, Seoul, South Korea
- Dilara Derva, Department of Brain and Cognitive Engineering, Korea University, Seoul, South Korea
- Do-Young Kwon, Department of Neurology, Korea University Ansan Hospital, Ansan City, South Korea
- Christian Wallraven, Department of Brain and Cognitive Engineering, Korea University, Seoul, South Korea
38
Ziaei M, Persson J, Bonyadi MR, Reutens DC, Ebner NC. Amygdala functional network during recognition of own-age vs. other-age faces in younger and older adults. Neuropsychologia 2019; 129:10-20. [PMID: 30876765 DOI: 10.1016/j.neuropsychologia.2019.03.003] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/15/2018] [Revised: 02/19/2019] [Accepted: 03/07/2019] [Indexed: 12/30/2022]
Abstract
Facial cues, such as a person's age, provide important information for social interactions. Processing such facial cues can be affected by observer bias. However, there is currently no consensus regarding how the brain processes facial cues related to age, and whether facial age processing changes as a function of the age of the observer (i.e., own-age bias). The primary study aim was to investigate functional networks involved in processing own-age vs. other-age faces among younger and older adults and to determine how emotional expression of the face modulates own-age vs. other-age face processing. The secondary study aim was to examine the relation between higher social cognitive processes (i.e., empathy) and modulation of brain activity by facial age and emotional expression. During functional magnetic resonance imaging (fMRI), younger and older participants were asked to recognize happy, angry, and neutral expressions in own-age and other-age faces. Functional connectivity analyses with the amygdala as seed showed that for own-age faces both age groups recruited a network of regions, including the anterior cingulate and anterior insula, involved in empathy and detection of salient information. Brain-behavior analyses furthermore showed that empathic responses in younger, but not in older, participants were positively correlated with engagement of the medial prefrontal cortex during processing of angry own-age faces. These findings identify the neurobehavioral correlates of facial age processing and its modulation by emotion expression, and directly link facial cue processing to higher-order social cognitive functioning.
Affiliation(s)
- Maryam Ziaei, Centre for Advanced Imaging, The University of Queensland, Brisbane, Australia; School of Psychology, The University of Queensland, Brisbane, Australia
- Jonas Persson, Aging Research Center, Karolinska Institute and Stockholm University, Stockholm, Sweden
- David C Reutens, School of Psychology, The University of Queensland, Brisbane, Australia
- Natalie C Ebner, Department of Psychology, University of Florida, Florida, USA; Department of Aging and Geriatric Research, Institute on Aging, University of Florida, Florida, USA; Center for Cognitive Aging and Memory, Department of Clinical and Health Psychology, University of Florida, Gainesville, FL, USA
39
Blanke ES, Riediger M. Reading thoughts and feelings in other people: Empathic accuracy across adulthood. PROGRESS IN BRAIN RESEARCH 2019; 247:305-327. [DOI: 10.1016/bs.pbr.2019.02.002] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/10/2023]
40
Tu YZ, Lin DW, Suzuki A, Goh JOS. East Asian Young and Older Adult Perceptions of Emotional Faces From an Age- and Sex-Fair East Asian Facial Expression Database. Front Psychol 2018; 9:2358. [PMID: 30555382 PMCID: PMC6281963 DOI: 10.3389/fpsyg.2018.02358] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2018] [Accepted: 11/10/2018] [Indexed: 11/21/2022] Open
Abstract
There is increasing interest in clarifying how different face emotion expressions are perceived by people from different cultures, of different ages, and of different sexes. However, the scant availability of well-controlled emotional face stimuli from non-Western populations limits the evaluation of cultural differences in face emotion perception and how this might be modulated by age and sex differences. We present a database of East Asian face expression stimuli, enacted by young and older, male and female Taiwanese using the Facial Action Coding System (FACS). Combined with a prior database, this present database consists of 90 identities with happy, sad, angry, fearful, disgusted, surprised, and neutral expressions, amounting to 628 photographs. Twenty young and 24 older East Asian raters scored the photographs for intensities of multiple dimensions of emotion and induced affect. Multivariate analyses characterized the dimensionality of perceived emotions and quantified effects of age and sex. We also applied commercial software to extract computer-based metrics of emotions in the photographs. Taiwanese raters perceived happy faces as one category; sad, angry, and disgusted expressions as one category; and fearful and surprised expressions as one category. Younger females were more sensitive to face emotions than younger males. Whereas older males showed reduced face emotion sensitivity, older females' sensitivity was similar to or accentuated relative to that of young females. Commercial software dissociated six emotions according to the FACS, demonstrating that defining visual features were present. Our findings show that East Asians perceive a different dimensionality of emotions than Western-based definitions in face recognition software, regardless of age and sex. Critically, stimuli with detailed cultural norms are indispensable in interpreting neural and behavioral responses involving human facial expression processing. To this end, we add to the tools, which are available upon request, for conducting such research.
Affiliation(s)
- Yu-Zhen Tu, Graduate Institute of Brain and Mind Sciences, College of Medicine, National Taiwan University, Taipei, Taiwan
- Dong-Wei Lin, Graduate Institute of Brain and Mind Sciences, College of Medicine, National Taiwan University, Taipei, Taiwan
- Atsunobu Suzuki, Department of Psychology, Graduate School of Humanities and Sociology, The University of Tokyo, Tokyo, Japan
- Joshua Oon Soo Goh, Graduate Institute of Brain and Mind Sciences, College of Medicine, National Taiwan University, Taipei, Taiwan; Department of Psychology, College of Science, National Taiwan University, Taipei, Taiwan; Neurobiological and Cognitive Science Center, National Taiwan University, Taipei, Taiwan; Center for Artificial Intelligence and Advanced Robotics, National Taiwan University, Taipei, Taiwan
41
Calvo MG, Fernández-Martín A, Gutiérrez-García A, Lundqvist D. Selective eye fixations on diagnostic face regions of dynamic emotional expressions: KDEF-dyn database. Sci Rep 2018; 8:17039. [PMID: 30451919 PMCID: PMC6242984 DOI: 10.1038/s41598-018-35259-w] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/16/2018] [Accepted: 10/28/2018] [Indexed: 12/20/2022] Open
Abstract
Prior research using static facial stimuli (photographs) has identified diagnostic face regions (i.e., functional for recognition) of emotional expressions. In the current study, we aimed to determine attentional orienting, engagement, and time course of fixation on diagnostic regions. To this end, we assessed the eye movements of observers inspecting dynamic expressions that changed from a neutral to an emotional face. A new stimulus set (KDEF-dyn) was developed, which comprises 240 video-clips of 40 human models portraying six basic emotions (happy, sad, angry, fearful, disgusted, and surprised). For validation purposes, 72 observers categorized the expressions while gaze behavior was measured (probability of first fixation, entry time, gaze duration, and number of fixations). Specific visual scanpath profiles characterized each emotional expression: The eye region was looked at earlier and longer for angry and sad faces; the mouth region, for happy faces; and the nose/cheek region, for disgusted faces; the eye and the mouth regions attracted attention in a more balanced manner for surprise and fear. These profiles reflected enhanced selective attention to expression-specific diagnostic face regions. The KDEF-dyn stimuli and the validation data will be available to the scientific community as a useful tool for research on emotional facial expression processing.
Affiliation(s)
- Manuel G Calvo, Department of Cognitive Psychology, Universidad de La Laguna, Tenerife, Spain; Instituto Universitario de Neurociencia (IUNE), Universidad de La Laguna, Tenerife, Spain
42
Birmingham E, Svärd J, Kanan C, Fischer H. Exploring emotional expression recognition in aging adults using the Moving Window Technique. PLoS One 2018; 13:e0205341. [PMID: 30335767 PMCID: PMC6193651 DOI: 10.1371/journal.pone.0205341] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/29/2018] [Accepted: 09/24/2018] [Indexed: 11/22/2022] Open
Abstract
Adult aging is associated with difficulties in recognizing negative facial expressions such as fear and anger. However, happiness and disgust recognition is generally found to be less affected. Eye-tracking studies indicate that the diagnostic features of fearful and angry faces are situated in the upper regions of the face (the eyes), and for happy and disgusted faces in the lower regions (nose and mouth). These studies also indicate age-differences in visual scanning behavior, suggesting a role for attention in emotion recognition deficits in older adults. However, because facial features can be processed extrafoveally, and expression recognition occurs rapidly, eye-tracking has been questioned as a measure of attention during emotion recognition. In this study, the Moving Window Technique (MWT) was used as an alternative to the conventional eye-tracking technology. By restricting the visual field to a moveable window, this technique provides a more direct measure of attention. We found a strong bias to explore the mouth across both age groups. Relative to young adults, older adults focused less on the left eye, and marginally more on the mouth and nose. Despite these different exploration patterns, older adults were most impaired in recognition accuracy for disgusted expressions. Correlation analysis revealed that among older adults, more mouth exploration was associated with faster recognition of both disgusted and happy expressions. As a whole, these findings suggest that in aging there are both attentional differences and perceptual deficits contributing to less accurate emotion recognition.
Affiliation(s)
- Elina Birmingham
- Faculty of Education, Simon Fraser University, Burnaby, BC, Canada
- Joakim Svärd
- Department of Psychology, Stockholm University, Stockholm, Sweden
- Christopher Kanan
- Chester F. Carlson Center for Imaging Science, Rochester Institute of Technology, Rochester, NY, United States of America
- Håkan Fischer
- Department of Psychology, Stockholm University, Stockholm, Sweden
43
Smith ML, Grühn D, Bevitt A, Ellis M, Ciripan O, Scrimgeour S, Papasavva M, Ewing L. Transmitting and decoding facial expressions of emotion during healthy aging: More similarities than differences. J Vis 2018; 18:10. [PMID: 30208429 DOI: 10.1167/18.9.10] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022] Open
Abstract
Older adults tend to perform more poorly than younger adults on emotional expression identification tasks. The goal of the present study was to test a processing mechanism that might explain these differences in emotion recognition: specifically, age-related variation in the utilization of specific visual cues. Seventeen younger and 17 older adults completed a reverse-correlation emotion categorization task (the Bubbles paradigm), consisting of a large number of trials in each of which only part of the visual information conveying an emotional facial expression was revealed to participants. The task allowed us to pinpoint the visual features each group used systematically to correctly recognize the emotional expressions shown. To address the possibility that faces of different age groups are processed differently by younger and older adults, we included younger, middle-aged, and older adult face models displaying happy, fearful, angry, disgusted, and sad facial expressions. Our results reveal strong similarity in the visual information younger and older adult participants used to decode emotional expressions from faces across ages, particularly for happy and fearful expressions. These findings suggest that age-related differences in strategic information use are unlikely to contribute to the decline of facial expression recognition skills observed in later life.
Affiliation(s)
- Marie L Smith
- School of Psychological Sciences, Birkbeck College, University of London, London, UK
- Daniel Grühn
- Department of Psychology, North Carolina State University, Raleigh, NC, USA
- Ann Bevitt
- School of Psychological Sciences, Birkbeck College, University of London, London, UK
- Mark Ellis
- School of Psychological Sciences, Birkbeck College, University of London, London, UK
- Oana Ciripan
- School of Psychological Sciences, Birkbeck College, University of London, London, UK
- Susan Scrimgeour
- School of Psychological Sciences, Birkbeck College, University of London, London, UK
- Michael Papasavva
- School of Psychological Sciences, Birkbeck College, University of London, London, UK
- Louise Ewing
- School of Psychological Sciences, Birkbeck College, University of London, London, UK; School of Psychology, University of East Anglia, Norwich, UK; Australian Research Council Centre of Excellence in Cognition and its Disorders, School of Psychology, University of Western Australia, Crawley, Western Australia, Australia
44
Gonçalves AR, Fernandes C, Pasion R, Ferreira-Santos F, Barbosa F, Marques-Teixeira J. Effects of age on the identification of emotions in facial expressions: a meta-analysis. PeerJ 2018; 6:e5278. [PMID: 30065878 PMCID: PMC6064197 DOI: 10.7717/peerj.5278] [Citation(s) in RCA: 53] [Impact Index Per Article: 7.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2018] [Accepted: 06/27/2018] [Indexed: 11/20/2022] Open
Abstract
Background: Emotion identification is a fundamental component of social cognition. Although it is well established that a general cognitive decline occurs with advancing age, the effects of age on emotion identification are still unclear. A meta-analysis by Ruffman and colleagues (2008) explored this issue, but much research has been published since then, reporting inconsistent findings. Methods: To examine age differences in the identification of facial expressions of emotion, we conducted a meta-analysis of 24 empirical studies (N = 1,033 older adults, N = 1,135 younger adults) published after 2008. Additionally, a meta-regression analysis was conducted to identify potential moderators. Results: Older adults identify facial expressions of anger, sadness, fear, surprise, and happiness less accurately than younger adults, strengthening the results obtained by Ruffman et al. (2008). However, meta-regression analyses indicate that effect sizes are moderated by sample characteristics and stimulus features. Importantly, the estimated effect size for the identification of fear and disgust increased with larger differences in the number of years of formal education between the two groups. Discussion: We discuss several factors that might explain the age-related differences in emotion identification and suggest how brain changes may account for the observed pattern. Furthermore, moderator effects are interpreted and discussed.
Affiliation(s)
- Ana R Gonçalves
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
- Carina Fernandes
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal; Faculty of Medicine, Universidade do Porto, Porto, Portugal; Language Research Laboratory, Institute of Molecular Medicine, Faculty of Medicine, Universidade de Lisboa, Lisboa, Portugal
- Rita Pasion
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
- Fernando Ferreira-Santos
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
- Fernando Barbosa
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
- João Marques-Teixeira
- Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, Universidade do Porto, Porto, Portugal
45
Vetter NC, Drauschke M, Thieme J, Altgassen M. Adolescent Basic Facial Emotion Recognition Is Not Influenced by Puberty or Own-Age Bias. Front Psychol 2018; 9:956. [PMID: 29977212 PMCID: PMC6022279 DOI: 10.3389/fpsyg.2018.00956] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/15/2017] [Accepted: 05/24/2018] [Indexed: 11/14/2022] Open
Abstract
Basic facial emotion recognition is suggested to be negatively affected by puberty onset reflected in a “pubertal dip” in performance compared to pre- or post-puberty. However, findings remain inconclusive. Further, research points to an own-age bias, i.e., a superior emotion recognition for peer faces. We explored adolescents’ ability to recognize specific emotions. Ninety-five children and adolescents, aged 8–17 years, judged whether the emotions displayed by adolescent or adult faces were angry, sad, neutral, or happy. We assessed participants a priori by pubertal status while controlling for age. Results indicated no “pubertal dip”, but decreasing reaction times across adolescence. No own-age bias was found. Taken together, basic facial emotion recognition does not seem to be disrupted during puberty as compared to pre- and post-puberty.
Affiliation(s)
- Nora C Vetter
- Department of Child and Adolescent Psychiatry, Faculty of Medicine, Technische Universität Dresden, Dresden, Germany; Department of Psychiatry and Neuroimaging Center, Technische Universität Dresden, Dresden, Germany; Department of Psychology, Technische Universität Dresden, Dresden, Germany; Department of Psychology, Bergische Universität Wuppertal, Wuppertal, Germany
- Mandy Drauschke
- Department of Psychology, Technische Universität Dresden, Dresden, Germany
- Juliane Thieme
- Department of Psychology, Technische Universität Dresden, Dresden, Germany
- Mareike Altgassen
- Department of Psychology, Technische Universität Dresden, Dresden, Germany; Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
46
Kynast J, Schroeter ML. Sex, Age, and Emotional Valence: Revealing Possible Biases in the 'Reading the Mind in the Eyes' Task. Front Psychol 2018; 9:570. [PMID: 29755385 PMCID: PMC5932406 DOI: 10.3389/fpsyg.2018.00570] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/28/2017] [Accepted: 04/04/2018] [Indexed: 12/19/2022] Open
Abstract
The 'Reading the Mind in the Eyes' test (RMET) assesses a specific socio-cognitive ability, i.e., the ability to identify mental states from gaze. The development of this ability across the lifespan is of special interest. Whereas former investigations were limited mainly to childhood and adolescence, the focus has recently shifted toward aging and toward psychiatric and neurodegenerative diseases. Although the RMET is frequently applied in developmental psychology and clinical settings, stimulus characteristics have never been investigated with respect to potential effects on test performance. Here, we analyzed the RMET stimulus set with a special focus on interrelations between sex, age, and emotional valence. Forty-three persons rated the age and emotional valence of the RMET picture set. Differences in emotional valence and age ratings between male and female items were analyzed. The linear relation between age and emotional valence was tested over all items, and separately for male and female items. Male items were rated as older and more negative than female stimuli. Regarding male RMET items, age predicted emotional valence: older age was associated with negative emotions. In contrast, age and valence were not linearly related for female pictures. All ratings were independent of rater characteristics. Our results demonstrate a strong confound between sex, age, and emotional valence in the RMET. Male items presented greater variability in age ratings than female items. Age and emotional valence were negatively associated among male items, but no significant association was found among female stimuli. As personal attributes impact social information processing, our results may add a new perspective on the interpretation of previous findings on interindividual differences in RMET accuracy, particularly in the field of developmental psychology and age-associated neuropsychiatric diseases. A revision of the RMET may be warranted to overcome the confounds identified here.
Affiliation(s)
- Jana Kynast
- Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Matthias L Schroeter
- Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Day Clinic for Cognitive Neurology, University Hospital Leipzig - University of Leipzig, Leipzig, Germany
47
Abstract
Prior research has shown that loneliness is associated with hypervigilance to social threats, with eye-tracking research showing that lonely people display a specific attentional bias when viewing social rejection and social exclusion video footage (Bangee, Harris, Bridges, Rotenberg & Qualter, 2014; Qualter, Rotenberg, Barrett et al., 2013). The current study used eye-tracking methodology to examine whether that attentional bias extends to negative emotional faces and to negative but non-rejecting social stimuli, or whether it is specific to social rejection/exclusion. It is important to establish whether loneliness relates to a specific or a general attentional bias because this may explain the maintenance of loneliness. Participants (N = 43, F = 35, Mage = 20 years and 2 months, SD = 3 months) took part in three tasks, in which they viewed different social information: Task 1, slides displaying four faces, each with a different emotion (angry, afraid, happy, and neutral); Task 2, slides displaying sixteen faces with varying ratios expressing happiness and anger; and Task 3, slides displaying four visual scenes (socially rejecting, physically threatening, socially positive, and neutral). For all three tasks, eye movements were recorded in real time with an eye-tracker. Results showed no association between loneliness and viewing patterns for facial expressions, but an association between loneliness and hypervigilant viewing of socially rejecting stimuli. The findings indicate that lonely adults do not have a generalised hypervigilance to social threat, but have, instead, a specific attentional bias to rejection information in social contexts. Implications of the findings for interventions are discussed.
Affiliation(s)
- Munirah Bangee
- School of Nursing, Faculty of Health and Wellbeing, University of Central Lancashire, Preston, Lancashire, England, UK
- Pamela Qualter
- Institute of Education, University of Manchester, Oxford Road, Manchester, England, UK
48
Sen A, Isaacowitz D, Schirmer A. Age differences in vocal emotion perception: on the role of speaker age and listener sex. Cogn Emot 2017; 32:1189-1204. [DOI: 10.1080/02699931.2017.1393399] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Affiliation(s)
- Antarika Sen
- Neurobiology and Aging Programme, National University of Singapore, Singapore, Singapore
- Annett Schirmer
- Department of Psychology, The Chinese University of Hong Kong, Hong Kong, Hong Kong
- The Mind and Brain Institute, The Chinese University of Hong Kong, Hong Kong, Hong Kong
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
49
Social perception and aging: The relationship between aging and the perception of subtle changes in facial happiness and identity. Acta Psychol (Amst) 2017; 179:23-29. [PMID: 28697480 DOI: 10.1016/j.actpsy.2017.06.006] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/20/2016] [Revised: 05/19/2017] [Accepted: 06/24/2017] [Indexed: 11/21/2022] Open
Abstract
Previous findings suggest that older adults show impairments in the social perception of faces, including the perception of emotion and facial identity. The majority of this work has tended to examine performance on tasks involving young adult faces and prototypical emotions. While useful, this can influence performance differences between groups due to perceptual biases and limitations on task performance. Here we sought to examine how typical aging is associated with the perception of subtle changes in facial happiness and facial identity in older adult faces. We developed novel tasks that permitted the ability to assess facial happiness, facial identity, and non-social perception (object perception) across similar task parameters. We observe that aging is linked with declines in the ability to make fine-grained judgements in the perception of facial happiness and facial identity (from older adult faces), but not for non-social (object) perception. This pattern of results is discussed in relation to mechanisms that may contribute to declines in facial perceptual processing in older adulthood.
50
Sullivan S, Campbell A, Hutton SB, Ruffman T. What's good for the goose is not good for the gander: Age and gender differences in scanning emotion faces. J Gerontol B Psychol Sci Soc Sci 2017; 72:441-447. [PMID: 25969472 DOI: 10.1093/geronb/gbv033] [Citation(s) in RCA: 34] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/05/2014] [Accepted: 01/13/2015] [Indexed: 11/12/2022] Open
Abstract
Objectives: Research indicates that older adults' (≥60 years) emotion recognition is worse than that of young adults; that young and older men's emotion recognition is worse than that of young and older women, respectively; and that older adults look at mouths, relative to eyes, more than young adults do. Nevertheless, previous research has not compared older men's and women's looking at emotion faces, so the present study had two aims: (a) to examine whether the tendency to look at mouths is stronger among older men than older women, and (b) to examine whether men's mouth looking correlates with better emotion recognition. Method: We examined the emotion recognition abilities and spontaneous gaze patterns of young (n = 60) and older (n = 58) males and females as they labelled emotion faces. Results: Older men spontaneously looked more at mouths than older women did, and older men's looking at mouths correlated with their emotion recognition, whereas women's looking at eyes correlated with their emotion recognition. Discussion: The findings are discussed in relation to a growing body of research suggesting both age and gender differences in response to emotional stimuli, and the differential efficacy of mouth and eye looking for men and women.
Affiliation(s)
- Susan Sullivan
- School of Psychology, University of Sussex, Brighton, BN1 9RH, UK
- Anna Campbell
- Psychology Department, University of Otago, PO Box 56, Dunedin 9054, New Zealand
- Sam B Hutton
- School of Psychology, University of Sussex, Brighton, BN1 9RH, UK
- Ted Ruffman
- Psychology Department, University of Otago, PO Box 56, Dunedin 9054, New Zealand