1
Slessor G, Insch P, Donaldson I, Sciaponaite V, Adamowicz M, Phillips LH. Adult Age Differences in Using Information From the Eyes and Mouth to Make Decisions About Others' Emotions. J Gerontol B Psychol Sci Soc Sci 2022; 77:2241-2251. [PMID: 35948271] [PMCID: PMC9799183] [DOI: 10.1093/geronb/gbac097]
Abstract
OBJECTIVES: Older adults are often less accurate than younger adults at identifying emotions such as anger, sadness, and fear from faces. They also look less at the eyes and more at the mouth during emotion perception. The current studies advance understanding of the nature of these age effects on emotional processing.
METHODS: Younger and older participants identified emotions from pictures of eyes or mouths (Experiment 1) and from incongruent mouth-eyes emotion combinations (Experiment 2). In Experiment 3, participants categorized emotions from pictures in which face masks covered the mouth region.
RESULTS: Older adults were worse than younger adults at identifying anger and sadness from the eyes, but better at identifying the same emotions from the mouth region (Experiment 1), and they were more likely than younger adults to use information from the mouth to classify anger, fear, and disgust (Experiment 2). In Experiment 3, face masks impaired perception of anger, sadness, and fear more for older than for younger adults.
DISCUSSION: These studies indicate that older people are better able than younger people to interpret emotional information from the mouth, are more biased towards using information from the mouth, and have more difficulty with emotion perception when the mouth is covered by a face mask. This has implications for social communication in different age groups.
Affiliation(s)
- Pauline Insch
- School of Psychology, University of Aberdeen, Aberdeen, UK
- Isla Donaldson
- School of Psychology, University of Aberdeen, Aberdeen, UK
- Louise H Phillips
- Address correspondence to: Louise Phillips, PhD, School of Psychology, University of Aberdeen, Aberdeen AB24 3FX, UK.
2
Maltezou-Papastylianou C, Russo R, Wallace D, Harmsworth C, Paulmann S. Different stages of emotional prosody processing in healthy ageing–evidence from behavioural responses, ERPs, tDCS, and tRNS. PLoS One 2022; 17:e0270934. [PMID: 35862317] [PMCID: PMC9302842] [DOI: 10.1371/journal.pone.0270934]
Abstract
Past research suggests that the ability to recognise the emotional intent of a speaker decreases as a function of age. Yet few studies have looked at the underlying cause of this effect in a systematic way. This paper builds on the view that emotional prosody perception is a multi-stage process and explores which stage of the recognition process is impaired in healthy ageing, using time-sensitive event-related brain potentials (ERPs). Results suggest that early processes linked to salience detection, as reflected in the P200 component, and the initial build-up of emotional representation, as linked to a subsequent negative ERP component, are largely unaffected in healthy ageing. The two groups nonetheless differ in emotional prosody recognition: older participants recognise the emotional intentions of speakers less well than younger participants do. These findings were followed up by two neuro-stimulation studies specifically targeting the inferior frontal cortex to test whether recognition improves during active stimulation relative to sham. Overall, results suggest that neither tDCS nor high-frequency tRNS stimulation at 2 mA for 30 minutes facilitates emotional prosody recognition rates in healthy older adults.
Affiliation(s)
- Riccardo Russo
- Department of Psychology and Centre for Brain Science, University of Essex, Colchester, United Kingdom
- Department of Brain and Behavioural Sciences, Università di Pavia, Pavia, Italy
- Denise Wallace
- Department of Psychology and Centre for Brain Science, University of Essex, Colchester, United Kingdom
- Chelsea Harmsworth
- Department of Psychology and Centre for Brain Science, University of Essex, Colchester, United Kingdom
- Silke Paulmann
- Department of Psychology and Centre for Brain Science, University of Essex, Colchester, United Kingdom
3
Disorders of vocal emotional expression and comprehension: The aprosodias. Handb Clin Neurol 2021; 183:63-98. [PMID: 34389126] [DOI: 10.1016/b978-0-12-822290-4.00005-0]
4
Magnani B, Musetti A, Frassinetti F. Spatial attention and representation of time intervals in childhood. Sci Rep 2020; 10:14960. [PMID: 32917922] [PMCID: PMC7486401] [DOI: 10.1038/s41598-020-71541-6]
Abstract
Spatial attention and the spatial representation of time are strictly linked in the human brain. In young adults, a leftward shift of spatial attention induced by prismatic adaptation (PA) is associated with an underestimation of time, whereas a rightward shift is associated with an overestimation, for both visual and auditory stimuli. These results suggest a supra-modal, left-to-right-oriented representation of time that is modulated by bilateral attentional shifts. However, there is evidence of unilateral, rather than bilateral, effects of PA on time in elderly adults, suggesting an influence of age on these effects. Here we studied the effects of spatial attention on time representation in childhood. Fifty-four children aged 5 to 11 years performed a temporal bisection task with visual and auditory stimuli before and after PA inducing a leftward or a rightward attentional shift. Results showed that children underestimated time after a leftward attentional shift for both visual and auditory stimuli, whereas a rightward attentional shift had no effect on time. We discuss these results as reflecting a partial maturation of the link between spatial attention and time representation in childhood, due to immaturity either of interhemispheric interactions or of the executive functions necessary for attention to exert its full influence on time representation.
Affiliation(s)
- Barbara Magnani
- Centro INforma-MEnte, Via Brigata Reggio 32, 42124, Reggio Emilia, Italy
- Alessandro Musetti
- Department of Humanities, Social Sciences and Cultural Industries, University of Parma, Parma, Italy
- Francesca Frassinetti
- Department of Psychology, University of Bologna, Bologna, Italy
- Maugeri Clinical Scientific Institutes - IRCCS of Castel Goffredo, Castel Goffredo, Mantova, Italy
5
Wang A, Zhu S, Chen L, Luo W. Age-Related Decline of Low-Spatial-Frequency Bias in Context-Dependent Visual Size Perception. Front Psychol 2019; 10:1768. [PMID: 31417475] [PMCID: PMC6684779] [DOI: 10.3389/fpsyg.2019.01768]
Abstract
Global precedence has been found to decline, or even shift to local precedence, with increasing age. Little is known about the consequences of this age-related decline in global precedence for other aspects of older adults' vision. Global and local processing have been preferentially associated with the low-spatial-frequency (LSF) and high-spatial-frequency (HSF) channels, respectively. Here, we used low- and high-pass filtered faces together with the Ebbinghaus illusion, whose magnitude is an index of context sensitivity. The results demonstrated that, relative to HSF faces, prior exposure to LSF faces increased the illusion magnitude for younger participants but reduced it for older participants. A significant age-group difference was observed only with prior exposure to LSF faces, not to HSF faces. Moreover, similar patterns of results were observed when the filtered faces were rendered invisible by backward masking, and the magnitude of the age-related decline was comparable to the visible condition. Our study reveals that LSF-related enhancement of context sensitivity declines with advancing age, and that this decline is independent of awareness of the spatial-frequency information. Our findings support the right hemi-aging model and suggest that the magnocellular projections from subcortical to cortical regions might also be vulnerable to age-related changes.
Affiliation(s)
- Anqi Wang
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Shengnan Zhu
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Lihong Chen
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
6
Neurophysiology of spontaneous facial expressions: II. Motor control of the right and left face is partially independent in adults. Cortex 2018; 111:164-182. [PMID: 30502646] [DOI: 10.1016/j.cortex.2018.10.027]
Abstract
Facial expressions are described traditionally as monolithic or unitary entities. However, humans have the capacity to produce facial blends of emotion, in which the upper and lower face simultaneously display different expressions. Recent neuroanatomical studies in monkeys have demonstrated that there are separate cortical motor areas for controlling the upper and lower face in each hemisphere; these presumably also occur in humans. Using high-speed videography, we began measuring the movement dynamics of spontaneous facial expressions, including facial blends, to develop a more complete understanding of the neurophysiology underlying facial expressions. In our part 1 publication in Cortex (2016), we found that hemispheric motor control of the upper and lower face is overwhelmingly independent: 242 (99%) of the expressions were classified as demonstrating independent hemispheric motor control, whereas only 3 (1%) were classified as demonstrating dependent hemispheric motor control. In this companion paper (part 2), 251 unitary facial expressions that occurred on either the upper or lower face were analyzed. Of these, 164 (65%) demonstrated dependent hemispheric motor control, whereas 87 (35%) demonstrated independent or dual hemispheric motor control, indicating that some expressions represent facial blends of emotion occurring across the vertical facial axis. These findings also support the concepts that (1) spontaneous facial expressions are organized predominantly across the horizontal facial axis and secondarily across the vertical facial axis, and (2) facial expressions are complex, multi-component, motoric events. Based on the Emotion-type hypothesis of cerebral lateralization, we propose that facial expressions modulated by a primary-emotional response to an environmental event are initiated by the right hemisphere on the left side of the face, whereas facial expressions modulated by a social-emotional response are initiated by the left hemisphere on the right side of the face.
7
Boutsen FA, Dvorak JD, Pulusu VK, Ross ED. Altered saccadic targets when processing facial expressions under different attentional and stimulus conditions. Vision Res 2017; 133:150-160. [PMID: 28279711] [DOI: 10.1016/j.visres.2016.07.012]
Abstract
Depending on a subject's attentional bias, robust changes in emotional perception occur when facial blends (different emotions expressed on upper/lower face) are presented tachistoscopically. If no instructions are given, subjects overwhelmingly identify the lower facial expression when blends are presented to either visual field. If asked to attend to the upper face, subjects overwhelmingly identify the upper facial expression in the left visual field but remain slightly biased to the lower facial expression in the right visual field. The current investigation sought to determine whether differences in initial saccadic targets could help explain the perceptual biases described above. Ten subjects were presented with full and blend facial expressions under different attentional conditions. No saccadic differences were found for left versus right visual field presentations or for full facial versus blend stimuli. When asked to identify the presented emotion, saccades were directed to the lower face. When asked to attend to the upper face, saccades were directed to the upper face. When asked to attend to the upper face and try to identify the emotion, saccades were directed to the upper face but to a lesser degree. Thus, saccadic behavior supports the concept that there are cognitive-attentional pre-attunements when subjects visually process facial expressions. However, these pre-attunements do not fully explain the perceptual superiority of the left visual field for identifying the upper facial expression when facial blends are presented tachistoscopically. Hence other perceptual factors must be in play, such as the phenomenon of virtual scanning.
Affiliation(s)
- Frank A Boutsen
- Department of Communication Sciences and Disorders, University of Oklahoma Health Sciences, 1200 North Stonewall Ave., Oklahoma City, OK 73117, USA
- Justin D Dvorak
- Department of Communication Sciences and Disorders, University of Oklahoma Health Sciences, 1200 North Stonewall Ave., Oklahoma City, OK 73117, USA
- Vinay K Pulusu
- Department of Neurology, University of Oklahoma Health Sciences Center, and the VA Medical Center (127), 921 NE 13th Street, Oklahoma City, OK 73104, USA
- Elliott D Ross
- Department of Neurology, University of Oklahoma Health Sciences Center, and the VA Medical Center (127), 921 NE 13th Street, Oklahoma City, OK 73104, USA; Department of Communication Sciences and Disorders, University of Oklahoma Health Sciences, 1200 North Stonewall Ave., Oklahoma City, OK 73117, USA
8
Ross ED, Gupta SS, Adnan AM, Holden TL, Havlicek J, Radhakrishnan S. Neurophysiology of spontaneous facial expressions: I. Motor control of the upper and lower face is behaviorally independent in adults. Cortex 2016; 76:28-42. [DOI: 10.1016/j.cortex.2016.01.001]
9
Adult developmental trajectories of pseudoneglect in the tactile, visual and auditory modalities and the influence of starting position and stimulus length. Brain Cogn 2016; 103:12-22. [DOI: 10.1016/j.bandc.2015.12.001]
10
Representational pseudoneglect: a review. Neuropsychol Rev 2014; 24:148-165. [PMID: 24414221] [DOI: 10.1007/s11065-013-9245-2]
Abstract
Pseudoneglect, the tendency to be biased towards the left-hand side of space, is a robust and consistent behavioural observation, best demonstrated in the visuospatial line bisection task, in which participants are asked to bisect visually presented horizontal lines at their perceived centre. A number of studies have revealed that a representational form of pseudoneglect exists, occurring when participants are asked either to mentally represent a stimulus or to explore a stimulus using touch, in the complete absence of direct visuospatial processing. Despite the growing number of studies demonstrating representational pseudoneglect, there exists no current and comprehensive review of these findings, and no discussion of a theoretical framework into which they may fall. An important gap in the current representational pseudoneglect literature is a discussion of the developmental trajectory of the bias. The focus of the current review is to outline studies that have observed representational pseudoneglect in healthy participants, consider a theoretical framework for these observations, and address the impact of lifespan factors, such as cognitive ageing, on the phenomenon.
11
Ross ED, Shayya L, Champlain A, Monnot M, Prodan CI. Decoding facial blends of emotion: visual field, attentional and hemispheric biases. Brain Cogn 2013; 83:252-261. [PMID: 24091036] [DOI: 10.1016/j.bandc.2013.09.001]
Abstract
Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention if facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower more so than upper facial emotions are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced.
Collapse
Affiliation(s)
- Elliott D Ross
- Department of Neurology, University of Oklahoma Health Sciences Center and the VA Medical Center 127, 921 NE 13th Street, Oklahoma City, OK 73104, USA.
12
Exploring the association between pain intensity and facial display in term newborns. Pain Res Manag 2011; 16:10-12. [PMID: 21369535] [DOI: 10.1155/2011/873103]
Abstract
BACKGROUND: Facial expression is widely used to judge pain in neonates. However, little is known about the relationship between the intensity of the painful stimulus and the nature of the expression in term neonates.
OBJECTIVES: To describe differences in the movement of key facial areas between two groups of term neonates experiencing painful stimuli of different intensities.
METHODS: Video recordings from two previous studies were used to select study subjects. Four term neonates undergoing circumcision without analgesia were compared with four similar male term neonates undergoing a routine heel stick. Facial movements were measured with a computer using a previously developed 'point-pair' system that focuses on movement in areas implicated in neonatal pain expression. Measurements were expressed in pixels, standardized to percentage of individual infant face width.
RESULTS: Point pairs measuring eyebrow and eye movement were similar, as was the sum of change across the face (41.15 in the circumcision group versus 40.33 in the heel stick group). Point pair 4 (horizontal change of the mouth) was higher for the heel stick group at 9.09 versus 3.93 for the circumcision group, while point pair 5 (vertical change of the mouth) was higher for the circumcision group (23.32) than for the heel stick group (15.53).
CONCLUSION: Little difference was noted in eye and eyebrow movement between pain intensities. The mouth opened wider (vertically) in neonates experiencing the higher-pain stimulus. Qualitative differences in neonatal facial expression with pain intensity may exist, and the mouth may be an area in which to detect them. Further study of the generalizability of these findings is needed.
13
Ross ED, Monnot M. Affective prosody: What do comprehension errors tell us about hemispheric lateralization of emotions, sex and aging effects, and the role of cognitive appraisal. Neuropsychologia 2011; 49:866-877. [DOI: 10.1016/j.neuropsychologia.2010.12.024]
14
Beaton AA, Fouquet NC, Maycock NC, Platt E, Payne LS, Derrett A. Processing emotion from the eyes: a divided visual field and ERP study. Laterality 2011; 17:486-514. [PMID: 21337252] [DOI: 10.1080/1357650x.2010.517848]
Abstract
The right cerebral hemisphere is preferentially involved in recognising at least some facial expressions of emotion. We investigated whether there is a laterality effect in judging emotions from the eyes. In one task a pair of emotionally expressive eyes presented in central vision had to be physically matched to a subsequently presented set of eyes in one or other visual hemifield (eyes condition). In the second task a word was presented centrally followed by a set of eyes to left or right hemifield and the participant had to decide whether the word correctly described the emotion portrayed by the laterally presented set of eyes (word condition). Participants were a group of undergraduate students and a group of older volunteers (> 50). There was no visual hemifield difference in accuracy or raw response times in either task for either group, but log-transformed times showed an overall left hemifield advantage. Contrary to the right hemisphere ageing hypothesis, older participants showed no evidence of a relative right hemisphere decline in performance on the tasks. In the younger group mean peak amplitude of the N170 component of the EEG at lateral posterior electrode sites was significantly greater over the right hemisphere (T6/PO2) than the left (T5/PO1) in both the perceptual recognition task and the emotional judgement task. It was significantly greater for the task of judging emotion than in the eyes-matching task. In future it would be useful to combine electrophysiological techniques with lateralised visual input in studying lateralisation of emotion with older as well as younger participants.
Affiliation(s)
- Alan A Beaton
- Department of Psychology, University of Swansea, Singleton Park, Swansea, UK.
15
Boggio PS, Campanhã C, Valasek CA, Fecteau S, Pascual-Leone A, Fregni F. Modulation of decision-making in a gambling task in older adults with transcranial direct current stimulation. Eur J Neurosci 2010; 31:593-597. [PMID: 20105234] [DOI: 10.1111/j.1460-9568.2010.07080.x]
Abstract
Cognitive performance usually declines in older adults as a result of neurodegenerative processes. One of the cognitive domains usually affected is decision-making. Based on our recent findings suggesting that non-invasive brain stimulation can improve decision-making in young participants, we studied whether bifrontal transcranial direct current stimulation (tDCS) applied over the right and left prefrontal cortex of older adults can change the balance of risky and safe responses as it can in younger individuals. Twenty-eight subjects (age range 50 to 85 years) performed a gambling risk task while receiving either anodal tDCS over the right with cathodal tDCS over the left dorsolateral prefrontal cortex (DLPFC), anodal tDCS over the left with cathodal tDCS over the right DLPFC, or sham stimulation. Our main finding was a significant group effect showing that participants receiving left anodal/right cathodal stimulation chose high-risk prospects more often than participants receiving sham or right anodal/left cathodal stimulation. This result is contrary to previous findings in young subjects, suggesting that modulation of cortical activity in young and elderly adults results in opposite behavioral effects, thus supporting fundamental changes in cognitive processing in the elderly.
Affiliation(s)
- Paulo Sérgio Boggio
- Center for Health and Biological Sciences, Mackenzie Presbyterian University, Sao Paulo, Brazil.
16
Re-addressing gender bias in Cortex publications. Cortex 2009; 45:1126-1137. [DOI: 10.1016/j.cortex.2009.04.004]
17
Paulmann S, Pell MD, Kotz SA. How aging affects the recognition of emotional speech. Brain Lang 2008; 104:262-269. [PMID: 17428529] [DOI: 10.1016/j.bandl.2007.03.002]
Abstract
To successfully infer a speaker's emotional state, diverse sources of emotional information need to be decoded. The present study explored to what extent emotional speech recognition of 'basic' emotions (anger, disgust, fear, happiness, pleasant surprise, sadness) differs between different sex (male/female) and age (young/middle-aged) groups in a behavioural experiment. Participants were asked to identify the emotional prosody of a sentence as accurately as possible. As a secondary goal, the perceptual findings were examined in relation to acoustic properties of the sentences presented. Findings indicate that emotion recognition rates differ between the different categories tested and that these patterns varied significantly as a function of age, but not of sex.
Affiliation(s)
- Silke Paulmann
- Max Planck Institute for Human Cognitive and Brain Sciences, P.O. Box 500 355, 04303 Leipzig, Germany.
18
Ross ED, Prodan CI, Monnot M. Human Facial Expressions Are Organized Functionally Across the Upper-Lower Facial Axis. Neuroscientist 2007; 13:433-446. [PMID: 17901253] [DOI: 10.1177/1073858407305618]
Abstract
Most clinical research has focused on intensity differences of facial expressions between the right and left hemiface to explore lateralization of emotions in the brain. Observations by social psychologists, however, suggest that control of facial expression is organized predominantly across the upper-lower facial axis because of the phenomenon of facial blends: the simultaneous display of different emotions on the upper and lower face. Facial blends are related to social emotions and the development of display rules that allow individuals to sculpt facial expressions for social and manipulative purposes. We have demonstrated that facial blends of emotion are more easily and accurately posed across the upper-lower than the right-left hemiface, and that upper facial emotions are processed preferentially by the right hemisphere whereas lower facial emotions are processed preferentially by the left hemisphere. Based on these results, on recent anatomical studies showing separate cortical areas for motor control of the upper and lower face, and on the neurology of posed and spontaneous expressions of emotion, a functional-anatomic model of how the forebrain modulates facial expressions is presented. The unique human ability to produce facial blends of emotion is most likely an adaptive modification linked to the evolution of speech and language.
Affiliation(s)
- Elliott D Ross
- Department of Neurology, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA.