1. Kim H, Kim J. Consistent neural representation of valence in encoding and recall. Brain Cogn 2025;186:106296. PMID: 40157046. DOI: 10.1016/j.bandc.2025.106296.
Abstract
Recall elicits emotions similar to those experienced in the original event; unlike the original experience, however, it does not require external sensory stimuli to trigger them. This difference bears on a key debate in affective representation: whether the representation of valence is consistent across modalities (modality-general) or depends on the modality (modality-specific). This study examined whether neural representations of valence are consistent between encoding and recall. Using neuroimaging data from movie watching and recall (Chen et al., 2017) and behavioral valence ratings (Kim et al., 2020), we conducted a searchlight analysis with cross-participant regression-based decoding between movie watching and recall, and used multidimensional scaling to validate the searchlight results. The searchlight analysis identified the right middle temporal and inferior temporal gyri as well as the left fusiform gyrus. The validation analysis further showed significantly consistent neural representations of valence in the inferior temporal gyrus and the left fusiform gyrus. This study thus identified brain regions in which valence is consistently represented between the encoding and recall of real events. By comparing conditions rarely contrasted in prior work, these findings contribute to the debate on affective representation, suggesting that the inferior temporal gyrus supports representations of valence during both encoding and recall of naturalistic events.
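The cross-participant regression-based decoding at the heart of this design can be sketched as follows. The data below are synthetic stand-ins for the fMRI patterns and valence ratings; all sizes and variable names are illustrative assumptions, not taken from the study:

```python
import numpy as np
from sklearn.linear_model import Ridge
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Synthetic stand-ins: per-participant voxel patterns for 20 movie scenes
# (encoding) and the same scenes during recall, plus behavioral valence ratings.
n_scenes, n_voxels, n_participants = 20, 50, 8
valence = rng.uniform(-1, 1, n_scenes)                 # behavioral ratings
signal = np.outer(valence, rng.normal(size=n_voxels))  # shared valence code

encoding = [signal + rng.normal(scale=1.0, size=signal.shape)
            for _ in range(n_participants)]
recall = [signal + rng.normal(scale=1.0, size=signal.shape)
          for _ in range(n_participants)]

# Cross-participant, cross-condition decoding: train a regression model on
# everyone else's encoding data, then predict scene valence from the held-out
# participant's recall data.
scores = []
for test_p in range(n_participants):
    train_X = np.vstack([encoding[p] for p in range(n_participants) if p != test_p])
    train_y = np.tile(valence, n_participants - 1)
    model = Ridge(alpha=1.0).fit(train_X, train_y)
    predicted = model.predict(recall[test_p])
    scores.append(pearsonr(predicted, valence)[0])

# Mean prediction-rating correlation: consistency of the valence code
# between encoding and recall across participants.
mean_score = float(np.mean(scores))
```

In a real searchlight, this loop would run once per sphere of voxels, producing a whole-brain map of cross-condition decoding performance.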
Affiliation(s)
- Hyeonjung Kim: Department of Psychology, Jeonbuk National University, Republic of Korea
- Jongwan Kim: Department of Psychology, Jeonbuk National University, Republic of Korea
2. Sadeghi S, Gu Z, De Rosa E, Kuceyeski A, Anderson AK. Direct perception of affective valence from vision. Nat Commun 2024;15:10735. PMID: 39737913. PMCID: PMC11686310. DOI: 10.1038/s41467-024-53668-6.
Abstract
Subjective feelings are thought to arise from conceptual and bodily states. We examine whether the valence of feelings may also be decoded directly from objective ecological statistics of the visual environment. We train a visual valence (VV) machine learning model of low-level image statistics on nearly 8000 emotionally charged photographs. The VV model predicts human valence ratings of images and transfers even more robustly to abstract paintings. In human observers, limiting conceptual analysis of images enhances VV contributions to valence experience, increasing correspondence with machine perception of valence. In the brain, VV resides in lower to mid-level visual regions, where neural activity submitted to deep generative networks synthesizes new images containing positive versus negative VV. There are distinct modes of valence experience, one derived indirectly from meaning, and the other embedded in ecological statistics, affording direct perception of subjective valence as an apparent objective property of the external world.
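A minimal sketch of the idea behind a "visual valence" model built on low-level image statistics. The synthetic images and the particular features (luminance, contrast, edge energy) are illustrative assumptions, not the statistics used in the paper:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)

# Synthetic stand-in for the photograph set: grayscale images whose contrast
# is (by construction) tied to the valence rating we want to predict.
n_images = 200
valence = rng.uniform(-1, 1, n_images)
images = [rng.normal(scale=1.0 + 0.5 * v, size=(32, 32)) for v in valence]

def low_level_stats(img):
    """A few simple ecological statistics: luminance, contrast, edge energy."""
    gy, gx = np.gradient(img)
    return [img.mean(), img.std(), np.hypot(gx, gy).mean()]

X = np.array([low_level_stats(img) for img in images])

# Cross-validated R^2 of a linear model predicting valence from image statistics.
r2 = cross_val_score(Ridge(alpha=1.0), X, valence, cv=5, scoring="r2").mean()
```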
Affiliation(s)
- Saeedeh Sadeghi: Department of Psychology, Cornell University, Ithaca, NY, USA; Division of Humanities and Social Sciences, California Institute of Technology, Pasadena, CA, USA
- Zijin Gu: School of Electrical and Computer Engineering, Cornell University and Cornell Tech, Ithaca, NY, USA
- Eve De Rosa: Department of Psychology, Cornell University, Ithaca, NY, USA
- Amy Kuceyeski: Department of Radiology, Weill Cornell Medicine, New York, NY, USA; Department of Computational Biology, Cornell University, Ithaca, NY, USA
- Adam K Anderson: Department of Psychology, Cornell University, Ithaca, NY, USA
3. Lee KM, Satpute AB. More than labels: neural representations of emotion words are widely distributed across the brain. Soc Cogn Affect Neurosci 2024;19:nsae043. PMID: 38903026. PMCID: PMC11259136. DOI: 10.1093/scan/nsae043.
Abstract
Although emotion words such as "anger," "disgust," "happiness," or "pride" are often thought of as mere labels, increasing evidence points to language as being important for emotion perception and experience. Emotion words may be particularly important for facilitating access to emotion concepts. Indeed, deficits in semantic processing or impaired access to emotion words interfere with emotion perception. Yet, it is unclear what these behavioral findings mean for affective neuroscience. Thus, we examined the brain areas that support processing of emotion words using representational similarity analysis of functional magnetic resonance imaging data (N = 25). In the task, participants saw 10 emotion words (e.g. "anger," "happiness") while in the scanner. To ensure they were processing the words, participants rated the valence of each word on a continuous scale ranging from 0 (Pleasant/Good) to 1 (Unpleasant/Bad). Our results revealed that a diverse range of brain areas, including prefrontal, midline cortical, and sensorimotor regions, contained information about emotion words. Notably, our results overlapped with many regions implicated in decoding emotion experience by prior studies. These results raise questions about what processes these regions support during emotion experience.
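The core of representational similarity analysis can be sketched roughly as follows, with synthetic voxel patterns standing in for one searchlight sphere and a simple valence-difference model RDM (sizes, noise levels, and the distance metric are illustrative assumptions):

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# Hypothetical setup: 10 emotion words, each with a valence rating in [0, 1],
# and one voxel pattern per word from a single region (synthetic here).
n_words, n_voxels = 10, 30
valence = np.linspace(0.0, 1.0, n_words)  # 0 = Pleasant/Good, 1 = Unpleasant/Bad
patterns = (np.outer(valence, rng.normal(size=n_voxels))
            + rng.normal(scale=0.3, size=(n_words, n_voxels)))

# Representational dissimilarity matrices in condensed form: the model RDM is
# built from rating differences, the neural RDM from pattern distances.
model_rdm = pdist(valence[:, None], metric="euclidean")
neural_rdm = pdist(patterns, metric="euclidean")

# The RSA statistic: rank correlation between the model and neural RDMs.
rho, p = spearmanr(model_rdm, neural_rdm)
```

Correlation distance is a common alternative to Euclidean distance for the neural RDM; the choice is a design decision rather than part of the RSA logic.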
Affiliation(s)
- Kent M Lee: Department of Psychology, Northeastern University, 125 Nightingale Hall, Boston, MA 02115, USA
- Ajay B Satpute: Department of Psychology, Northeastern University, 125 Nightingale Hall, Boston, MA 02115, USA
4. Lee SA, Lee JJ, Han J, Choi M, Wager TD, Woo CW. Brain representations of affective valence and intensity in sustained pleasure and pain. Proc Natl Acad Sci U S A 2024;121:e2310433121. PMID: 38857402. PMCID: PMC11194486. DOI: 10.1073/pnas.2310433121.
Abstract
Pleasure and pain are two fundamental, intertwined aspects of human emotions. Pleasurable sensations can reduce subjective feelings of pain and vice versa, and we often perceive the termination of pain as pleasant and the absence of pleasure as unpleasant. This implies the existence of brain systems that integrate them into modality-general representations of affective experience. Here, we examined representations of affective valence and intensity in a functional MRI (fMRI) study (n = 58) of sustained pleasure and pain. We found that distinct subpopulations of voxels within the ventromedial and lateral prefrontal cortices, the orbitofrontal cortex, the anterior insula, and the amygdala were involved in decoding affective valence versus intensity. Predictive models of affective valence and intensity showed significant decoding performance in an independent test dataset (n = 62). These models were differentially connected to distinct large-scale brain networks: the intensity model to the ventral attention network, and the valence model to the limbic and default mode networks. Overall, this study identified brain representations of affective valence and intensity across pleasure and pain, promoting a systems-level understanding of human affective experiences.
Affiliation(s)
- Soo Ahn Lee: Center for Neuroscience Imaging Research, Institute for Basic Science, Suwon 16419, Republic of Korea; Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, Republic of Korea; Department of Intelligent Precision Healthcare Convergence, Sungkyunkwan University, Suwon 16419, Republic of Korea
- Jae-Joong Lee: Center for Neuroscience Imaging Research, Institute for Basic Science, Suwon 16419, Republic of Korea
- Jisoo Han: Korea Brain Research Institute, Daegu 41062, Republic of Korea
- Myunghwan Choi: Center for Neuroscience Imaging Research, Institute for Basic Science, Suwon 16419, Republic of Korea; School of Biological Sciences, Seoul National University, Seoul 08826, Republic of Korea
- Tor D. Wager: Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH 03755
- Choong-Wan Woo: Center for Neuroscience Imaging Research, Institute for Basic Science, Suwon 16419, Republic of Korea; Department of Biomedical Engineering, Sungkyunkwan University, Suwon 16419, Republic of Korea; Department of Intelligent Precision Healthcare Convergence, Sungkyunkwan University, Suwon 16419, Republic of Korea; Life-inspired Neural Network for Prediction and Optimization Research Group, Suwon 16419, Republic of Korea
5. Li Y, Li S, Hu W, Yang L, Luo W. Spatial representation of multidimensional information in emotional faces revealed by fMRI. Neuroimage 2024;290:120578. PMID: 38499051. DOI: 10.1016/j.neuroimage.2024.120578.
Abstract
Face perception is a complex process that involves highly specialized procedures and mechanisms. Investigating face perception can help us better understand how the brain processes fine-grained, multidimensional information. This study examined how different dimensions of facial information are represented in specific brain regions, or through inter-regional connections, using an implicit face recognition task. To capture the representation of various facial information in the brain, we applied support vector machine decoding, functional connectivity, and model-based representational similarity analysis to fMRI data, which yielded three key findings. First, despite the implicit nature of the task, emotion was still represented in the brain, in contrast to all other facial information. Second, the connection between the medial amygdala and the parahippocampal gyrus was essential for representing facial emotion in implicit tasks. Third, in implicit tasks, arousal was represented in the parahippocampal gyrus, whereas valence depended on the connection between the primary visual cortex and the parahippocampal gyrus. In conclusion, these findings dissociate the neural mechanisms of emotional valence and arousal, revealing the precise spatial patterns of multidimensional information processing in faces.
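The support-vector-machine decoding step can be sketched like this, with synthetic single-trial patterns standing in for the fMRI data (trial counts, voxel counts, and effect size are all assumptions):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Synthetic stand-in for single-trial voxel patterns from one region:
# 40 trials of emotional faces vs 40 trials of neutral faces, where the
# emotional trials carry a small mean shift across voxels.
n_trials, n_voxels = 40, 60
emotional = rng.normal(loc=0.5, size=(n_trials, n_voxels))
neutral = rng.normal(loc=0.0, size=(n_trials, n_voxels))
X = np.vstack([emotional, neutral])
y = np.array([1] * n_trials + [0] * n_trials)

# Linear SVM with feature standardization, scored by 5-fold cross-validation;
# accuracy above 0.5 suggests the region carries category information.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
accuracy = cross_val_score(clf, X, y, cv=5).mean()
```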
Affiliation(s)
- Yiwen Li: State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, PR China; Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, PR China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, PR China
- Shuaixia Li: Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, PR China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, PR China
- Weiyu Hu: State Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, PR China
- Lan Yang: Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, PR China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, PR China
- Wenbo Luo: Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, PR China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, PR China
6. Lee S, Kim J. Testing the bipolar assumption of Singer-Loomis Type Deployment Inventory for Korean adults using classification and multidimensional scaling. Front Psychol 2024;14:1249185. PMID: 38356992. PMCID: PMC10864660. DOI: 10.3389/fpsyg.2023.1249185.
Abstract
In this study, we explored whether the Korean version of the Singer-Loomis Type Deployment Inventory II (K-SLTDI) captures the opposing tendencies of Jung's theory of psychological types. The types are Extroverted Sensing, Extroverted Intuition, Extroverted Feeling, Extroverted Thinking, Introverted Sensing, Introverted Intuition, Introverted Feeling, and Introverted Thinking. A nationwide online survey was conducted in South Korea. We performed multidimensional scaling and classification analyses on 521 Korean adult profiles across the eight psychological types to test the bipolarity assumption. The results showed that a Procrustes-rotated four-dimensional space successfully represented the four pairs of opposing tendencies. Moreover, the bipolarity assumption in the four dimensions of Jungian typology was tested and compared between lower and higher psychological distress populations via cluster analysis. Lastly, we explored response patterns in lower and higher psychological distress populations using intersubject correlation. Both the similarity analyses and the classification results consistently support the theoretical view that Jung's types are independently ordered, i.e., that the types can be derived without the bipolarity assumption, as Singer and Loomis expected for their Type Deployment Inventory. A limitation of our study is that the sample consisted of internet users recruited during the COVID-19 pandemic, although internet use is widespread in the general Korean population.
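The multidimensional-scaling-plus-Procrustes pipeline can be sketched as follows. The eight-type target layout (four bipolar pairs on orthogonal axes) and the synthetic dissimilarities are illustrative assumptions, not the K-SLTDI data:

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.spatial import procrustes
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(3)

# Hypothetical target configuration: 8 types as 4 bipolar pairs placed at
# opposite ends of 4 orthogonal axes.
target = np.vstack([np.eye(4), -np.eye(4)])  # 8 types x 4 dimensions

# Synthetic profile dissimilarities that roughly follow the target layout.
noisy = target + rng.normal(scale=0.2, size=target.shape)
dissim = squareform(pdist(noisy, metric="euclidean"))

# MDS embeds the 8 types into 4 dimensions from the dissimilarity matrix.
mds = MDS(n_components=4, dissimilarity="precomputed", random_state=0)
config = mds.fit_transform(dissim)

# Procrustes rotation aligns the MDS solution to the theoretical target;
# a low disparity means the hypothesized structure is recoverable.
_, rotated, disparity = procrustes(target, config)
```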
Affiliation(s)
- Jongwan Kim: Psychology Department, Jeonbuk National University, Jeonju, Republic of Korea
7. Ballotta D, Maramotti R, Borelli E, Lui F, Pagnoni G. Neural correlates of emotional valence for faces and words. Front Psychol 2023;14:1055054. PMID: 36910761. PMCID: PMC9996044. DOI: 10.3389/fpsyg.2023.1055054.
Abstract
Stimuli with negative emotional valence are especially apt to influence perception and action because of their crucial role in survival, a property that may not be precisely mirrored by positive emotional stimuli of equal intensity. The aim of this study was to identify the neural circuits differentially coding for positive and negative valence in the implicit processing of facial expressions and words, which are among the main means humans use to express emotions. Thirty-six healthy subjects took part in an event-related fMRI experiment. We used an implicit emotional processing task with visual presentation of negative, positive, and neutral faces and words as primary stimuli. Dynamic Causal Modeling (DCM) of the fMRI data was used to test effective brain connectivity within two different anatomo-functional models, one for the processing of words and one for faces. In our models, the only areas showing a significant differential response to negative and positive valence across both face and word stimuli were early visual cortices, with faces eliciting stronger activations. For faces, DCM revealed that this effect was mediated by a facilitation of activity in the amygdala by positive faces and in the fusiform face area by negative faces; for words, the effect was mainly attributable to a facilitation of activity in the primary visual cortex by positive words. These findings support a role of early sensory cortices in discriminating the emotional valence of both faces and words, where the effect may be mediated chiefly by the subcortical/limbic visual route for faces and rely more on the direct thalamic pathway to primary visual cortex for words.
Affiliation(s)
- Daniela Ballotta: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, Modena, Italy
- Riccardo Maramotti: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, Modena, Italy
- Eleonora Borelli: Department of Medical and Surgical, Maternal-Infantile and Adult Sciences, University of Modena and Reggio Emilia, Modena, Italy
- Fausta Lui: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, Modena, Italy
- Giuseppe Pagnoni: Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, Modena, Italy
8. Audiovisual Emotional Congruency Modulates the Stimulus-Driven Cross-Modal Spread of Attention. Brain Sci 2022;12:1229. PMID: 36138965. PMCID: PMC9497153. DOI: 10.3390/brainsci12091229.
Abstract
It has been reported that attention to stimuli in the visual modality can spread to task-irrelevant but synchronously presented stimuli in the auditory modality, a phenomenon termed the cross-modal spread of attention, which can be either stimulus-driven or representation-driven, depending on whether the visual constituent of an audiovisual object is further selected based on the object representation. The stimulus-driven spread of attention occurs whenever a task-irrelevant sound synchronizes with an attended visual stimulus, regardless of cross-modal semantic congruency. The present study recorded event-related potentials (ERPs) to investigate whether the stimulus-driven cross-modal spread of attention is modulated by audiovisual emotional congruency in a visual oddball task in which emotion (positive/negative) was task-irrelevant. The results first demonstrated a prominent stimulus-driven spread of attention regardless of audiovisual emotional congruency: for all audiovisual pairs, the extracted ERPs to the auditory constituents of audiovisual stimuli within the 200-300 ms time window were significantly larger than the ERPs to the same auditory stimuli delivered alone. However, the amplitude of this stimulus-driven auditory Nd component during 200-300 ms was significantly larger for emotionally incongruent than congruent audiovisual stimuli when the visual constituents' emotional valence was negative. Moreover, the Nd was sustained during 300-400 ms only for incongruent audiovisual stimuli with emotionally negative visual constituents. These findings suggest that although the occurrence of the stimulus-driven cross-modal spread of attention is independent of audiovisual emotional congruency, its magnitude is nevertheless modulated by it, even when emotion is task-irrelevant.
9. Kim MJ, Knodt AR, Hariri AR. Meta-analytic activation maps can help identify affective processes captured by contrast-based task fMRI: the case of threat-related facial expressions. Soc Cogn Affect Neurosci 2022;17:777-787. PMID: 35137241. PMCID: PMC9433847. DOI: 10.1093/scan/nsac010.
Abstract
Meta-analysis of functional magnetic resonance imaging (fMRI) data is an effective method for capturing the distributed patterns of brain activity supporting discrete cognitive and affective processes. One opportunity presented by the resulting meta-analysis maps (MAMs) is as a reference for better understanding the nature of individual contrast maps (ICMs) derived from specific task fMRI data. Here, we compared MAMs from 148 neuroimaging studies representing emotion categories of fear, anger, disgust, happiness and sadness with ICMs from fearful > neutral and angry > neutral faces from an independent dataset of task fMRI (n = 1263). Analyses revealed that both fear and anger ICMs exhibited the greatest pattern similarity to fear MAMs. As the number of voxels included for the computation of pattern similarity became more selective, the specificity of MAM-ICM correspondence decreased. Notably, amygdala activity long considered critical for processing threat-related facial expressions was neither sufficient nor necessary for detecting MAM-ICM pattern similarity effects. Our analyses suggest that both fearful and angry facial expressions are best captured by distributed patterns of brain activity, a putative neural correlate of threat. More generally, our analyses demonstrate how MAMs can be leveraged to better understand affective processes captured by ICMs in task fMRI data.
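The thresholded pattern-similarity analysis (MAM-ICM correspondence under increasingly selective voxel subsets) can be sketched like this, with synthetic whole-brain vectors standing in for the real maps; the voxel count, effect size, and percentile thresholds are assumptions:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)

# Synthetic whole-brain vectors: a meta-analysis map (MAM) and an individual
# contrast map (ICM) that shares part of its spatial pattern with the MAM.
n_voxels = 5000
mam = rng.normal(size=n_voxels)
icm = 0.5 * mam + rng.normal(size=n_voxels)

# Pattern similarity over progressively more selective voxel subsets:
# keep only voxels whose MAM value exceeds a percentile threshold.
similarity_by_threshold = {}
for pct in (0, 50, 90, 99):
    mask = mam >= np.percentile(mam, pct)
    similarity_by_threshold[pct] = pearsonr(mam[mask], icm[mask])[0]
```

Because restricting the voxel set also restricts the range of map values, correspondence tends to shrink as selection becomes more stringent, mirroring the decrease the authors report.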
Affiliation(s)
- M Justin Kim: Department of Psychology, Sungkyunkwan University, Seoul 03063, South Korea; Center for Neuroscience Imaging Research, Institute for Basic Science, Suwon 16419, South Korea
- Annchen R Knodt: Laboratory of NeuroGenetics, Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA
- Ahmad R Hariri: Laboratory of NeuroGenetics, Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA
10. Thakral PP, Bottary R, Kensinger EA. Representing the Good and Bad: fMRI signatures during the encoding of multisensory positive, negative, and neutral events. Cortex 2022;151:240-258. PMID: 35462202. PMCID: PMC9124690. DOI: 10.1016/j.cortex.2022.02.014.
Abstract
Few studies have examined how multisensory emotional experiences are processed and encoded into memory. Here, we aimed to determine whether, at encoding, activity within functionally defined visual- and auditory-processing brain regions discriminated the emotional category (i.e., positive, negative, or neutral) of multisensory (audio-visual) events. Participants incidentally encoded positive, negative, and neutral multisensory stimuli during event-related functional magnetic resonance imaging (fMRI). Following a 3-h post-encoding delay, their memory for the studied stimuli was tested, allowing us to identify emotion-category-specific subsequent-memory effects, focusing on medial temporal lobe regions (i.e., amygdala, hippocampus) and visual- and auditory-processing regions. We used a combination of univariate and multivoxel pattern fMRI analyses (MVPA) to examine emotion-category specificity in mean activity levels and neural patterning, respectively. Univariate analyses revealed many more visual regions showing negative-category specificity than positive-category specificity, and auditory regions showed only negative-category specificity. These results suggest that negative emotion is more closely tied to information contained within sensory regions, a conclusion supported by the MVPA results. Functional connectivity analyses further revealed that the visual amplification of category-selective processing is driven, in part, by mean signal from the amygdala. Interestingly, while stronger representations in visuo-auditory regions were related to subsequent memory for neutral multisensory stimuli, they were related to subsequent forgetting of positive and negative stimuli. Neural patterning in the hippocampus and amygdala was related to memory for negative multisensory stimuli. These results provide new evidence that negative emotional stimuli are processed with increased engagement of visuosensory regions, but that this sensory engagement, which generalizes across the entire emotion category, is not the type of sensory encoding that is most beneficial for later retrieval.
Affiliation(s)
- Ryan Bottary: Department of Psychology and Neuroscience, Boston College, MA, USA; Division of Sleep Medicine, Harvard Medical School, MA, USA
11. Lee KM, Ferreira-Santos F, Satpute AB. Predictive processing models and affective neuroscience. Neurosci Biobehav Rev 2021;131:211-228. PMID: 34517035. PMCID: PMC9074371. DOI: 10.1016/j.neubiorev.2021.09.009.
Abstract
The neural bases of affective experience remain elusive. Early neuroscience models of affect searched for specific brain regions that uniquely carried out the computations underlying the dimensions of valence and arousal. However, a growing body of work has failed to identify these circuits. Research turned to multivariate analyses, but these strategies, too, have made limited progress. Predictive processing models offer exciting new directions to address this problem. Here, we use predictive processing models as a lens to critique prevailing functional neuroimaging research practices in affective neuroscience. Our review highlights how much of this work relies on rigid assumptions that are inconsistent with a predictive processing approach. We outline the central aspects of a predictive processing model and draw out their implications for research in affective and cognitive neuroscience. Predictive processing models motivate a reformulation of "reverse inference" in cognitive neuroscience and a greater emphasis on external validity in experimental design.
Affiliation(s)
- Kent M Lee: Northeastern University, 360 Huntington Ave, 125 NI, Boston, MA 02118, USA
- Fernando Ferreira-Santos: Laboratory of Neuropsychophysiology, Faculty of Psychology and Education Sciences, University of Porto, Portugal
- Ajay B Satpute: Northeastern University, 360 Huntington Ave, 125 NI, Boston, MA 02118, USA
12. Modality-general and modality-specific audiovisual valence processing. Cortex 2021;138:127-137. DOI: 10.1016/j.cortex.2021.01.022.
Abstract
A fundamental question in affective neuroscience is whether there is a common hedonic system for valence processing independent of modality, or whether there are distinct neural systems for different modalities. To address this question, we used both region-of-interest and whole-brain representational similarity analyses of functional magnetic resonance imaging data to identify modality-general and modality-specific brain areas involved in valence processing across the visual and auditory modalities. First, region-of-interest analyses showed that the superior temporal cortex was associated with both the modality-general and the auditory-specific models, while the primary visual cortex was associated with the visual-specific model. Second, whole-brain searchlight analyses also identified both modality-general and modality-specific representations. The modality-general regions included the superior temporal, medial superior frontal, inferior frontal, precuneus, precentral, postcentral, supramarginal, paracentral lobule, and middle cingulate cortices. The modality-specific regions included both perceptual cortices and higher-order brain areas. Valence representations derived from individualized behavioral valence ratings were consistent with these results. Together, these findings suggest both modality-general and modality-specific representations of valence.
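One way to see the distinction between the two candidate models is to construct their RDMs explicitly; the stimulus set, ratings, and the choice of a fixed "unrelated" dissimilarity for cross-modal pairs below are invented for illustration:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical design: 10 stimuli (5 visual, 5 auditory), each with a
# valence rating.
valence = np.array([0.9, 0.6, 0.1, -0.5, -0.8,   # visual stimuli
                    0.8, 0.4, 0.0, -0.4, -0.9])  # auditory stimuli
modality = np.array([0] * 5 + [1] * 5)
n = len(valence)
dmax = np.ptp(valence)  # largest valence difference, used as "unrelated"

general = np.zeros((n, n))   # modality-general: valence distance everywhere
specific = np.zeros((n, n))  # modality-specific: valence structure within modality only
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        general[i, j] = abs(valence[i] - valence[j])
        specific[i, j] = (abs(valence[i] - valence[j])
                          if modality[i] == modality[j] else dmax)

# The two candidate RDMs are related but distinguishable; each would then be
# compared (e.g., by rank correlation) against a region's neural RDM.
triu = np.triu_indices(n, k=1)
model_overlap = spearmanr(general[triu], specific[triu])[0]
```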
13. Bo K, Yin S, Liu Y, Hu Z, Meyyappan S, Kim S, Keil A, Ding M. Decoding Neural Representations of Affective Scenes in Retinotopic Visual Cortex. Cereb Cortex 2021;31:3047-3063. PMID: 33594428. DOI: 10.1093/cercor/bhaa411.
Abstract
The perception of opportunities and threats in complex visual scenes represents one of the main functions of the human visual system. The underlying neurophysiology is often studied by having observers view pictures varying in affective content. It has been shown that viewing emotionally engaging, compared with neutral, pictures (1) heightens blood flow in limbic, frontoparietal, and anterior visual structures and (2) enhances the late positive event-related potential (LPP). The role of retinotopic visual cortex in this process has, however, been contentious, with competing theories predicting the presence versus absence of emotion-specific signals in retinotopic visual areas. Recording simultaneous electroencephalography-functional magnetic resonance imaging while observers viewed pleasant, unpleasant, and neutral affective pictures, and applying multivariate pattern analysis, we found that (1) unpleasant versus neutral and pleasant versus neutral decoding accuracy were well above chance level in retinotopic visual areas, (2) decoding accuracy in ventral visual cortex (VVC), but not in early or dorsal visual cortex, was correlated with LPP, and (3) effective connectivity from amygdala to VVC predicted unpleasant versus neutral decoding accuracy, whereas effective connectivity from ventral frontal cortex to VVC predicted pleasant versus neutral decoding accuracy. These results suggest that affective scenes evoke valence-specific neural representations in retinotopic visual cortex and that these representations are influenced by reentry signals from anterior brain regions.
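Decoding accuracy "well above chance level" is typically established against an empirical null distribution. A sketch of a label-permutation test on synthetic ROI patterns follows; the classifier choice, sample sizes, and effect strength are assumptions, not the authors' exact pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)

# Synthetic patterns from a retinotopic ROI: unpleasant vs neutral pictures.
n_per_class, n_voxels = 30, 40
unpleasant = rng.normal(loc=0.4, size=(n_per_class, n_voxels))
neutral = rng.normal(loc=0.0, size=(n_per_class, n_voxels))
X = np.vstack([unpleasant, neutral])
y = np.array([1] * n_per_class + [0] * n_per_class)

# Observed cross-validated decoding accuracy.
observed = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()

# Permutation test: shuffle labels to build an empirical chance distribution.
null_scores = []
for _ in range(200):
    y_perm = rng.permutation(y)
    null_scores.append(
        cross_val_score(LogisticRegression(max_iter=1000), X, y_perm, cv=5).mean())
p_value = (np.sum(np.array(null_scores) >= observed) + 1) / (200 + 1)
```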
Collapse
Affiliation(s)
- Ke Bo
- J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL 32611, USA
- Siyang Yin
- J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL 32611, USA
- Yuelu Liu
- Center for Mind and Brain, University of California, Davis, CA 95618, USA
- Zhenhong Hu
- J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL 32611, USA
- Sreenivasan Meyyappan
- J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL 32611, USA
- Sungkean Kim
- J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL 32611, USA
- Andreas Keil
- Department of Psychology, University of Florida, Gainesville, FL 32611, USA
- Mingzhou Ding
- J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL 32611, USA
14
Buono GH, Crukley J, Hornsby BWY, Picou EM. Loss of high- or low-frequency audibility can partially explain effects of hearing loss on emotional responses to non-speech sounds. Hear Res 2020; 401:108153. [PMID: 33360158 DOI: 10.1016/j.heares.2020.108153] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/06/2020] [Revised: 11/20/2020] [Accepted: 12/08/2020] [Indexed: 11/16/2022]
Abstract
Hearing loss can disrupt emotional responses to sound. However, the impact of stimulus modality (multisensory versus unisensory) on this disruption, and the underlying mechanisms responsible, are unclear. The purposes of this project were to evaluate the effects of stimulus modality and filtering on emotional responses to non-speech stimuli. It was hypothesized that low- and high-pass filtering would result in less extreme ratings, but only for unisensory stimuli. Twenty-four adults (22-34 years old; 12 male) with normal hearing participated. Participants made ratings of valence and arousal in response to pleasant, neutral, and unpleasant non-speech sounds and/or pictures. Each participant completed ratings of five stimulus modalities: auditory-only, visual-only, auditory-visual, filtered auditory-only, and filtered auditory-visual. Half of the participants rated low-pass filtered stimuli (800 Hz cutoff), and half of the participants rated high-pass filtered stimuli (2000 Hz cutoff). Combining auditory and visual modalities resulted in more extreme (more pleasant and more unpleasant) ratings of valence in response to pleasant and unpleasant stimuli. In addition, low- and high-pass filtering of sounds resulted in less extreme ratings of valence (less pleasant and less unpleasant) and arousal (less exciting) in response to both auditory-only and auditory-visual stimuli. These results suggest that changes in audible spectral information are partially responsible for the noted changes in emotional responses to sound that accompany hearing loss. The findings also suggest the effects of hearing loss will generalize to multisensory stimuli if the stimuli include sound, although further work is warranted to confirm this in listeners with hearing loss.
Affiliation(s)
- Gabrielle H Buono
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, 1215 21st Ave South, Room 8310, Nashville, TN 37232, United States
- Jeffery Crukley
- Department of Speech-Language Pathology, University of Toronto, Canada
- Benjamin W Y Hornsby
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, 1215 21st Ave South, Room 8310, Nashville, TN 37232, United States
- Erin M Picou
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, 1215 21st Ave South, Room 8310, Nashville, TN 37232, United States
15
Shinkareva SV, Gao C, Wedell D. Audiovisual Representations of Valence: a Cross-study Perspective. Affect Sci 2020; 1:237-246. [DOI: 10.1007/s42761-020-00023-9] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2020] [Accepted: 10/22/2020] [Indexed: 01/25/2023]
16
Reisch LM, Wegrzyn M, Woermann FG, Bien CG, Kissler J. Negative content enhances stimulus-specific cerebral activity during free viewing of pictures, faces, and words. Hum Brain Mapp 2020; 41:4332-4354. [PMID: 32633448 PMCID: PMC7502837 DOI: 10.1002/hbm.25128] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/09/2020] [Revised: 06/16/2020] [Accepted: 06/24/2020] [Indexed: 01/25/2023] Open
Abstract
Negative visual stimuli have been found to elicit stronger brain activation than do neutral stimuli. Such emotion effects have been shown for pictures, faces, and words alike, but the literature suggests stimulus-specific differences regarding locus and lateralization of the activity. In the current functional magnetic resonance imaging study, we directly compared brain responses to passively viewed negative and neutral pictures of complex scenes, faces, and words (nouns) in 43 healthy participants (21 males) varying in age and demographic background. Both negative pictures and faces activated the extrastriate visual cortices of both hemispheres more strongly than neutral ones, but effects were larger and extended more dorsally for pictures, whereas negative faces additionally activated the superior temporal sulci. Negative words differentially activated typical higher-level language processing areas such as the left inferior frontal and angular gyrus. There were small emotion effects in the amygdala for faces and words, which were both lateralized to the left hemisphere. Although pictures elicited overall the strongest amygdala activity, amygdala response to negative pictures was not significantly stronger than to neutral ones. Across stimulus types, emotion effects converged in the left anterior insula. No gender effects were apparent, but age had a small, stimulus-specific impact on emotion processing. Our study specifies similarities and differences in effects of negative emotional content on the processing of different types of stimuli, indicating that brain response to negative stimuli is specifically enhanced in areas involved in processing of the respective stimulus type in general and converges across stimuli in the left anterior insula.
Affiliation(s)
- Lea Marie Reisch
- Department of Psychology, University of Bielefeld, Bielefeld, Germany; Epilepsy Centre Bethel, Krankenhaus Mara, Bielefeld, Germany
- Martin Wegrzyn
- Department of Psychology, University of Bielefeld, Bielefeld, Germany
- Johanna Kissler
- Department of Psychology, University of Bielefeld, Bielefeld, Germany
17
A study in affect: Predicting valence from fMRI data. Neuropsychologia 2020; 143:107473. [DOI: 10.1016/j.neuropsychologia.2020.107473] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/18/2019] [Revised: 04/10/2020] [Accepted: 04/19/2020] [Indexed: 12/19/2022]
18
Levine SM, Alahäivälä ALI, Wechsler TF, Wackerle A, Rupprecht R, Schwarzbach JV. Linking Personality Traits to Individual Differences in Affective Spaces. Front Psychol 2020; 11:448. [PMID: 32231631 PMCID: PMC7082752 DOI: 10.3389/fpsyg.2020.00448] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/11/2019] [Accepted: 02/26/2020] [Indexed: 11/13/2022] Open
Abstract
Different individuals respond differently to emotional stimuli in their environment. Understanding how emotions are represented mentally will therefore ultimately require investigations into individual-level information. Here we tasked participants with freely arranging emotionally charged images on a computer screen according to their subjective emotional similarity (yielding a unique affective space for each participant) and subsequently sought external validity for the layout of the individuals' affective spaces through the five-factor personality model (Neuroticism, Extraversion, Openness to Experience, Agreeableness, Conscientiousness) assessed via the NEO Five-Factor Inventory. Applying agglomerative hierarchical clustering to the group-level affective space revealed a set of underlying affective clusters whose within-cluster dissimilarity, per individual, was then correlated with individuals' personality scores. These cluster-based analyses predominantly revealed that the dispersion of the negative cluster showed a positive relationship with Neuroticism and a negative relationship with Conscientiousness, a finding that would be predicted by prior work. Such results demonstrate the non-spurious structure of individualized emotion information revealed by data-driven analyses of a behavioral task (and validated by incorporating psychological measures of personality) and corroborate prior knowledge of the interaction between affect and personality. Future investigations can similarly combine hypothesis- and data-driven methods to extend such findings, potentially yielding new perspectives on underlying cognitive processes, disease susceptibility, or even diagnostic/prognostic markers for mental disorders involving emotion dysregulation.
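The cluster-based analysis described in this abstract (agglomerative hierarchical clustering of an arranged affective space, then per-cluster dispersion correlated with personality scores) can be sketched on toy data; the coordinates, cluster count, and per-participant values below are illustrative assumptions, not the study's data:

```python
# Sketch: cluster a participant's image arrangement, measure within-cluster
# dissimilarity ("dispersion"), and correlate dispersion with a trait score.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# 2-D screen coordinates of 30 images as arranged by one participant (toy).
coords = rng.normal(size=(30, 2))

dists = pdist(coords)                      # condensed pairwise dissimilarities
tree = linkage(dists, method="average")    # agglomerative hierarchical tree
labels = fcluster(tree, t=3, criterion="maxclust")  # cut into <= 3 clusters

# Dispersion = mean pairwise dissimilarity within each cluster.
full = squareform(dists)
dispersion = {}
for c in np.unique(labels):
    idx = np.where(labels == c)[0]
    if len(idx) < 2:
        continue  # dispersion undefined for singleton clusters
    sub = full[np.ix_(idx, idx)]
    dispersion[c] = sub[np.triu_indices_from(sub, k=1)].mean()

# Across participants, dispersion of (say) the negative cluster would then be
# correlated with a trait such as Neuroticism (toy per-participant values):
neg_dispersion = rng.normal(size=20)
neuroticism = rng.normal(size=20)
rho, p = spearmanr(neg_dispersion, neuroticism)
```

With real data, `neg_dispersion` would hold one dispersion value per participant for the cluster of interest.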
Affiliation(s)
- Seth M Levine
- Department of Psychiatry and Psychotherapy, University of Regensburg, Regensburg, Germany
- Aino L I Alahäivälä
- Department of Psychiatry and Psychotherapy, University of Regensburg, Regensburg, Germany
- Theresa F Wechsler
- Department of Clinical Psychology and Psychotherapy, University of Regensburg, Regensburg, Germany
- Anja Wackerle
- Department of Psychiatry and Psychotherapy, University of Regensburg, Regensburg, Germany
- Rainer Rupprecht
- Department of Psychiatry and Psychotherapy, University of Regensburg, Regensburg, Germany
- Jens V Schwarzbach
- Department of Psychiatry and Psychotherapy, University of Regensburg, Regensburg, Germany
19
Sambuco N, Bradley MM, Herring DR, Lang PJ. Common circuit or paradigm shift? The functional brain in emotional scene perception and emotional imagery. Psychophysiology 2020; 57:e13522. [PMID: 32011742 PMCID: PMC7446773 DOI: 10.1111/psyp.13522] [Citation(s) in RCA: 28] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/17/2019] [Revised: 12/04/2019] [Accepted: 12/10/2019] [Indexed: 12/19/2022]
Abstract
Meta-analytic and experimental studies investigating the neural basis of emotion often compare functional activation in different emotional induction contexts, assessing evidence for a "core affect" or "salience" network. Meta-analyses necessarily aggregate effects across diverse paradigms and different samples, which ignores potential neural differences specific to the method of affect induction. Data from repeated-measures designs are scarce and have reported contradictory results with small samples. In the current study, functional brain activity is assessed in a large (N = 61) group of healthy participants during two common emotion inductions (scene perception and narrative imagery) to evaluate cross-paradigm consistency. Results indicate that limbic and paralimbic regions, together with visual and parietal cortex, are reliably engaged during emotional scene perception. For emotional imagery, in contrast, enhanced functional activity is found in several cerebellar regions, hippocampus, caudate, and dorsomedial prefrontal cortex, consistent with the conception that imagery is an action disposition. Taken together, the data suggest that a common emotion network is not engaged across paradigms, but that the specific neural regions activated during emotional processing can vary significantly with the context of the emotional induction.
Affiliation(s)
- Nicola Sambuco
- Center for the Study of Emotion and Attention, University of Florida, Gainesville, Florida
- Margaret M Bradley
- Center for the Study of Emotion and Attention, University of Florida, Gainesville, Florida
- David R Herring
- Center for the Study of Emotion and Attention, University of Florida, Gainesville, Florida
- Peter J Lang
- Center for the Study of Emotion and Attention, University of Florida, Gainesville, Florida
20
Ultra High Field fMRI of Human Superior Colliculi Activity during Affective Visual Processing. Sci Rep 2020; 10:1331. [PMID: 31992744 PMCID: PMC6987103 DOI: 10.1038/s41598-020-57653-z] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/14/2019] [Accepted: 12/31/2019] [Indexed: 11/08/2022] Open
Abstract
Research on rodents and non-human primates has established the involvement of the superior colliculus in defensive behaviours and visual threat detection. The superior colliculus has been well-studied in humans for its functional roles in saccade and visual processing, but less is known about its involvement in affect. In standard functional MRI studies of the human superior colliculus, it is challenging to discern activity in the superior colliculus from activity in surrounding nuclei such as the periaqueductal gray due to technological and methodological limitations. Employing high-field strength (7 Tesla) fMRI techniques, this study imaged the superior colliculus at high (0.75 mm isotropic) resolution, which enabled isolation of the superior colliculus from other brainstem nuclei. Superior colliculus activation during emotionally aversive image viewing blocks was greater than that during neutral image viewing blocks. These findings suggest that the superior colliculus may play a role in shaping subjective emotional experiences in addition to its visuomotor functions, bridging the gap between affective research on humans and non-human animals.
21
Gu J, Cao L, Liu B. Modality-general representations of valences perceived from visual and auditory modalities. Neuroimage 2019; 203:116199. [PMID: 31536804 DOI: 10.1016/j.neuroimage.2019.116199] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/26/2018] [Revised: 08/31/2019] [Accepted: 09/14/2019] [Indexed: 01/29/2023] Open
Abstract
Valence is a dimension of emotion and can be positive, negative, or neutral. Valence can be expressed through the visual and auditory modalities, and the valence of each modality can be conveyed by different types of stimuli (face, body, voice, or music). This study focused on modality-general representations of valence, that is, valence information that is shared not only across the visual and auditory modalities but also across different types of stimuli within each modality. Functional magnetic resonance imaging (fMRI) data were collected while subjects made affective judgments on silent videos (face and body) and audio clips (voice and music). A searchlight analysis located four areas that might be sensitive to representations of modality-general valence: the bilateral postcentral gyrus, the left middle temporal gyrus (MTG), and the right middle frontal gyrus (MFG). Further cross-modal classification based on multivoxel pattern analysis (MVPA) was performed as a validation analysis, which suggested that only the left postcentral gyrus could successfully distinguish the three valences (positive vs. negative vs. neutral) across different types of stimuli (face, body, voice, or music), and that the classification was also successful in the left MTG across face and body stimuli. Univariate analysis further found valence-specific activation differences across stimulus types in the MTG. Our study showed that the left postcentral gyrus is informative for valence representations, and it extends research on valence representation by suggesting that modality-general representations of valence hold not only across the visual and auditory modalities but also across different types of stimuli within each modality.
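The cross-modal classification used here as a validation analysis trains a decoder on patterns evoked by one stimulus type and tests it on another; successful transfer implies a shared valence code. A minimal sketch with synthetic "voxel" patterns (the class means, shift, and classifier settings are illustrative assumptions, not the study's actual searchlight pipeline):

```python
# Sketch: cross-modal valence classification. Train on "face" patterns,
# test on "voice" patterns; above-chance transfer suggests a shared code.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_per_class, n_voxels = 40, 50
means = {"positive": 0.5, "neutral": 0.0, "negative": -0.5}

def simulate(shift):
    """Simulate valence-tuned voxel patterns for one stimulus type."""
    X, y = [], []
    for label, mu in means.items():
        X.append(rng.normal(loc=mu + shift, scale=1.0,
                            size=(n_per_class, n_voxels)))
        y += [label] * n_per_class
    return np.vstack(X), np.array(y)

X_face, y_face = simulate(shift=0.0)     # training patterns ("visual")
X_voice, y_voice = simulate(shift=0.1)   # test patterns ("auditory")

clf = LinearSVC(C=1.0, max_iter=10000).fit(X_face, y_face)
acc = (clf.predict(X_voice) == y_voice).mean()  # cross-modal accuracy
chance = 1 / 3                                  # three balanced classes
```

In the searchlight version, this train/test step is repeated for the voxels in each spherical neighborhood across the brain.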
Affiliation(s)
- Jin Gu
- College of Intelligence and Computing, Tianjin University, Tianjin, 300350, PR China
- Linjing Cao
- College of Intelligence and Computing, Tianjin University, Tianjin, 300350, PR China
- Baolin Liu
- School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing, 100083, PR China
22
Gao C, Baucom LB, Kim J, Wang J, Wedell DH, Shinkareva SV. Distinguishing abstract from concrete concepts in supramodal brain regions. Neuropsychologia 2019; 131:102-110. [PMID: 31175884 DOI: 10.1016/j.neuropsychologia.2019.05.032] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/06/2018] [Revised: 01/18/2019] [Accepted: 05/31/2019] [Indexed: 11/24/2022]
Abstract
Concrete words have been shown to have a processing advantage over abstract words, yet theoretical accounts and neural correlates underlying the distinction between concrete and abstract concepts are still unresolved. In an fMRI study, participants performed a property verification task on abstract and concrete concepts. Property comparisons of concrete concepts were predominantly based on either visual or haptic features. Multivariate pattern analysis successfully distinguished between abstract and concrete stimulus comparisons at the whole brain level. Multivariate searchlight analyses showed that posterior and middle cingulate cortices contained information that distinguished abstract from concrete concepts regardless of feature dominance. These results support the view that supramodal convergence zones play an important role in representation of concrete and abstract concepts.
Affiliation(s)
- Chuanji Gao
- Department of Psychology, Institute of Mind and Brain, University of South Carolina, Columbia, 29201, USA
- Laura B Baucom
- Department of Psychology, Institute of Mind and Brain, University of South Carolina, Columbia, 29201, USA
- Jongwan Kim
- Department of Psychology, Institute of Mind and Brain, University of South Carolina, Columbia, 29201, USA
- Jing Wang
- Department of Psychology, Institute of Mind and Brain, University of South Carolina, Columbia, 29201, USA
- Douglas H Wedell
- Department of Psychology, Institute of Mind and Brain, University of South Carolina, Columbia, 29201, USA
- Svetlana V Shinkareva
- Department of Psychology, Institute of Mind and Brain, University of South Carolina, Columbia, 29201, USA
23
Levine SM, Wackerle A, Rupprecht R, Schwarzbach JV. The neural representation of an individualized relational affective space. Neuropsychologia 2018; 120:35-42. [DOI: 10.1016/j.neuropsychologia.2018.10.008] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2018] [Revised: 08/11/2018] [Accepted: 10/10/2018] [Indexed: 10/28/2022]
24
Cao L, Xu J, Yang X, Li X, Liu B. Abstract Representations of Emotions Perceived From the Face, Body, and Whole-Person Expressions in the Left Postcentral Gyrus. Front Hum Neurosci 2018; 12:419. [PMID: 30405375 PMCID: PMC6200969 DOI: 10.3389/fnhum.2018.00419] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2018] [Accepted: 09/27/2018] [Indexed: 12/03/2022] Open
Abstract
Emotions can be perceived through the face, body, and whole-person, while previous studies on the abstract representations of emotions only focused on the emotions of the face and body. It remains unclear whether emotions can be represented at an abstract level regardless of all three sensory cues in specific brain regions. In this study, we used the representational similarity analysis (RSA) to explore the hypothesis that the emotion category is independent of all three stimulus types and can be decoded based on the activity patterns elicited by different emotions. Functional magnetic resonance imaging (fMRI) data were collected when participants classified emotions (angry, fearful, and happy) expressed by videos of faces, bodies, and whole-persons. An abstract emotion model was defined to estimate the neural representational structure in the whole-brain RSA, which assumed that the neural patterns were significantly correlated in within-emotion conditions ignoring the stimulus types but uncorrelated in between-emotion conditions. A neural representational dissimilarity matrix (RDM) for each voxel was then compared to the abstract emotion model to examine whether specific clusters could identify the abstract representation of emotions that generalized across stimulus types. The significantly positive correlations between neural RDMs and models suggested that the abstract representation of emotions could be successfully captured by the representational space of specific clusters. The whole-brain RSA revealed an emotion-specific but stimulus category-independent neural representation in the left postcentral gyrus, left inferior parietal lobe (IPL) and right superior temporal sulcus (STS). 
Further cluster-based MVPA with cross-modal classification revealed that only the left postcentral gyrus could distinguish the three emotions for two stimulus-type pairs (face-body and body-whole-person), and could distinguish happy versus angry/fearful (interpretable as positive versus negative) for all three stimulus-type pairs. Our study suggests that abstract representations of the three emotions (angry, fearful, and happy) extend from face and body stimuli to whole-person stimuli, and these findings provide support for abstract representations of emotions in the left postcentral gyrus.
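The abstract-emotion model RDM described in this abstract predicts low dissimilarity between conditions that share an emotion (regardless of stimulus type) and high dissimilarity otherwise, and is compared with the neural RDM of each cluster. A toy sketch of that comparison (the condition set, noise level, and correlation choice are illustrative assumptions):

```python
# Sketch: build an abstract-emotion model RDM for 3 emotions x 3 stimulus
# types, then correlate it with a (simulated) neural RDM, as in standard RSA.
import numpy as np
from scipy.stats import spearmanr

emotions = ["angry", "fearful", "happy"]
stimuli = ["face", "body", "whole-person"]
conditions = [(e, s) for e in emotions for s in stimuli]  # 9 conditions

n = len(conditions)
model_rdm = np.zeros((n, n))
for i, (ei, _) in enumerate(conditions):
    for j, (ej, _) in enumerate(conditions):
        # Same emotion -> predicted similar (0), different emotion -> 1.
        model_rdm[i, j] = 0.0 if ei == ej else 1.0

# A neural RDM for one cluster (toy: the model plus symmetric noise).
rng = np.random.default_rng(2)
noise = rng.normal(scale=0.1, size=(n, n))
neural_rdm = model_rdm + (noise + noise.T) / 2
np.fill_diagonal(neural_rdm, 0.0)

# Compare upper triangles with a rank correlation.
iu = np.triu_indices(n, k=1)
rho, p = spearmanr(model_rdm[iu], neural_rdm[iu])
```

A significantly positive `rho` for a cluster is what the abstract reports as capturing the abstract representation of emotions.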
Affiliation(s)
- Linjing Cao
- School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Junhai Xu
- School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Xiaoli Yang
- School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, China
- Xianglin Li
- Medical Imaging Research Institute, Binzhou Medical University, Yantai, China
- Baolin Liu
- School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, China; State Key Laboratory of Intelligent Technology and Systems, Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing, China
25
Identification of task sets within and across stimulus modalities. Neuropsychologia 2018; 113:78-84. [DOI: 10.1016/j.neuropsychologia.2018.03.023] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2017] [Revised: 02/06/2018] [Accepted: 03/19/2018] [Indexed: 11/19/2022]
26
Satpute AB, Kragel PA, Barrett LF, Wager TD, Bianciardi M. Deconstructing arousal into wakeful, autonomic and affective varieties. Neurosci Lett 2018; 693:19-28. [PMID: 29378297 DOI: 10.1016/j.neulet.2018.01.042] [Citation(s) in RCA: 58] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2017] [Revised: 01/13/2018] [Accepted: 01/22/2018] [Indexed: 12/11/2022]
Abstract
Arousal plays a central role in a wide variety of phenomena, including wakefulness, autonomic function, affect, and emotion. Despite its importance, it remains unclear how the neural mechanisms for arousal are organized across these phenomena. In this article, we review neuroscience findings for three of the most common origins of arousal: wakeful arousal, autonomic arousal, and affective arousal. Our review makes two overarching points. First, research conducted primarily in non-human animals underscores the importance of several subcortical nuclei that contribute to various sources of arousal, motivating the need for an integrative framework. Thus, we outline an integrative neural reference space as a key first step in developing a more systematic understanding of central nervous system contributions to arousal. Second, there is a translational gap between research on non-human animals, which emphasizes subcortical nuclei, and research on humans using non-invasive neuroimaging techniques, which focuses more on gross anatomical characterizations of cortical (e.g., network architectures including the default mode network) and subcortical structures. We forecast the importance of high-field neuroimaging in bridging this gap to examine how the various networks within the neural reference space for arousal operate across varieties of arousal-related phenomena.
Affiliation(s)
- Ajay B Satpute
- Departments of Psychology and Neuroscience, Pomona College, Claremont, CA, USA; Department of Psychology, Northeastern University, Boston, MA, USA
- Philip A Kragel
- Department of Psychology and Neuroscience, University of Colorado Boulder, Boulder, USA; The Institute of Cognitive Science, University of Colorado Boulder, Boulder, USA
- Lisa Feldman Barrett
- Department of Psychology, Northeastern University, Boston, MA, USA; Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, MA, USA; Department of Radiology, Harvard Medical School, Boston, MA, USA; Department of Psychiatry, Massachusetts General Hospital, Boston, MA, USA
- Tor D Wager
- Department of Psychology and Neuroscience, University of Colorado Boulder, Boulder, USA; The Institute of Cognitive Science, University of Colorado Boulder, Boulder, USA
- Marta Bianciardi
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, MA, USA; Department of Radiology, Harvard Medical School, Boston, MA, USA
27
Miskovic V, Anderson AK. Modality general and modality specific coding of hedonic valence. Curr Opin Behav Sci 2018; 19:91-97. [PMID: 29967806 DOI: 10.1016/j.cobeha.2017.12.012] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/10/2023]
Abstract
The pleasant or unpleasant qualities that attach to our perceptions help to determine whether we approach or avoid environmental stimuli, shaping their affordances. How do brains create this affective perceptual dimension? The traditional answer is that sensory areas serve only as conduits for external impressions that are then modulated by heteromodal limbic structures in subsequent phases. Here we raise the possibility that, in addition to these well established gain control effects, sensory systems might also have a more direct role in representing the pleasantness component of perception, as supported by several strands of recent brain imaging evidence. In conjunction with a shared valence code that is independent of its sensory origins, valence representations interleaved within sensory brain areas may support finer grained experiential distinctions between how things look, sound, feel, taste and smell good or bad to us, offering a higher dimensional space of evaluative discriminations.
Affiliation(s)
- V Miskovic
- Department of Psychology, State University of New York at Binghamton, United States; Center for Affective Science, State University of New York at Binghamton, United States
- A K Anderson
- Department of Human Development and Human Neuroscience Institute, Cornell University, United States
28
Picou EM, Singh G, Goy H, Russo F, Hickson L, Oxenham AJ, Buono GH, Ricketts TA, Launer S. Hearing, Emotion, Amplification, Research, and Training Workshop: Current Understanding of Hearing Loss and Emotion Perception and Priorities for Future Research. Trends Hear 2018; 22:2331216518803215. [PMID: 30270810 PMCID: PMC6168729 DOI: 10.1177/2331216518803215] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/13/2018] [Revised: 08/18/2018] [Accepted: 09/03/2018] [Indexed: 12/19/2022] Open
Abstract
The question of how hearing loss and hearing rehabilitation affect patients' momentary emotional experiences is one that has received little attention but has considerable potential to affect patients' psychosocial function. This article is a product from the Hearing, Emotion, Amplification, Research, and Training workshop, which was convened to develop a consensus document describing research on emotion perception relevant for hearing research. This article outlines conceptual frameworks for the investigation of emotion in hearing research; available subjective, objective, neurophysiologic, and peripheral physiologic data acquisition research methods; the effects of age and hearing loss on emotion perception; potential rehabilitation strategies; priorities for future research; and implications for clinical audiologic rehabilitation. More broadly, this article aims to increase awareness about emotion perception research in audiology and to stimulate additional research on the topic.
Affiliation(s)
- Erin M. Picou
- Vanderbilt University School of Medicine, Nashville, TN, USA
- Gurjit Singh
- Phonak Canada, Mississauga, ON, Canada
- Department of Speech-Language Pathology, University of Toronto, ON, Canada
- Department of Psychology, Ryerson University, Toronto, ON, Canada
- Huiwen Goy
- Department of Psychology, Ryerson University, Toronto, ON, Canada
- Frank Russo
- Department of Psychology, Ryerson University, Toronto, ON, Canada
- Louise Hickson
- School of Health and Rehabilitation Sciences, University of Queensland, Brisbane, Australia
29
Bailey J, Pereira S. Advances in neuroscience imply that harmful experiments in dogs are unethical. JOURNAL OF MEDICAL ETHICS 2018; 44:47-52. [PMID: 28739639 PMCID: PMC5749309 DOI: 10.1136/medethics-2016-103630] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 04/18/2016] [Revised: 05/18/2017] [Accepted: 06/09/2017] [Indexed: 06/07/2023]
Abstract
Functional MRI (fMRI) of fully awake and unrestrained dog 'volunteers' has proven an effective tool for understanding the neural circuitry and functioning of the canine brain. Although every dog owner would vouch that dogs are perceptive, cognitive, intuitive, and capable of positive emotions/empathy, as indeed substantiated by ethological studies for some time, neurological investigations now corroborate this. These studies show that there exists a striking similarity between dogs and humans in the functioning of the caudate nucleus (associated with pleasure and emotion), and that dogs experience positive emotions and empathic-like responses and demonstrate human bonding which, some scientists claim, may be at least comparable to that of human children. There exists an area analogous to the 'voice area' in the canine brain, enabling dogs to comprehend and respond to emotional cues/valence in human voices, and evidence of a region in the temporal cortex of dogs involved in the processing of faces, as also observed in humans and monkeys. We therefore contend that using dogs in invasive and/or harmful research, and in toxicity testing, cannot be ethically justified.
30
Gao C, Wedell DH, Kim J, Weber CE, Shinkareva SV. Modelling audiovisual integration of affect from videos and music. Cogn Emot 2017; 32:516-529. [DOI: 10.1080/02699931.2017.1320979]
Affiliation(s)
- Chuanji Gao
- Department of Psychology, University of South Carolina, Columbia, SC, USA
- Douglas H. Wedell
- Department of Psychology, University of South Carolina, Columbia, SC, USA
- Jongwan Kim
- Department of Psychology, University of South Carolina, Columbia, SC, USA
- Christine E. Weber
- Department of Psychology, University of South Carolina, Columbia, SC, USA
31
Kim J, Shinkareva SV, Wedell DH. Representations of modality-general valence for videos and music derived from fMRI data. Neuroimage 2017; 148:42-54. [DOI: 10.1016/j.neuroimage.2017.01.002]
32
Soares JM, Magalhães R, Moreira PS, Sousa A, Ganz E, Sampaio A, Alves V, Marques P, Sousa N. A Hitchhiker's Guide to Functional Magnetic Resonance Imaging. Front Neurosci 2016; 10:515. [PMID: 27891073] [PMCID: PMC5102908] [DOI: 10.3389/fnins.2016.00515]
Abstract
Functional Magnetic Resonance Imaging (fMRI) studies have become increasingly popular both with clinicians and researchers as they are capable of providing unique insights into brain functions. However, multiple technical considerations (ranging from specifics of paradigm design to imaging artifacts, complex protocol definition, and multitude of processing and methods of analysis, as well as intrinsic methodological limitations) must be considered and addressed in order to optimize fMRI analysis and to arrive at the most accurate and grounded interpretation of the data. In practice, the researcher/clinician must choose, from many available options, the most suitable software tool for each stage of the fMRI analysis pipeline. Herein we provide a straightforward guide designed to address, for each of the major stages, the techniques, and tools involved in the process. We have developed this guide both to help those new to the technique to overcome the most critical difficulties in its use, as well as to serve as a resource for the neuroimaging community.
Affiliation(s)
- José M. Soares
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal
- ICVS/3B's - PT Government Associate Laboratory, Braga, Portugal
- Ricardo Magalhães
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal
- ICVS/3B's - PT Government Associate Laboratory, Braga, Portugal
- Pedro S. Moreira
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal
- ICVS/3B's - PT Government Associate Laboratory, Braga, Portugal
- Alexandre Sousa
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal
- ICVS/3B's - PT Government Associate Laboratory, Braga, Portugal
- Department of Informatics, University of Minho, Braga, Portugal
- Edward Ganz
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal
- ICVS/3B's - PT Government Associate Laboratory, Braga, Portugal
- Adriana Sampaio
- Neuropsychophysiology Lab, CIPsi, School of Psychology, University of Minho, Braga, Portugal
- Victor Alves
- Department of Informatics, University of Minho, Braga, Portugal
- Paulo Marques
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal
- ICVS/3B's - PT Government Associate Laboratory, Braga, Portugal
- Nuno Sousa
- Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal
- ICVS/3B's - PT Government Associate Laboratory, Braga, Portugal
- Clinical Academic Center – Braga, Braga, Portugal
33
Kim J, Wang J, Wedell DH, Shinkareva SV. Identifying Core Affect in Individuals from fMRI Responses to Dynamic Naturalistic Audiovisual Stimuli. PLoS One 2016; 11:e0161589. [PMID: 27598534] [PMCID: PMC5012606] [DOI: 10.1371/journal.pone.0161589]
Abstract
Recent research has demonstrated that affective states elicited by viewing pictures varying in valence and arousal are identifiable from whole brain activation patterns observed with functional magnetic resonance imaging (fMRI). Identification of affective states from more naturalistic stimuli has clinical relevance, but the feasibility of identifying these states on an individual trial basis from fMRI data elicited by dynamic multimodal stimuli is unclear. The goal of this study was to determine whether affective states can be similarly identified when participants view dynamic naturalistic audiovisual stimuli. Eleven participants viewed 5s audiovisual clips in a passive viewing task in the scanner. Valence and arousal for individual trials were identified both within and across participants based on distributed patterns of activity in areas selectively responsive to audiovisual naturalistic stimuli while controlling for lower level features of the stimuli. In addition, the brain regions identified by searchlight analyses to represent valence and arousal were consistent with previously identified regions associated with emotion processing. These findings extend previous results on the distributed representation of affect to multimodal dynamic stimuli.
Affiliation(s)
- Jongwan Kim
- Department of Psychology, University of South Carolina, Columbia, South Carolina, United States of America
- Jing Wang
- Department of Psychology, Carnegie Mellon University, Pittsburgh, Pennsylvania, United States of America
- Douglas H. Wedell
- Department of Psychology, University of South Carolina, Columbia, South Carolina, United States of America
- Svetlana V. Shinkareva
- Department of Psychology, University of South Carolina, Columbia, South Carolina, United States of America
34
Kragel PA, LaBar KS. Decoding the Nature of Emotion in the Brain. Trends Cogn Sci 2016; 20:444-455. [PMID: 27133227] [DOI: 10.1016/j.tics.2016.03.011]
Abstract
A central, unresolved problem in affective neuroscience is understanding how emotions are represented in nervous system activity. After prior localization approaches largely failed, researchers began applying multivariate statistical tools to reconceptualize how emotion constructs might be embedded in large-scale brain networks. Findings from pattern analyses of neuroimaging data show that affective dimensions and emotion categories are uniquely represented in the activity of distributed neural systems that span cortical and subcortical regions. Results from multiple-category decoding studies are incompatible with theories postulating that specific emotions emerge from the neural coding of valence and arousal. This 'new look' into emotion representation promises to improve and reformulate neurobiological models of affect.
Affiliation(s)
- Philip A Kragel
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA
- Kevin S LaBar
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA
35
Kim J, Wedell DH. Comparison of physiological responses to affect eliciting pictures and music. Int J Psychophysiol 2016; 101:9-17. [DOI: 10.1016/j.ijpsycho.2015.12.011]
36
Folyi T, Wentura D. Fast and unintentional evaluation of emotional sounds: evidence from brief segment ratings and the affective Simon task. Cogn Emot 2015; 31:312-324. [DOI: 10.1080/02699931.2015.1110514]
Affiliation(s)
- Tímea Folyi
- Department of Psychology, Saarland University, Saarbrücken, Germany
- Dirk Wentura
- Department of Psychology, Saarland University, Saarbrücken, Germany
37
Kragel PA, LaBar KS. Multivariate neural biomarkers of emotional states are categorically distinct. Soc Cogn Affect Neurosci 2015; 10:1437-48. [PMID: 25813790] [DOI: 10.1093/scan/nsv032]
Abstract
Understanding how emotions are represented neurally is a central aim of affective neuroscience. Despite decades of neuroimaging efforts addressing this question, it remains unclear whether emotions are represented as distinct entities, as predicted by categorical theories, or are constructed from a smaller set of underlying factors, as predicted by dimensional accounts. Here, we capitalize on multivariate statistical approaches and computational modeling to directly evaluate these theoretical perspectives. We elicited discrete emotional states using music and films during functional magnetic resonance imaging scanning. Distinct patterns of neural activation predicted the emotion category of stimuli and tracked subjective experience. Bayesian model comparison revealed that combining dimensional and categorical models of emotion best characterized the information content of activation patterns. Surprisingly, categorical and dimensional aspects of emotion experience captured unique and opposing sources of neural information. These results indicate that diverse emotional states are poorly differentiated by simple models of valence and arousal, and that activity within separable neural systems can be mapped to unique emotion categories.
Affiliation(s)
- Philip A Kragel
- Department of Psychology & Neuroscience and Center for Cognitive Neuroscience, Duke University, Durham, NC, USA
- Kevin S LaBar
- Department of Psychology & Neuroscience and Center for Cognitive Neuroscience, Duke University, Durham, NC, USA
38
Gerdes ABM, Wieser MJ, Alpers GW. Emotional pictures and sounds: a review of multimodal interactions of emotion cues in multiple domains. Front Psychol 2014; 5:1351. [PMID: 25520679] [PMCID: PMC4248815] [DOI: 10.3389/fpsyg.2014.01351]
Abstract
In everyday life, multiple sensory channels jointly trigger emotional experiences and one channel may alter processing in another channel. For example, seeing an emotional facial expression and hearing the voice’s emotional tone will jointly create the emotional experience. This example, where auditory and visual input is related to social communication, has gained considerable attention from researchers. However, interactions of visual and auditory emotional information are not limited to social communication but can extend to much broader contexts including human, animal, and environmental cues. In this article, we review current research on audiovisual emotion processing beyond face-voice stimuli to develop a broader perspective on multimodal interactions in emotion processing. We argue that current concepts of multimodality should be extended in considering an ecologically valid variety of stimuli in audiovisual emotion processing. Therefore, we provide an overview of studies in which emotional sounds and interactions with complex pictures of scenes were investigated. In addition to behavioral studies, we focus on neuroimaging, electrophysiological, and peripheral physiological findings. Furthermore, we integrate these findings and identify similarities or differences. We conclude with suggestions for future research.
Affiliation(s)
- Antje B M Gerdes
- Clinical and Biological Psychology, Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany
- Georg W Alpers
- Clinical and Biological Psychology, Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany; Otto-Selz Institute, University of Mannheim, Mannheim, Germany