1. Zhou B, Feng G, Chen W, Zhou W. Olfaction Warps Visual Time Perception. Cereb Cortex 2018; 28:1718-1728. [PMID: 28334302] [DOI: 10.1093/cercor/bhx068]
Abstract
Our perception of the world builds upon dynamic inputs from multiple senses with different temporal resolutions, and is threaded with the passing of subjective time. How time is extracted from multisensory inputs remains scarcely known. Using psychophysical testing and electroencephalography, we show in healthy human adults that odors modulate object visibility around the critical flicker-fusion frequency (CFF), the limit at which chromatic flickers come to be perceived as a stable color, and effectively alter the CFF in a congruency-based manner, even though they afford no clear environmental temporal information. The behavioral gain produced by a congruent relative to an incongruent odor is accompanied by elevated neural oscillatory power around the object's flicker frequency in the right temporal region ~150-300 ms after object onset, and is not mediated by visual awareness. In parallel, odors bias the subjective duration of visual objects without affecting one's temporal sensitivity. These findings point to a neuronal network in the right temporal cortex that executes flexible temporal filtering of upstream visual inputs based on olfactory information. Moreover, they collectively indicate that the very process of sensory integration at the stage of object processing warps time perception, casting new light on the neural timing of multisensory events.
Affiliation(s)
- Bin Zhou
- Institute of Psychology, CAS Key Laboratory of Behavioral Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- Guo Feng
- Institute of Psychology, CAS Key Laboratory of Behavioral Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- Wei Chen
- Institute of Psychology, CAS Key Laboratory of Behavioral Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- Wen Zhou
- Institute of Psychology, CAS Key Laboratory of Behavioral Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
2.
Abstract
Perception, cognition, and emotion do not operate along segregated pathways; rather, various sources of evidence support their adaptive interaction. For instance, the aesthetic appraisal of powerful mood inducers like music can bias the perception of facial expressions of emotion towards mood congruency. In four experiments we demonstrated similar mood-congruency effects elicited by the comfort/discomfort of body actions. Using a novel Motor Action Mood Induction Procedure, we had participants perform comfortable/uncomfortable visually guided reaches and then tested them in a facial emotion identification task. Through the putative mediation of mood induced by motor action, action comfort enhanced the quality of the participant’s global experience (a neutral face appeared happy and a slightly angry face neutral), whereas action discomfort made a neutral face appear angry and a slightly happy face neutral. Furthermore, uncomfortable (but not comfortable) reaching improved sensitivity in the identification of emotional faces and reduced the identification time of facial expressions, possibly an effect of hyper-arousal from an unpleasant bodily experience.
3. Lacey S, Sathian K. Visuo-haptic multisensory object recognition, categorization, and representation. Front Psychol 2014; 5:730. [PMID: 25101014] [PMCID: PMC4102085] [DOI: 10.3389/fpsyg.2014.00730]
Abstract
Visual and haptic unisensory object processing show many similarities in terms of categorization, recognition, and representation. In this review, we discuss how these similarities contribute to multisensory object processing. In particular, we show that similar unisensory visual and haptic representations lead to a shared multisensory representation underlying both cross-modal object recognition and view-independence. This shared representation suggests a common neural substrate, and we review several candidate brain regions, previously thought to be specialized for aspects of visual processing, that are now known also to be involved in analogous haptic tasks. Finally, we lay out the evidence for a model of multisensory object recognition in which top-down and bottom-up pathways to the object-selective lateral occipital complex are modulated by object familiarity and individual differences in object and spatial imagery.
Affiliation(s)
- Simon Lacey
- Department of Neurology, Emory University School of Medicine, Atlanta, GA, USA
- K Sathian
- Department of Neurology, Emory University School of Medicine, Atlanta, GA, USA; Department of Rehabilitation Medicine, Emory University School of Medicine, Atlanta, GA, USA; Department of Psychology, Emory University School of Medicine, Atlanta, GA, USA; Rehabilitation Research and Development Center of Excellence, Atlanta Veterans Affairs Medical Center, Decatur, GA, USA
4. Matsumiya K. Seeing a haptically explored face: visual facial-expression aftereffect from haptic adaptation to a face. Psychol Sci 2013; 24:2088-2098. [PMID: 24002886] [DOI: 10.1177/0956797613486981]
Abstract
Current views on face perception assume that the visual system receives only visual facial signals. However, I show that the visual perception of faces is systematically biased by adaptation to a haptically explored face. Recently, face aftereffects (FAEs; the altered perception of faces after adaptation to a face) have been demonstrated not only in visual perception but also in haptic perception; therefore, I combined the two FAEs to examine whether the visual system receives face-related signals from the haptic modality. I found that adaptation to a haptically explored facial expression on a face mask produced a visual FAE for facial expression. This cross-modal FAE was not due to explicitly imagining a face, response bias, or adaptation to local features. Furthermore, FAEs transferred from vision to haptics. These results indicate that visual face processing depends on substrates adapted by haptic faces, which suggests that face processing relies on a shared representation underlying cross-modal interactions.