1. Cai YJF, Allar IB, Maier JX. Taste enhances the ability to express a preference for a congruent odor in rats. Behav Neurosci 2024;138:433-440. PMID: 39298234; PMCID: PMC11631660; DOI: 10.1037/bne0000605.
Abstract
Foods that make up a typical diet are characterized by a rich set of sensory qualities that are perceived through multiple modalities. It is well known that multisensory aspects of food are integrated to create our perception of flavor, which in turn affects our behavioral responses to food. However, the principles underlying multisensory integration of flavor-related sensory signals, and how they inform perceptual judgments, remain poorly understood, partly due to a lack of control over flavor experience in human subjects. Here, we used rats as a model to overcome this limitation and tested the hypothesis that taste can enhance the discriminability of retronasal odor cues. In a series of two-bottle tests, animals chose between two odorized solutions after learning to associate one of the odors with saccharin. When the odors were highly similar, animals showed little preference for the saccharin-associated odor. When saccharin was added to both bottles, rendering one of the solutions congruent, the animals' preference for the saccharin-associated odor was significantly enhanced. No effect of taste was observed with dissimilar odor pairs or novel taste stimuli. These findings suggest that congruent taste stimuli selectively enhance odor identity representations, aiding the discriminability of perceptually similar flavors.
Affiliation(s)
- Yuan JF Cai, Department of Translational Neuroscience, Wake Forest University School of Medicine, Winston-Salem, NC
- Isabella B Allar, Department of Translational Neuroscience, Wake Forest University School of Medicine, Winston-Salem, NC
- Joost X Maier, Department of Translational Neuroscience, Wake Forest University School of Medicine, Winston-Salem, NC
2. Hassan S, Wang L, Mahmud KR. Robotic Odor Source Localization via Vision and Olfaction Fusion Navigation Algorithm. Sensors (Basel) 2024;24:2309. PMID: 38610520; PMCID: PMC11014090; DOI: 10.3390/s24072309.
Abstract
Robotic odor source localization (OSL) is a technology that enables mobile robots or autonomous vehicles to find an odor source in unknown environments. An effective navigation algorithm that guides the robot to approach the odor source is the key to successfully locating the odor source. While traditional OSL approaches primarily utilize an olfaction-only strategy, guiding robots to find the odor source by tracing emitted odor plumes, our work introduces a fusion navigation algorithm that combines both vision- and olfaction-based techniques. This hybrid approach addresses challenges such as turbulent airflow, which disrupts olfaction sensing, and physical obstacles inside the search area, which may impede vision detection. In this work, we propose a hierarchical control mechanism that dynamically shifts the robot's search behavior among four strategies: crosswind maneuver, obstacle-avoid navigation, vision-based navigation, and olfaction-based navigation. Our methodology includes a custom-trained deep-learning model for visual target detection and a moth-inspired algorithm for olfaction-based navigation. To assess the effectiveness of our approach, we implemented the proposed algorithm on a mobile robot in a search environment with obstacles. Experimental results demonstrate that our vision and olfaction fusion algorithm significantly outperforms vision-only and olfaction-only methods, reducing average search time by 54% and 30%, respectively.
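The hierarchical control mechanism described in this abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the sensor inputs (an obstacle distance, a visual detection flag, an odor detection flag) and the priority ordering are assumptions made for illustration:

```python
# Illustrative sketch of a priority-based hierarchical behavior switcher
# for odor source localization. Sensor names and the threshold value are
# assumptions, not taken from the paper.
def select_behavior(obstacle_dist_m: float,
                    target_in_view: bool,
                    odor_detected: bool,
                    obstacle_threshold_m: float = 0.5) -> str:
    """Choose one of four navigation strategies by fixed priority."""
    if obstacle_dist_m < obstacle_threshold_m:
        return "obstacle-avoid"    # safety overrides everything else
    if target_in_view:
        return "vision-based"      # steer toward the visually detected source
    if odor_detected:
        return "olfaction-based"   # moth-inspired plume tracing
    return "crosswind"             # sweep across the wind to reacquire the plume
```

At each control step the robot would re-evaluate its sensors and may switch strategies, so vision can take over when the plume is lost and plume tracing can take over when obstacles block the view.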
Affiliation(s)
- Sunzid Hassan, Department of Computer Science, Louisiana Tech University, 201 Mayfield Ave., Ruston, LA 71272, USA
- Lingxiao Wang, Department of Electrical Engineering, Louisiana Tech University, 201 Mayfield Ave., Ruston, LA 71272, USA
- Khan Raqib Mahmud, Department of Computer Science, Louisiana Tech University, 201 Mayfield Ave., Ruston, LA 71272, USA
3. Ma H, Fang H, Xie X, Liu Y, Tian H, Chai Y. Optoelectronic Synapses Based on MXene/Violet Phosphorus van der Waals Heterojunctions for Visual-Olfactory Crossmodal Perception. Nano-Micro Letters 2024;16:104. PMID: 38300424; PMCID: PMC10834395; DOI: 10.1007/s40820-024-01330-7.
Abstract
The crossmodal interaction of different senses, an important basis for learning and memory in the human brain, is highly desirable to mimic at the device level for developing neuromorphic crossmodal perception, but related research is scarce. Here, we demonstrate an optoelectronic synapse for vision-olfactory crossmodal perception based on MXene/violet phosphorus (VP) van der Waals heterojunctions. Benefiting from the efficient separation and transport of photogenerated carriers facilitated by conductive MXene, the photoelectric responsivity of VP is dramatically enhanced by 7 orders of magnitude, reaching up to 7.7 A/W. Excited by ultraviolet light, multiple synaptic functions, including excitatory postsynaptic currents, paired-pulse facilitation, short/long-term plasticity, and "learning-experience" behavior, were demonstrated with low power consumption. Furthermore, the proposed optoelectronic synapse exhibits distinct synaptic behaviors in different gas environments, enabling it to simulate the interaction of visual and olfactory information for crossmodal perception. This work demonstrates the great potential of VP in optoelectronics and provides a promising platform for applications such as virtual reality and neurorobotics.
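Of the synaptic functions listed, paired-pulse facilitation has a simple conventional quantification: the amplitude of the second postsynaptic response relative to the first. A minimal sketch with made-up amplitudes, not data from the paper:

```python
def ppf_index(first_epsc: float, second_epsc: float) -> float:
    """Paired-pulse facilitation index (%): second excitatory
    postsynaptic current amplitude relative to the first."""
    return 100.0 * second_epsc / first_epsc
```

A value above 100% indicates facilitation of the second response; below 100% indicates paired-pulse depression.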
Affiliation(s)
- Hailong Ma, Center for Advancing Materials Performance From the Nanoscale (CAMP-Nano), State Key Laboratory for Mechanical Behavior of Materials, Xi'an Jiaotong University, Xi'an, 710049, People's Republic of China
- Huajing Fang, Center for Advancing Materials Performance From the Nanoscale (CAMP-Nano), State Key Laboratory for Mechanical Behavior of Materials, Xi'an Jiaotong University, Xi'an, 710049, People's Republic of China
- Xinxing Xie, Center for Advancing Materials Performance From the Nanoscale (CAMP-Nano), State Key Laboratory for Mechanical Behavior of Materials, Xi'an Jiaotong University, Xi'an, 710049, People's Republic of China
- Yanming Liu, Institute of Microelectronics and Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing, 100084, People's Republic of China
- He Tian, Institute of Microelectronics and Beijing National Research Center for Information Science and Technology (BNRist), Tsinghua University, Beijing, 100084, People's Republic of China
- Yang Chai, Department of Applied Physics, The Hong Kong Polytechnic University, Hong Kong, People's Republic of China
4. Jordan KA, Sprayberry JD, Joiner WM, Combes SA. Multimodal processing of noisy cues in bumblebees. iScience 2024;27:108587. PMID: 38161424; PMCID: PMC10755353; DOI: 10.1016/j.isci.2023.108587.
Abstract
Multimodal cues can improve behavioral responses by enhancing the detection and localization of sensory cues and reducing response times. Across species, studies have shown that multisensory integration of visual and olfactory cues can improve response accuracy. However, in real-world settings, sensory cues are often noisy; visual and olfactory cues can be deteriorated, masked, or mixed, making the target cue less clear to the receiver. In this study, we use an associative learning paradigm (Free Moving Proboscis Extension Reflex, FMPER) to show that multimodal cues may improve the accuracy of bees' responses to noisy cues. Adding a noisy visual cue improves the accuracy of responses to a noisy olfactory cue, despite neither the clear nor the noisy visual cue being sufficient when paired with a novel olfactory cue. This may provide insight into the neural mechanisms underlying multimodal processing and the effects of environmental change on pollination services.
Affiliation(s)
- Katherine A. Jordan, Department of Neurobiology, Physiology, and Behavior, University of California, Davis, Davis, CA 95616, USA
- Wilsaan M. Joiner, Department of Neurobiology, Physiology, and Behavior, University of California, Davis, Davis, CA 95616, USA
- Stacey A. Combes, Department of Neurobiology, Physiology, and Behavior, University of California, Davis, Davis, CA 95616, USA
5. Damon F, Mezrai N, Magnier L, Leleu A, Durand K, Schaal B. Olfaction in the Multisensory Processing of Faces: A Narrative Review of the Influence of Human Body Odors. Front Psychol 2021;12:750944. PMID: 34675855; PMCID: PMC8523678; DOI: 10.3389/fpsyg.2021.750944.
Abstract
A recent body of research has emerged regarding the interactions between olfaction and other sensory channels to process social information. The current review examines the influence of body odors on face perception, a core component of human social cognition. First, we review studies reporting how body odors interact with the perception of invariant facial information (i.e., identity, sex, attractiveness, trustworthiness, and dominance). Although we mainly focus on the influence of body odors based on axillary odor, we also review findings about specific steroids present in axillary sweat (i.e., androstenone, androstenol, androstadienone, and estratetraenol). We next survey the literature showing body odor influences on the perception of transient face properties, notably in discussing the role of body odors in facilitating or hindering the perception of emotional facial expression, in relation to competing frameworks of emotions. Finally, we discuss the developmental origins of these olfaction-to-vision influences, as an emerging literature indicates that odor cues strongly influence face perception in infants. Body odors with a high social relevance such as the odor emanating from the mother have a widespread influence on various aspects of face perception in infancy, including categorization of faces among other objects, face scanning behavior, or facial expression perception. We conclude by suggesting that the weight of olfaction might be especially strong in infancy, shaping social perception, especially in slow-maturing senses such as vision, and that this early tutoring function of olfaction spans all developmental stages to disambiguate a complex social environment by conveying key information for social interactions until adulthood.
Affiliation(s)
- Fabrice Damon, Developmental Ethology and Cognitive Psychology Laboratory, Centre des Sciences du Goût et de l’Alimentation, Inrae, AgroSup Dijon, CNRS (UMR 6265), Université Bourgogne Franche-Comté, Dijon, France
6. Tsushima Y, Nishino Y, Ando H. Olfactory Stimulation Modulates Visual Perception Without Training. Front Neurosci 2021;15:642584. PMID: 34408620; PMCID: PMC8364961; DOI: 10.3389/fnins.2021.642584.
Abstract
Considerable research shows that olfactory stimulation affects other modalities in high-level cognitive functions such as emotion. However, it is less well known that olfaction also modulates low-level perception in other sensory modalities. Although some studies have shown that olfaction influences other kinds of low-level perception, all of them required specific experience such as perceptual training. To test the possibility that olfaction modulates low-level perception without training, we conducted a series of psychophysical and neuroimaging experiments. From the results of a visual task in which participants reported the speed of moving dots, we found that participants perceived slower motion with a lemon smell and faster motion with a vanilla smell, without any specific training. In functional magnetic resonance imaging (fMRI) studies, brain activity in the visual cortices [V1 and human middle temporal area (hMT)] changed based on the type of olfactory stimulation. Our findings provide the first direct evidence that olfaction modulates low-level visual perception without training, indicating that the olfactory-visual effect is not an acquired behavior but an innate one. The present results reveal a new crossmodal effect between olfaction and vision and offer a unique opportunity to reconsider some fundamental roles of olfactory function.
Affiliation(s)
- Yoshiaki Tsushima, National Institute of Information and Communications Technology, Center for Information and Neural Networks, Osaka, Japan
- Yurie Nishino, National Institute of Information and Communications Technology, Center for Information and Neural Networks, Osaka, Japan
- Hiroshi Ando, National Institute of Information and Communications Technology, Universal Communication Research Institute, Kyoto, Japan
7. Invitto S, Keshmiri S, Mazzatenta A, Grasso A, Romano D, Bona F, Shiomi M, Sumioka H, Ishiguro H. Perception of Social Odor and Gender-Related Differences Investigated Through the Use of Transfer Entropy and Embodied Medium. Front Syst Neurosci 2021;15:650528. PMID: 34177474; PMCID: PMC8232750; DOI: 10.3389/fnsys.2021.650528.
Abstract
The perception of putative pheromones or social odors (PPSO) in humans is a widely debated topic because published results seem ambiguous. Our research aimed to evaluate how cross-modal processing of PPSO and voice gender can affect the behavioral and psychophysiological state of a subject during a listening task with a bodily contact medium, and how these effects could be gender related. Before the experimental session, three embodied media were exposed to volatilized estratetraenol (Estr), 5α-androst-16-en-3α-ol (Andr), or Vaseline oil. The experimental session consisted of listening to a story transmitted, in a male or female voice, by the communicative medium via a Bluetooth system, while 64-active-channel electroencephalography (EEG) was recorded. The sense of co-presence and social presence elicited by the medium showed that the relationship established with the medium was gender dependent and modulated by the PPSO. In particular, Andr induced greater responses related to co-presence. Participant gender was related to co-presence desire: women imagined higher medium co-presence than men. EEG findings seemed more responsive to the PPSO-voice gender interaction than the behavioral results. The mismatch between the female PPSO and a male voice elicited the greatest cortical flow of information. In the Andr-male voice condition, the trained model appeared to assign more relevance to the flow of information to the right frontotemporal regions (involved in odor recognition memory and social behavior). The Estr-male voice condition showed activation of the bilateral frontoparietal network, which is linked to cognitive control, cognitive flexibility, and auditory consciousness. The model appears to distinguish the dissonance condition linked to Andr matched with a female voice: it highlights a flow of information to the right occipital lobe and to the frontal pole. The PPSO could influence co-presence judgments and the EEG response. The results suggest that there could be an implicit pattern linking PPSO-related gender differences and voice gender.
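Transfer entropy, the directed measure of information flow named in the title, is defined for discrete signals as TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log2[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]. A minimal plug-in estimator with history length 1, as an illustrative sketch rather than the authors' EEG pipeline:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) in bits, history length 1,
    for two equal-length sequences of discrete symbols."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_t+1, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_t+1, y_t)
    singles = Counter(y[:-1])                       # y_t
    n = len(y) - 1
    te = 0.0
    for (yn, yt, xt), c in triples.items():
        p_joint = c / n                             # p(y_t+1, y_t, x_t)
        p_full = c / pairs_yx[(yt, xt)]             # p(y_t+1 | y_t, x_t)
        p_self = pairs_yy[(yn, yt)] / singles[yt]   # p(y_t+1 | y_t)
        te += p_joint * log2(p_full / p_self)
    return te
```

Estimating such quantities between pairs of electrodes, in both directions, is one way "flow of information" maps are built; real EEG applications need longer histories, continuous-value estimators, and bias correction.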
Affiliation(s)
- Sara Invitto, INSPIRE-Laboratory of Cognitive and Psychophysiological Olfactory Processes, Department of Biological and Environmental Sciences and Technologies, University of Salento, Lecce, Italy
- Soheil Keshmiri, The Thomas N. Sato BioMEC-X Laboratories, Advanced Telecommunications Research Institute International, Kyoto, Japan
- Andrea Mazzatenta, Neurophysiology, Olfaction and Chemoreception Laboratory, Physiology and Physiopathology Section, Neuroscience, Imaging and Clinical Sciences Department, 'G. d'Annunzio' University of Chieti-Pescara, Chieti, Italy
- Alberto Grasso, INSPIRE-Laboratory of Cognitive and Psychophysiological Olfactory Processes, Department of Biological and Environmental Sciences and Technologies, University of Salento, Lecce, Italy
- Daniele Romano, Department of Psychology and NeuroMi, University of Milano-Bicocca, Milan, Italy; Department of History, Society and Human Studies, University of Salento, Lecce, Italy
- Fabio Bona, INSPIRE-Laboratory of Cognitive and Psychophysiological Olfactory Processes, Department of Biological and Environmental Sciences and Technologies, University of Salento, Lecce, Italy
- Masahiro Shiomi, Interaction Science Laboratories, Advanced Telecommunications Research Institute International, Kyoto, Japan
- Hidenobu Sumioka, Hiroshi Ishiguro Laboratories, Advanced Telecommunications Research Institute International, Kyoto, Japan
- Hiroshi Ishiguro, Hiroshi Ishiguro Laboratories, Advanced Telecommunications Research Institute International, Kyoto, Japan; Graduate School of Engineering Science, Osaka University, Osaka, Japan
8. Sabiniewicz A. Smells Influence Perceived Pleasantness but Not Memorization of a Visual Virtual Environment. Iperception 2021;12:2041669521989731. PMID: 33868626; PMCID: PMC8020408; DOI: 10.1177/2041669521989731.
Abstract
The present study aimed to investigate whether the perception of still scenes in a virtual environment can be influenced by congruent versus incongruent odors. Ninety healthy participants were divided into three groups across two experimental virtual reality (VR) environments (a rose garden and an orange basket) and a control condition. In each VR condition, participants were exposed to a rose odor, an orange odor, or no odor, resulting in congruent, incongruent, and control conditions. Participants were asked to describe (a) the content of the VR scene and rate its overall pleasantness and (b) the smell, rating its intensity and pleasantness. In each condition, participants were tested twice. During the second test, participants provided ratings and descriptions of the content of the VR scenes without being exposed to the odors or VR environments. Virtual scenarios tended to be remembered as more pleasant when presented with congruent odors. Furthermore, participants used more descriptors in congruent scenarios than in incongruent scenarios. Finally, the rose odor appeared to be remembered as more pleasant when presented within a congruent scenario. These findings show that congruent versus incongruent olfactory stimuli can modulate the perceived pleasantness of visual scenes but not their memorization.
Affiliation(s)
- Agnieszka Sabiniewicz, Interdisciplinary Center “Smell & Taste”, Department of Otorhinolaryngology, TU Dresden, Dresden, Germany; Institute of Psychology, University of Wrocław, Wrocław, Poland
9. Spence C. Scenting the Anosmic Cube: On the Use of Ambient Scent in the Context of the Art Gallery or Museum. Iperception 2020;11:2041669520966628. PMID: 33282169; PMCID: PMC7686631; DOI: 10.1177/2041669520966628.
Abstract
In recent years, there has been growing interest in augmenting visitors' experience of exhibits in art galleries and museums by delivering a genuinely multisensory experience, one that engages more than just the visual sense. This approach promises to increase engagement while also helping to address, in some small way, issues around accessibility for visually impaired visitors. One increasingly popular approach to multisensory experience design involves the use of scents chosen to match, or augment, the art or museum display in some way. The various kinds of congruency between olfaction and vision that have been investigated by researchers and/or already incorporated into art/museum displays are reviewed. However, while the laboratory research does indeed suggest that people's experience of paintings (or rather, reproductions or photos of works of art) may well be influenced by the presence of an ambient odour, the results are by no means guaranteed to be positive, either in terms of the emotional response while viewing the display or in terms of the viewer's subsequent recall of the multisensory experience. As such, caution is advised for those considering whether to augment multisensory displays/exhibits with ambient scent.
Affiliation(s)
- Charles Spence, Crossmodal Research Laboratory, Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom
10. Yang Y, Wang X. Odor Modulates Hand Movements in a Reach-to-Grasp Task. Front Neurosci 2020;14:560. PMID: 32612498; PMCID: PMC7308559; DOI: 10.3389/fnins.2020.00560.
Abstract
Recent evidence suggests that target-relevant sensory stimuli (i.e., visual, auditory, and olfactory) can play important roles in the motor system. However, little is known about the effects of olfactory information on reaching and grasping movements. To determine whether odor stimuli affect hand movements, the reaching and grasping kinematics of 29 human participants were recorded using a three-dimensional video motion capture system. Participants received an odor stimulus via Sniffin’ Sticks and then reached toward and grasped a target. Grasping targets were apple, orange, ginger, and garlic, and the odor stimulus was congruent with the target. The odor-cued object (OCO) was the same size as, smaller than, or larger than the target to be grasped, or participants received odorless air while they viewed the target. They reached for the target with one of two grips: a precision grip for a small target or a power grip for a larger target. In half of the 80 trials, visual feedback was removed after the start signal, so the target was no longer visible while participants reached for it. Repeated-measures analyses of variance followed by simple-effects analyses showed that when the size of the hand movement evoked by the odor cue was congruent with the size of the target, either both small or both large, reaction time was significantly shorter than with odorless air. When participants received visual feedback throughout the trial, movement duration was significantly shorter if the odor cue was congruent with the size of the target or if odorless air was dispensed. When the size of the hand movement evoked by the odor cue was incongruent with the size of the target, an interference effect was apparent on the time of maximum aperture; in the closed-loop (full vision) condition, this time was shorter for the odorless-air control than for the incongruent-odor condition. In addition, visual feedback influenced the results such that maximum aperture occurred later when visibility was blocked, but only in the odorless-air control condition. These results suggest that olfactory information has a positive effect on reach-to-grasp hand movements and that vision and olfaction may interact to optimize motor behavior.
Affiliation(s)
- Yang Yang, School of Psychology, Shanghai University of Sport, Shanghai, China
- Xiaochun Wang, School of Psychology, Shanghai University of Sport, Shanghai, China
11. Differential Rapid Plasticity in Auditory and Visual Responses in the Primarily Multisensory Orbitofrontal Cortex. eNeuro 2020;7:ENEURO.0061-20.2020. PMID: 32424057; PMCID: PMC7294472; DOI: 10.1523/eneuro.0061-20.2020.
Abstract
Given the connectivity of the orbitofrontal cortex (OFC) with sensory areas and areas involved in goal execution, it is likely that the OFC, along with its function in reward processing, also has a role to play in perception-based multisensory decision-making. To understand the mechanisms involved in multisensory decision-making, it is important to first know how different sensory stimuli are encoded in single neurons of the mouse OFC. Ruling out effects of behavioral state, memory, and other factors, we studied responses of the anesthetized mouse OFC to auditory, visual, and audiovisual/multisensory stimuli, multisensory associations, and the sensory-driven input organization of the OFC. Almost all OFC single neurons were found to be multisensory in nature, with sublinear to supralinear integration of the component unisensory stimuli. With a novel multisensory oddball stimulus set, we show that the OFC receives both unisensory and multisensory inputs, further corroborated by retrograde tracers showing labeling in secondary auditory and visual cortices, which we find to have similar multisensory integration and responses. With long audiovisual pairing/association, we show rapid plasticity in OFC single neurons, with a strong visual bias, leading to a strong depression of auditory responses and an effective enhancement of visual responses. Such rapid multisensory-association-driven plasticity is absent in the auditory and visual cortices, suggesting that it emerges in the OFC. Based on the above results, we propose a hypothetical local circuit model in the OFC that integrates auditory and visual information and participates in computing stimulus value in dynamic multisensory environments.
12. Hörberg T, Larsson M, Ekström I, Sandöy C, Lundén P, Olofsson JK. Olfactory Influences on Visual Categorization: Behavioral and ERP Evidence. Cereb Cortex 2020;30:4220-4237. PMID: 32232368; PMCID: PMC7264693; DOI: 10.1093/cercor/bhaa050.
Abstract
Visual stimuli often dominate nonvisual stimuli during multisensory perception. Evidence suggests higher cognitive processes prioritize visual over nonvisual stimuli during divided attention. Visual stimuli should thus be disproportionally distracting when processing incongruent cross-sensory stimulus pairs. We tested this assumption by comparing visual processing with olfaction, a “primitive” sensory channel that detects potentially hazardous chemicals by alerting attention. Behavioral and event-related brain potentials (ERPs) were assessed in a bimodal object categorization task with congruent or incongruent odor–picture pairings and a delayed auditory target that indicated whether olfactory or visual cues should be categorized. For congruent pairings, accuracy was higher for visual compared to olfactory decisions. However, for incongruent pairings, reaction times (RTs) were faster for olfactory decisions. Behavioral results suggested that incongruent odors interfered more with visual decisions, thereby providing evidence for an “olfactory dominance” effect. Categorization of incongruent pairings engendered a late “slow wave” ERP effect. Importantly, this effect had a later amplitude peak and longer latency during visual decisions, likely reflecting additional categorization effort for visual stimuli in the presence of incongruent odors. In sum, contrary to what might be inferred from theories of “visual dominance,” incongruent odors may in fact uniquely attract mental processing resources during perceptual incongruence.
Affiliation(s)
- Thomas Hörberg, Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Sweden; Department of Linguistics, Stockholm University, Sweden
- Maria Larsson, Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Sweden
- Ingrid Ekström, Department of Linguistics, Stockholm University, Sweden; Aging Research Center (ARC), Karolinska Institute, Sweden
- Camilla Sandöy, Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Sweden
- Peter Lundén, Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Sweden
- Jonas K Olofsson, Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Sweden
13. Kuang S, Deng H, Zhang T. Adaptive heading performance during self-motion perception. Psych J 2019;9:295-305. PMID: 31814320; DOI: 10.1002/pchj.330.
Abstract
Previous studies have documented that the perception of self-motion direction can be extracted from the patterns of image motion on the retina (also termed optic flow). Self-motion perception remains stable even when the optic-flow information is distorted by concurrent gaze shifts from body/eye rotations. This has been interpreted as evidence that extraretinal signals (efference copies of eye/body movements) are involved in compensating for retinal distortions. Here, we tested an alternative hypothesis to this extraretinal interpretation. We hypothesized that accurate self-motion perception can be achieved through a purely optic-flow-based visual strategy acquired through experience, independent of extraretinal mechanisms. To test this, we asked human subjects to perform a self-motion direction discrimination task under normal optic flow (fixation condition) or optic flow distorted by either real (pursuit condition) or simulated (simulated condition) eye movements. The task was performed either without (pre- and posttraining) or with (during training) feedback about the correct answer. We first replicated the previous observation that, before training, direction perception was greatly impaired in the simulated condition, where the optic flow was distorted and extraretinal eye-movement signals were absent. We further showed that after a few training sessions, the initial impairment in direction perception gradually improved. These results reveal that behavioral training can promote the exploitation of retinal cues to compensate for the distortion, without a contribution from extraretinal signals. Our results suggest that self-motion perception is a flexible and adaptive process that may depend on neural plasticity in relevant cortical areas.
Affiliation(s)
- Shenbing Kuang, State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Hu Deng, State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Tao Zhang, State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
14
Invitto S, Montinaro R, Ciccarese V, Venturella I, Fronda G, Balconi M. Smell and 3D Haptic Representation: A Common Pathway to Understand Brain Dynamics in a Cross-Modal Task. A Pilot OERP and fNIRS Study. Front Behav Neurosci 2019; 13:226. PMID: 31616263. PMCID: PMC6775200. DOI: 10.3389/fnbeh.2019.00226.
Abstract
Cross-modal perception allows olfactory information to integrate with other sensory modalities. Olfactory representations are processed by multisensory cortical pathways, where aspects related to haptic sensation are integrated. This integration allows the development of a unified percept in which olfactory aspects compete with haptic and/or trigeminal activations. It is assumed that this integration involves both electrophysiological and metabolic/hemodynamic processes, but no studies have evaluated these activations in parallel. The aim of this study was to investigate brain dynamics during a cross-modal olfactory and haptic attention task, preceded by an exploratory session. Cross-modal dynamics were assessed through simultaneous electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) recording, evaluating both electrophysiological and hemodynamic activity. The study consisted of two experimental sessions conducted with a sample of ten healthy subjects (mean age 25 ± 5.2 years). In Session 1, the subjects were trained to manipulate 3D haptic models (HC) and to smell different scents (SC). In Session 2, the subjects were tested during an attentive olfactory task, in order to investigate the olfactory event-related potentials (OERP) N1 and late positive component (LPC), and EEG rhythms associated with fNIRS components (oxy-Hb and deoxy-Hb). The main results highlighted, in Session 1, a higher fNIRS oxy-Hb response during SC and a positive correlation with the delta rhythm in the central and parietal EEG regions of interest. In Session 2, the N1 OERP showed a greater amplitude in SC. In HC, a negative correlation was found between parietal deoxy-Hb and frontal and central N1, and between frontal oxy-Hb and N1 in the frontal, central and parietal regions of interest (ROIs). Parietal LPC amplitude correlated negatively with central deoxy-Hb.
The data suggest that cross-modal valence modifies the attentional olfactory response and that dorsal cortical/metabolic pathways are involved in these responses. This is an important starting point for understanding integrated cognition as subjects would experience it in an ecological context.
Affiliation(s)
- Sara Invitto, Human Anatomy and Neuroscience Laboratory, Department of Biological and Environmental Sciences and Technologies, University of Salento, Lecce, Italy; Laboratory of Interdisciplinary Research Applied to Medicine, University of Salento-Vito Fazzi Hospital, Lecce, Italy
- Roberta Montinaro, Human Anatomy and Neuroscience Laboratory, Department of Biological and Environmental Sciences and Technologies, University of Salento, Lecce, Italy
- Irene Venturella, Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of Milan, Milan, Italy
- Giulia Fronda, Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of Milan, Milan, Italy
- Michela Balconi, Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of Milan, Milan, Italy
15
Distinct forms of motion sensitivity impairments in Alzheimer's disease. Sci Rep 2019; 9:12931. PMID: 31506450. PMCID: PMC6736838. DOI: 10.1038/s41598-019-48942-3.
Abstract
Motion sensitivity impairment in Alzheimer's disease (AD) is often characterized as an elevated coherence threshold. An alternative way to measure motion sensitivity is the direction threshold, i.e., the minimal angle of motion direction that can be discriminated. So far, it is unclear whether and how the direction threshold is altered in AD. Here we asked a group of AD patients and two control groups of healthy participants (young and elderly adults) to judge their perceived heading direction from optic flow stimuli simulating forward translation through the environment. We manipulated the heading direction and the coherence of the optic flow independently and measured the direction and coherence thresholds for each participant. We found that, like the coherence threshold, the direction threshold increased significantly in AD patients compared with healthy controls. Yet, the elevation of the direction threshold was less pronounced than that of the coherence threshold. Moreover, the magnitudes of the direction and coherence thresholds in AD patients were not correlated. Our results suggest that coherence and direction impairments are two distinct forms of motion deficit in AD that might be associated with independent neural mechanisms.
16
Song Y, Wang H. Motion-induced position mis-localization predicts the severity of Alzheimer's disease. J Neuropsychol 2019; 14:333-345. PMID: 30859737. DOI: 10.1111/jnp.12181.
Abstract
Patients with Alzheimer's disease (AD) often exhibit motion processing deficits. It is unclear whether the localization of moving objects, a perceptual process tightly linked to motion, is impaired or intact in AD. In this study, we used the illusory shift of position induced by motion as a behavioural paradigm to probe how spatial representation differs between AD patients and healthy elderly controls. We measured the magnitude of motion-induced position shift in a group of AD participants (N = 24) and age-matched elderly observers (N = 24). We found that AD patients showed weakened position mis-localization, but only for slow-speed motion stimuli. For fast motion, the position mis-localization did not differ significantly between groups. Furthermore, we showed that the magnitude of position mis-localization predicts the severity of AD; that is, patients with more severe symptoms showed weaker position mis-localization. Our results suggest that AD pathology impacts not only motion processing per se, but also perceptual processes related to motion, such as the localization of moving objects.
Affiliation(s)
- Yamin Song, Department of Neurology, Liaocheng People's Hospital, China
- Huiting Wang, Department of Neurology, Liaocheng People's Hospital, China
17
Patnaik B, Batch A, Elmqvist N. Information Olfactation: Harnessing Scent to Convey Data. IEEE Trans Vis Comput Graph 2018; 25:726-736. PMID: 30137003. DOI: 10.1109/tvcg.2018.2865237.
Abstract
Olfactory feedback for analytical tasks is a virtually unexplored area, in spite of the advantages it offers for information recall, feature identification, and location detection. Here we introduce the concept of information olfactation as the fragrant sibling of information visualization, and discuss how scent can be used to convey data. Building on a review of the human olfactory system and mirroring common visualization practice, we propose olfactory marks, the substrate in which they exist, and the olfactory channels that are available to designers. To exemplify this idea, we present viScent, a six-scent stereo olfactory display capable of conveying olfactory glyphs of varying temperature and direction, along with a corresponding software system that integrates the display with a traditional visualization display. Finally, we present three applications that make use of the viScent system: a 2D graph visualization, a 2D line and point chart, and an immersive analytics graph visualization in 3D virtual reality. We close the paper with a review of possible extensions of viScent and applications of information olfactation for general visualization beyond the examples in this paper.
18
Deroy O, Faivre N, Lunghi C, Spence C, Aller M, Noppeney U. The Complex Interplay Between Multisensory Integration and Perceptual Awareness. Multisens Res 2018; 29:585-606. PMID: 27795942. DOI: 10.1163/22134808-00002529.
Abstract
The integration of information has been considered a hallmark of human consciousness, as it requires information to be made globally available via widespread neural interactions. Yet the complex interdependencies between multisensory integration and perceptual awareness, or consciousness, remain to be defined. While perceptual awareness has traditionally been studied in a single sense, recent years have witnessed a surge of interest in the role of multisensory integration in perceptual awareness. Based on a recent IMRF symposium on multisensory awareness, this review discusses three key questions from conceptual, methodological and experimental perspectives: (1) What do we study when we study multisensory awareness? (2) What is the relationship between multisensory integration and perceptual awareness? (3) Which experimental approaches are most promising for characterizing multisensory awareness? We hope that this review will provoke lively discussion, novel experiments, and conceptual considerations that advance our understanding of the multifaceted interplay between multisensory integration and consciousness.
Affiliation(s)
- O Deroy, Centre for the Study of the Senses, Institute of Philosophy, School of Advanced Study, University of London, London, UK
- N Faivre, Laboratory of Cognitive Neuroscience, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- C Lunghi, Department of Translational Research on New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- C Spence, Crossmodal Research Laboratory, Department of Experimental Psychology, Oxford University, Oxford, UK
- M Aller, Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
- U Noppeney, Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
19
Abstract
The localization of object position in space is one of the most important visual abilities in humans. Motion-induced position shift is a perceptual illusion in which the position of a moving object is perceived as shifted in the direction of motion. In this study, we explored whether and how Alzheimer's disease (AD) affects this illusion. We recruited a group of patients with early AD and a group of age-matched healthy controls. In our experiments, two drifting Gabor patches moving in opposite directions were presented, and participants reported whether the upper Gabor appeared rightwards or leftwards of the lower one. We measured psychometric functions, taking the point of subjective alignment as the magnitude of motion-induced position shift, and compared the position shift across the two groups at three retinal eccentricities. We found that position shifts were systematically smaller in the AD group than in the elderly control group, demonstrating that AD patients were less prone to motion-induced position shift. The results add to existing knowledge of perceptual deficits in AD patients, and we suggest that motion-induced position shift may serve as a new behavioral indicator for AD identification.
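The point of subjective alignment described in this abstract is the stimulus offset at which the psychometric function crosses 50%; a nonzero crossing point quantifies the illusory shift. A minimal sketch of that estimation step (a generic cumulative-Gaussian fit with SciPy; the function names and synthetic data are illustrative, not the study's code):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(offset, psa, sigma):
    # Probability of a "rightward" response as a cumulative Gaussian of
    # the physical offset; psa is the 50% crossing point.
    return norm.cdf(offset, loc=psa, scale=sigma)

def fit_psa(offsets, p_rightward):
    # Fit the psychometric function; the fitted psa estimates the
    # magnitude of the motion-induced position shift.
    (psa, sigma), _ = curve_fit(psychometric, offsets, p_rightward,
                                p0=[0.0, 1.0])
    return psa, sigma

# Synthetic observer whose alignment point is shifted by 0.5 deg
offsets = np.linspace(-3.0, 3.0, 13)
p_resp = norm.cdf(offsets, loc=0.5, scale=1.0)
psa, sigma = fit_psa(offsets, p_resp)
```

On this reading, the smaller shifts reported for the AD group correspond to fitted alignment points closer to zero.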
20
Zhuang X, Chen Y, Zhuang X, Xing T, Chen T, Jiang G, Yang X. Impaired Center-Surround Suppression in Patients with Alzheimer's Disease. J Alzheimers Dis 2018; 55:1101-1108. PMID: 27767987. DOI: 10.3233/jad-160603.
Abstract
Alzheimer's disease (AD) is often associated with declined visual processing abilities. Here we tested whether center-surround suppression, a hallmark property of the visual system, is altered by AD. To this end, we recruited three groups of participants (AD, elderly, and young) for a motion direction discrimination task, in which we measured the temporal duration threshold for a drifting Gabor of varying stimulus size. We first replicated the phenomenon of center-surround suppression: the duration required to discriminate the direction of a high-contrast grating increases with increasing stimulus size. We then showed that the magnitude of suppression varied among the three groups, with a progressive reduction of suppression in the elderly and AD groups compared with the young group. Interestingly, we found that the level of suppression predicts the severity of dementia in the AD group. Our results suggest that AD is associated with impaired center-surround functions in the visual motion processing pathway.
Affiliation(s)
- Xianbo Zhuang, Department of Neurology, Liaocheng People's Hospital, Liaocheng, Shandong Province, China
- Yanxiu Chen, Department of Neurology, Liaocheng People's Hospital, Liaocheng, Shandong Province, China
- Xianpeng Zhuang, CT Room, Liaocheng Fourth People's Hospital, Liaocheng, Shandong Province, China
- Tao Xing, Department of Neurosurgery, Liaocheng People's Hospital, Liaocheng, Shandong Province, China
- Tuanzhi Chen, Department of Neurology, Liaocheng People's Hospital, Liaocheng, Shandong Province, China
- Guisheng Jiang, Department of Neurology, Liaocheng People's Hospital, Liaocheng, Shandong Province, China
- Xiafeng Yang, Department of Neurology, Liaocheng People's Hospital, Liaocheng, Shandong Province, China
21
Deng H, Chen W, Kuang S, Zhang T. Distinct Aging Effects on Motion Repulsion and Surround Suppression in Humans. Front Aging Neurosci 2017; 9:363. PMID: 29163143. PMCID: PMC5673999. DOI: 10.3389/fnagi.2017.00363.
Abstract
Elderly adults exhibit accumulating deficits in visual motion perception, which is critical for humans to interact with their environment. Previous studies have suggested that aging generally reduces neuronal inhibition in the visual system. Here, we investigated how aging affects local intra-cortical inhibition using a motion direction discrimination task based on the motion repulsion phenomenon. Motion repulsion refers to the phenomenon whereby observers overestimate the perceived angle between two superimposed dot patterns moving at an acute angle. The misperception has been interpreted as local mutual inhibition between nearby direction-tuned neurons within the same cortical area. We found that elderly participants exhibited much stronger motion repulsion than young adults. We then compared this effect to how aging affects global inter-cortical inhibition by adopting the surround suppression paradigm previously used by Betts et al. (2005). We found that elderly participants showed less change in the discrimination threshold as the size of a high-contrast drifting Gabor was increased, indicating reduced surround suppression compared to young adults. Our results indicate that aging does not always lead to a decrease of neuronal inhibition in the visual system. These distinct effects of aging on inhibitory functions might be one reason that elderly people often exhibit deficits of motion perception in real-world situations.
Affiliation(s)
- Hu Deng, State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Weiying Chen, State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Shenbing Kuang, State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Tao Zhang, State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
22
Abstract
Recent research in Alzheimer's disease (AD) indicates that perceptual impairments may occur before the onset of cognitive decline and can thus serve as an early noninvasive indicator for AD. In this study, we focused on visual motion processing and explored whether AD induces changes in the properties of direction repulsion between two competing motions. We used random dot kinematograms (RDKs) and measured the magnitude of direction repulsion between two overlapping RDKs moving in different directions in three groups of participants: an AD group, an age-matched old control group, and a young control group. We showed that motion direction repulsion was significantly weaker in AD patients than in both healthy control groups. More importantly, we found that the magnitude of motion repulsion was predictive of the assessment of clinical severity in the AD group. Our results indicate that AD pathology is associated with altered neural function in visual cortical areas and that the motion repulsion deficit is a candidate behavioral biomarker for tracking AD development.
23
Fiore A, Pazzaglia M. Commentary: Cortical Plasticity and Olfactory Function in Early Blindness. Front Hum Neurosci 2017; 10:689. PMID: 28119592. PMCID: PMC5220096. DOI: 10.3389/fnhum.2016.00689.
Affiliation(s)
- Alessandra Fiore, Department of Psychology, University of Rome “La Sapienza”, Rome, Italy
- Mariella Pazzaglia, Department of Psychology, University of Rome “La Sapienza”, Rome, Italy; IRCCS Santa Lucia Foundation, Rome, Italy
- Correspondence: Mariella Pazzaglia
24
Kawamura S, Melin AD. Evolution of Genes for Color Vision and the Chemical Senses in Primates. In: Evolution of the Human Genome I. 2017. DOI: 10.1007/978-4-431-56603-8_10.
25
26
Goeke CM, Planera S, Finger H, König P. Bayesian Alternation during Tactile Augmentation. Front Behav Neurosci 2016; 10:187. PMID: 27774057. PMCID: PMC5054009. DOI: 10.3389/fnbeh.2016.00187.
Abstract
A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports on cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study, we built a tactile augmentation device and compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information about whole-body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the greater angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just noticeable difference (JND). We then compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (χ²red = 1.67) than the Bayesian integration model (χ²red = 4.34). A non-Bayesian winner-takes-all (WTA) model, which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy (χ²red = 1.64). However, the performance of the Bayesian alternation model could be substantially improved (χ²red = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models.
These results suggest that information from augmented and existing sensory modalities in untrained humans is combined via a subjective Bayesian alternation process. We therefore conclude that behavior in our bimodal condition is explained better by top-down subjective weighting than by bottom-up weighting based upon objective cue reliability.
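The model space this abstract compares can be illustrated with the standard JND predictions: maximum-likelihood integration predicts a bimodal JND below both unimodal JNDs, whereas an alternation scheme (consulting only one cue per trial) predicts a JND between them. A sketch of one common formalization, with made-up unimodal JNDs and weight rather than the study's data:

```python
import numpy as np

def integration_jnd(jnd_a, jnd_b):
    # Maximum-likelihood (Bayesian) integration: the bimodal variance
    # is the harmonic combination of the unimodal variances.
    va, vb = jnd_a**2, jnd_b**2
    return np.sqrt(va * vb / (va + vb))

def alternation_jnd(jnd_a, jnd_b, w_a):
    # Alternation: each trial consults only one cue (cue a with
    # probability w_a), so the predicted variance is a weighted mixture
    # of the unimodal variances (assuming both cues are unbiased).
    return np.sqrt(w_a * jnd_a**2 + (1 - w_a) * jnd_b**2)

# Illustrative unimodal JNDs (native vestibular vs. augmented tactile)
jnd_native, jnd_aug = 4.0, 6.0
jnd_int = integration_jnd(jnd_native, jnd_aug)       # below both unimodal JNDs
jnd_alt = alternation_jnd(jnd_native, jnd_aug, 0.7)  # between the two
```

The diagnostic contrast is that integration yields a precision benefit over the best single cue while alternation never does, which is what the model comparison above exploits.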
Affiliation(s)
- Caspar M. Goeke, Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Serena Planera, Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Holger Finger, Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Peter König, Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany; Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
27
Hidaka S, Teramoto W, Sugita Y. Spatiotemporal Processing in Crossmodal Interactions for Perception of the External World: A Review. Front Integr Neurosci 2015; 9:62. PMID: 26733827. PMCID: PMC4686600. DOI: 10.3389/fnint.2015.00062.
Abstract
Research regarding crossmodal interactions has garnered much interest in the last few decades. A variety of studies have demonstrated that multisensory information (vision, audition, tactile sensation, and so on) can perceptually interact in the spatial and temporal domains. Findings regarding crossmodal interactions in the spatiotemporal domain (i.e., motion processing) have also been reported, with updates in the last few years. In this review, we summarize past and recent findings on spatiotemporal processing in crossmodal interactions regarding perception of the external world. A traditional view holds that vision is superior to audition in spatial processing, while audition is dominant over vision in temporal processing; similarly, vision is considered to dominate the other sensory modalities (i.e., visual capture) in spatiotemporal processing. However, recent findings demonstrate that sound can have a driving effect on visual motion perception. Moreover, studies of perceptual associative learning report that, after an association is established between a sound sequence without spatial information and visual motion information, the sound sequence alone can trigger visual motion perception. Other sensory information, such as motor action or smell, has exhibited similar driving effects on visual motion perception. Additionally, recent brain imaging studies demonstrate that similar activation patterns can be observed in several brain areas, including motion processing areas, for spatiotemporal information from different sensory modalities. Based on these findings, we suggest that multimodal information mutually interacts in spatiotemporal processing of the external world and that common underlying perceptual and neural mechanisms may exist for spatiotemporal processing.
Affiliation(s)
- Souta Hidaka, Department of Psychology, Rikkyo University, Saitama, Japan
- Wataru Teramoto, Department of Psychology, Kumamoto University, Kumamoto, Japan
- Yoichi Sugita, Department of Psychology, Waseda University, Tokyo, Japan
28
Multisensory Perception: Pinpointing Visual Enhancement by Appropriate Odors. Curr Biol 2015; 25:R196-8. DOI: 10.1016/j.cub.2015.01.021.