1. Kim HW, Park M, Lee YS, Kim CY. Prior conscious experience modulates the impact of audiovisual temporal correspondence on unconscious visual processing. Conscious Cogn 2024;122:103709. PMID: 38781813. DOI: 10.1016/j.concog.2024.103709.
Abstract
Conscious visual experiences are enriched by concurrent auditory information, implying audiovisual interactions. In the present study, we investigated how prior conscious experience of auditory and visual information influences subsequent audiovisual temporal integration beneath the surface of awareness. We used continuous flash suppression (CFS) to render perceptually invisible a ball-shaped object constantly moving and bouncing inside a square frame window. To examine whether audiovisual temporal correspondence helps the ball stimulus enter awareness, the visual motion was accompanied by click sounds temporally congruent or incongruent with the bounces of the ball. In Experiment 1, where no prior experience of the audiovisual events was given, we found no significant impact of audiovisual correspondence on visual detection time. However, when the temporally congruent or incongruent bounce-sound relations were consciously experienced prior to CFS in Experiment 2, congruent sounds yielded faster detection times than incongruent sounds during CFS. In addition, in Experiment 3, explicit processing of the incongruent bounce-sound relation prior to CFS slowed detection when the ball bounces later became congruent with the sounds during CFS. These findings suggest that audiovisual temporal integration may take place outside of visual awareness, though its potency is modulated by previous conscious experience of the audiovisual events. The results are discussed in light of the framework of multisensory causal inference.
Affiliation(s)
- Hyun-Woong Kim
- School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, United States; Department of Psychology, The University of Texas at Dallas, Richardson, United States
- Minsun Park
- School of Psychology, Korea University, Seoul, Republic of Korea
- Yune Sang Lee
- School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, United States; Department of Speech, Language, and Hearing, The University of Texas at Dallas, Richardson, United States
- Chai-Youn Kim
- School of Psychology, Korea University, Seoul, Republic of Korea
2. Ono M, Hirose N, Mori S. Tactile information affects alternating visual percepts during binocular rivalry using naturalistic objects. Cogn Res Princ Implic 2022;7:40. PMID: 35543826. PMCID: PMC9095789. DOI: 10.1186/s41235-022-00390-w.
Abstract
INTRODUCTION Past studies have provided evidence that the effects of tactile stimulation on binocular rivalry are mediated by primitive features (orientation and spatial frequency) common to vision and touch. In this study, we examined whether such effects on binocular rivalry can be obtained through the roughness of naturalistic objects. In three experiments, the total dominance time of visual percepts of two objects was measured under binocular rivalry while participants touched one of the objects. RESULTS In Experiment 1, the total dominance time for the images of artificial turf and bathmat was prolonged by congruent tactile stimulation and shortened by incongruent tactile stimulation. In Experiment 2, we used the same stimuli but rotated their visual images in opposite directions. The dominance time for either image was prolonged by congruent tactile stimulation. In Experiment 3, we used different types of stimuli, smooth marble and rough fabric, and noted significant effects of congruent and incongruent tactile stimulation on the dominance time of visual percepts. CONCLUSION These three experiments demonstrated that visuo-tactile interaction in binocular rivalry can be mediated by roughness.
Affiliation(s)
- Mikoto Ono
- Department of Informatics, Graduate School of Information Science and Electrical Engineering, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka City, Fukuoka 819-0395, Japan
- Nobuyuki Hirose
- Department of Informatics, Graduate School of Information Science and Electrical Engineering, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka City, Fukuoka 819-0395, Japan
- Shuji Mori
- Department of Informatics, Graduate School of Information Science and Electrical Engineering, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka City, Fukuoka 819-0395, Japan
3. Delong P, Noppeney U. Semantic and spatial congruency mould audiovisual integration depending on perceptual awareness. Sci Rep 2021;11:10832. PMID: 34035358. PMCID: PMC8149651. DOI: 10.1038/s41598-021-90183-w.
Abstract
Information integration is considered a hallmark of human consciousness. Recent research has challenged this tenet by showing multisensory interactions in the absence of awareness. This psychophysics study assessed the impact of spatial and semantic correspondences on audiovisual binding in the presence and absence of visual awareness by combining forward-backward masking with spatial ventriloquism. Observers were presented with object pictures and synchronous sounds that were spatially and/or semantically congruent or incongruent. On each trial, observers located the sound, identified the picture, and rated the picture's visibility. We observed a robust ventriloquist effect for subjectively visible and invisible pictures, indicating that pictures that evade our perceptual awareness influence where we perceive sounds. Critically, semantic congruency enhanced these visual biases on perceived sound location only when the picture entered observers' awareness. Our results demonstrate that crossmodal influences operating from vision to audition and vice versa are interactively controlled by spatial and semantic congruency in the presence of awareness. However, when visual processing is disrupted by masking procedures, audiovisual interactions no longer depend on semantic correspondences.
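The spatial ventriloquist effect discussed here is commonly modelled as reliability-weighted (inverse-variance) fusion of the auditory and visual location estimates. The sketch below illustrates only that standard textbook account, not this paper's analysis, and all numbers are illustrative:

```python
def fused_location(s_aud, sigma_aud, s_vis, sigma_vis):
    """Reliability-weighted (inverse-variance) audiovisual fusion.

    Each cue's weight is proportional to 1/sigma^2, so the less noisy
    modality dominates the combined location estimate.
    """
    w_aud = 1.0 / sigma_aud ** 2
    w_vis = 1.0 / sigma_vis ** 2
    return (w_aud * s_aud + w_vis * s_vis) / (w_aud + w_vis)

# Vision is usually the more precise spatial cue, so a sound at 0 deg
# paired with a flash at 10 deg is "pulled" toward the flash:
biased = fused_location(s_aud=0.0, sigma_aud=8.0, s_vis=10.0, sigma_vis=2.0)
```

With these illustrative noise levels the fused estimate lands close to the visual location, which is the sense in which an (even unseen) picture can bias perceived sound position.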
Affiliation(s)
- Patrycja Delong
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Uta Noppeney
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
4. Motyka P, Akbal M, Litwin P. Forward optic flow is prioritised in visual awareness independently of walking direction. PLoS One 2021;16:e0250905. PMID: 33945563. PMCID: PMC8096117. DOI: 10.1371/journal.pone.0250905.
Abstract
When two different images are presented separately to the two eyes, one experiences smooth transitions between them, a phenomenon called binocular rivalry. Previous studies have shown that exposure to signals from other senses can enhance the access of stimulation-congruent images to conscious perception. However, despite our ability to infer perceptual consequences from bodily movements, evidence that action can have an analogous influence on visual awareness is scarce and mainly limited to hand movements. Here, we investigated whether one's direction of locomotion affects perceptual access to optic flow patterns during binocular rivalry. Participants walked forwards and backwards on a treadmill while viewing highly realistic visualisations of self-motion in a virtual environment. We hypothesised that visualisations congruent with walking direction would predominate in visual awareness over incongruent ones, and that this effect would increase with the precision of one's active proprioception. These predictions were not confirmed: optic flow consistent with forward locomotion was prioritised in visual awareness independently of walking direction and proprioceptive abilities. Our findings suggest a limited role of kinaesthetic-proprioceptive information in disambiguating the visually perceived direction of self-motion and indicate that vision might be tuned to the (expanding) optic flow patterns prevalent in everyday life.
Affiliation(s)
- Paweł Motyka
- Faculty of Psychology, University of Warsaw, Warsaw, Poland
- Mert Akbal
- Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Academy of Fine Arts Saar, Saarbrücken, Germany
- Piotr Litwin
- Faculty of Psychology, University of Warsaw, Warsaw, Poland
5. Huygelier H, van Ee R, Lanssens A, Wagemans J, Gillebert CR. Audiovisual looming signals are not always prioritised: evidence from exogenous, endogenous and sustained attention. J Cogn Psychol 2021. DOI: 10.1080/20445911.2021.1896528.
Affiliation(s)
- Hanne Huygelier
- Department of Brain and Cognition, KU Leuven, Leuven, Belgium
- Raymond van Ee
- Department of Brain and Cognition, KU Leuven, Leuven, Belgium
- Donders Institute for Brain, Cognition and Behavior, Radboud University, Nijmegen, Netherlands
- Armien Lanssens
- Department of Brain and Cognition, KU Leuven, Leuven, Belgium
- Johan Wagemans
- Department of Brain and Cognition, KU Leuven, Leuven, Belgium
6. Judging Relative Onsets and Offsets of Audiovisual Events. Vision (Basel) 2020;4:17. PMID: 32138261. PMCID: PMC7157228. DOI: 10.3390/vision4010017.
Abstract
This study assesses the fidelity with which people can make temporal order judgments (TOJ) between auditory and visual onsets and offsets. Using an adaptive staircase task administered to a large sample of young adults, we find that the ability to judge temporal order varies widely among people, with notable difficulty created when auditory events closely follow visual events. Those findings are interpretable within the context of an independent channels model. Visual onsets and offsets can be difficult to localize in time when they occur within the temporal neighborhood of sound onsets or offsets.
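An adaptive staircase of the kind described converges on the stimulus-onset asynchrony (SOA) at which temporal order judgments reach a criterion accuracy. The following is a generic 2-down/1-up simulation with made-up parameters (threshold, step size, observer noise), not the task actually administered in the study:

```python
import random

def run_staircase(true_threshold=60.0, start_soa=200.0, step=20.0,
                  n_trials=300, seed=7):
    """2-down/1-up staircase over audiovisual SOA (ms).

    A simulated observer responds correctly whenever the presented SOA
    plus Gaussian response noise exceeds its true threshold. Two
    consecutive correct responses make the task harder (smaller SOA);
    each error makes it easier, so the track oscillates near threshold.
    """
    rng = random.Random(seed)
    soa, streak, track = start_soa, 0, []
    for _ in range(n_trials):
        correct = soa + rng.gauss(0.0, 15.0) > true_threshold
        if correct:
            streak += 1
            if streak == 2:          # two correct in a row -> harder
                soa = max(soa - step, 1.0)
                streak = 0
        else:                        # one error -> easier
            soa = soa + step
            streak = 0
        track.append(soa)
    # Crude threshold estimate: mean SOA over the final trials
    return sum(track[-50:]) / 50.0
```

Because a 2-down/1-up rule targets the ~71%-correct point, the track settles slightly above the simulated observer's 50% threshold; real studies typically average SOA at reversals rather than over trials.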
7. Kim S, Kim J. Effects of Multimodal Association on Ambiguous Perception in Binocular Rivalry. Perception 2019;48:796-819. DOI: 10.1177/0301006619867023.
Abstract
When the two eyes view dissimilar images, an observer typically reports ambiguous perception called binocular rivalry, in which subjective perception fluctuates between the two inputs. This perceptual instability often comprises exclusive dominance of each image and a transition called the piecemeal state, in which the two images are intermingled in a patchwork manner. Herein, we investigated the effects of multimodal association for a sensory-congruent pair, an arbitrary pair, and a reverse pair on the piecemeal state, in order to see how each level of association affects ambiguous perception during binocular rivalry. To induce the multisensory associations, we designed a matching task with audiovisual feedback in which subjects were required to respond according to given pairing rules. We found that explicit audiovisual associations can substantially affect the piecemeal state during binocular rivalry, and that this congruency effect, which reduces the amount of visual ambiguity, originates primarily from explicit audiovisual association training rather than from common sensory features. Furthermore, when one stimulus is associated with multiple pieces of information, recent and preexisting associations work collectively to influence perceptual ambiguity during rivalry. Our findings show that learned multimodal association directly affects the temporal dynamics of ambiguous perception during binocular rivalry by modulating not only the exclusive dominance but also the piecemeal state in a systematic manner.
Affiliation(s)
- Sungyong Kim
- Graduate School of Culture Technology, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea
- Jeounghoon Kim
- Graduate School of Culture Technology, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea; School of Humanities and Social Sciences, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea
8. Delong P, Aller M, Giani AS, Rohe T, Conrad V, Watanabe M, Noppeney U. Invisible Flashes Alter Perceived Sound Location. Sci Rep 2018;8:12376. PMID: 30120294. PMCID: PMC6098122. DOI: 10.1038/s41598-018-30773-3.
Abstract
Information integration across the senses is fundamental for effective interactions with our environment. The extent to which signals from different senses can interact in the absence of awareness is controversial. Combining the spatial ventriloquist illusion and dynamic continuous flash suppression (dCFS), we investigated in two experiments whether visual signals that observers do not consciously perceive can influence spatial perception of sounds. Importantly, dCFS obliterated visual awareness on only a fraction of trials, allowing us to compare spatial ventriloquism for physically identical flashes that were judged as visible or invisible. Our results show a stronger ventriloquist effect for visible than invisible flashes. Critically, a robust ventriloquist effect emerged also for invisible flashes, even when participants were at chance when locating the flash. Collectively, our findings demonstrate that signals that we are not aware of in one sensory modality can alter spatial perception of signals in another sensory modality.
Affiliation(s)
- Patrycja Delong
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, B15 2TT, Birmingham, UK
- Máté Aller
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, B15 2TT, Birmingham, UK
- Anette S Giani
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
- Tim Rohe
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
- Verena Conrad
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
- Masataka Watanabe
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
- Uta Noppeney
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, B15 2TT, Birmingham, UK
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
9. Deroy O, Faivre N, Lunghi C, Spence C, Aller M, Noppeney U. The Complex Interplay Between Multisensory Integration and Perceptual Awareness. Multisens Res 2018;29:585-606. PMID: 27795942. DOI: 10.1163/22134808-00002529.
Abstract
The integration of information has been considered a hallmark of human consciousness, as it requires information being globally available via widespread neural interactions. Yet the complex interdependencies between multisensory integration and perceptual awareness, or consciousness, remain to be defined. While perceptual awareness has traditionally been studied in a single sense, in recent years we have witnessed a surge of interest in the role of multisensory integration in perceptual awareness. Based on a recent IMRF symposium on multisensory awareness, this review discusses three key questions from conceptual, methodological and experimental perspectives: (1) What do we study when we study multisensory awareness? (2) What is the relationship between multisensory integration and perceptual awareness? (3) Which experimental approaches are most promising to characterize multisensory awareness? We hope that this review paper will provoke lively discussions, novel experiments, and conceptual considerations to advance our understanding of the multifaceted interplay between multisensory integration and consciousness.
Affiliation(s)
- O Deroy
- Centre for the Study of the Senses, Institute of Philosophy, School of Advanced Study, University of London, London, UK
- N Faivre
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- C Lunghi
- Department of Translational Research on New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- C Spence
- Crossmodal Research Laboratory, Department of Experimental Psychology, Oxford University, Oxford, UK
- M Aller
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
- U Noppeney
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
10. Dittrich S, Noesselt T. Temporal Audiovisual Motion Prediction in 2D- vs. 3D-Environments. Front Psychol 2018;9:368. PMID: 29618999. PMCID: PMC5871701. DOI: 10.3389/fpsyg.2018.00368.
Abstract
Predicting motion is essential for many everyday activities, e.g., in road traffic. Previous studies on motion prediction failed to find consistent results, which might be due to the use of very different stimulus material and behavioural tasks. Here, we directly tested the influence of task (detection, extrapolation) and stimulus features (visual vs. audiovisual and three-dimensional vs. non-three-dimensional) on temporal motion prediction in two psychophysical experiments. In both experiments a ball followed a trajectory toward the observer and temporarily disappeared behind an occluder. In audiovisual conditions a moving white-noise sound (congruent or incongruent with the visual motion direction) was presented concurrently. In Experiment 1 the ball reappeared on a predictable or a non-predictable trajectory and participants detected when the ball reappeared. In Experiment 2 the ball did not reappear after occlusion and participants judged when it would reach a specified position at two possible distances from the occluder (extrapolation task). Both experiments were conducted in three-dimensional space (using a stereoscopic screen and polarised glasses) and also without stereoscopic presentation. Participants benefitted from visually predictable trajectories and concurrent sounds during detection. Additionally, visual facilitation was more pronounced for non-3D stimulation during the detection task. In contrast, for the more complex extrapolation task, group mean results indicated that auditory information impaired motion prediction. However, a post hoc cross-validation procedure (split-half) revealed that participants varied in their ability to use sounds during motion extrapolation. Most participants selectively profited from either near or far extrapolation distances but were impaired for the other. We propose that interindividual differences in extrapolation efficiency might be the mechanism governing this effect. Together, our results indicate that both a realistic experimental environment and subject-specific differences modulate audiovisual motion prediction and need to be considered in future research.
Affiliation(s)
- Sandra Dittrich
- Department of Biological Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Tömme Noesselt
- Department of Biological Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany
11. Piazza EA, Denison RN, Silver MA. Recent cross-modal statistical learning influences visual perceptual selection. J Vis 2018;18(3):1. PMID: 29497742. PMCID: PMC5837665. DOI: 10.1167/18.3.1.
Abstract
Incoming sensory signals are often ambiguous and consistent with multiple perceptual interpretations. Information from one sensory modality can help to resolve ambiguity in another modality, but the mechanisms by which multisensory associations come to influence the contents of conscious perception are unclear. We asked whether and how novel statistical information about the coupling between sounds and images influences the early stages of awareness of visual stimuli. We exposed subjects to consistent, arbitrary pairings of sounds and images and then measured the impact of this recent passive statistical learning on subjects' initial conscious perception of a stimulus by employing binocular rivalry, a phenomenon in which incompatible images presented separately to the two eyes result in a perceptual alternation between the two images. On each trial of the rivalry test, subjects were presented with a pair of rivalrous images (one of which had been consistently paired with a specific sound during exposure while the other had not) and an accompanying sound. We found that, at the onset of binocular rivalry, an image was significantly more likely to be perceived, and was perceived for a longer duration, when it was presented with its paired sound than when presented with other sounds. Our results indicate that recently acquired multisensory information helps resolve sensory ambiguity, and they demonstrate that statistical learning is a fast, flexible mechanism that facilitates this process.
Affiliation(s)
- Elise A Piazza
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA; Vision Science Graduate Group, University of California, Berkeley, Berkeley, CA, USA; Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA, USA
- Rachel N Denison
- Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA, USA; Department of Psychology and Center for Neural Science, New York University, New York, NY, USA
- Michael A Silver
- Vision Science Graduate Group, University of California, Berkeley, Berkeley, CA, USA; Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA, USA; School of Optometry, University of California, Berkeley, Berkeley, CA, USA
12. Does direction of walking impact binocular rivalry between competing patterns of optic flow? Atten Percept Psychophys 2017;79:1182-1194. PMID: 28197836. DOI: 10.3758/s13414-017-1299-4.
Abstract
When dissimilar monocular images are viewed simultaneously by the two eyes, stable binocular vision gives way to unstable vision characterized by alternations in dominance between the two images in a phenomenon called binocular rivalry. These alternations in perception reveal the existence of inhibitory interactions between neural representations associated with conflicting visual inputs. Binocular rivalry has been studied since the days of Wheatstone, but one recent strategy is to investigate its susceptibility to influences caused by one's own motor activity. This paper focused on the activity of walking, which produces an expected, characteristic direction of optic flow dependent upon the direction of one's walking. In a set of experiments, we employed virtual reality technology to present dichoptic stimuli to observers who walked forward, backward, or were sitting. Optic flow was presented to a given eye, and was sometimes congruent with the direction of walking, sometimes incongruent, and sometimes random, except when the participant was sitting. Our results indicate that, while walking had a reliable influence on rivalry dynamics, the predominance of congruent or incongruent motion did not.
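Predominance in rivalry studies like this one is typically quantified as each percept's share of total dominance time. A minimal bookkeeping sketch, assuming a hypothetical report format of (timestamp, percept) key events rather than the authors' actual data:

```python
def predominance(reports, trial_end):
    """Fraction of viewing time each percept dominated.

    `reports` is a chronological list of (timestamp_s, percept_label)
    key events; each percept is assumed to dominate until the next
    report (or until `trial_end` for the final report).
    """
    totals = {}
    for (t, label), (t_next, _) in zip(reports, reports[1:] + [(trial_end, None)]):
        totals[label] = totals.get(label, 0.0) + (t_next - t)
    duration = trial_end - reports[0][0]  # time from first report to trial end
    return {label: time / duration for label, time in totals.items()}

shares = predominance(
    [(0.0, "congruent"), (3.0, "incongruent"), (4.5, "congruent")],
    trial_end=6.0,
)
# congruent dominates (3.0 - 0.0) + (6.0 - 4.5) = 4.5 s of the 6 s trial
```

Comparing such shares across congruent and incongruent conditions is the basic dependent measure behind statements like "the predominance of congruent or incongruent motion did not differ."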
13.
Abstract
Multisensory integration can play a critical role in producing unified and reliable perceptual experience. When sensory information in one modality is degraded or ambiguous, information from other senses can crossmodally resolve perceptual ambiguities. Prior research suggests that auditory information can disambiguate the contents of visual awareness by facilitating perception of intermodally consistent stimuli. However, it is unclear whether these effects are truly due to crossmodal facilitation or are mediated by voluntary selective attention to audiovisually congruent stimuli. Here, we demonstrate that sounds can bias competition in binocular rivalry toward audiovisually congruent percepts, even when participants have no recognition of the congruency. When speech sounds were presented in synchrony with speech-like deformations of rivalling ellipses, ellipses with crossmodally congruent deformations were perceptually dominant over those with incongruent deformations. This effect was observed in participants who could not identify the crossmodal congruency in an open-ended interview (Experiment 1) or detect it in a simple 2AFC task (Experiment 2), suggesting that the effect was not due to voluntary selective attention or response bias. These results suggest that sound can automatically disambiguate the contents of visual awareness by facilitating perception of audiovisually congruent stimuli.
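Whether 2AFC congruency-detection accuracy exceeds chance, as tested in Experiment 2 above, is commonly assessed with an exact binomial test. A generic sketch of that test with illustrative trial counts, not the authors' analysis:

```python
from math import comb

def binomial_p_at_or_above(k, n, p=0.5):
    """One-sided exact binomial test: P(X >= k) for X ~ Binomial(n, p).

    Used to ask whether k correct out of n 2AFC trials reliably
    exceeds the chance rate p.
    """
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical example: 34 correct out of 60 trials is not reliably
# above the 50% chance level (p well above .05)
p_val = binomial_p_at_or_above(34, 60)
```

A non-significant result on such a test is what licenses the claim that participants could not detect the crossmodal congruency, even though it still biased their rivalry dominance.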
14. Newly acquired audio-visual associations bias perception in binocular rivalry. Vision Res 2017;133:121-129. DOI: 10.1016/j.visres.2017.02.001.
15.
Abstract
It is known that, after a prolonged period of visual deprivation, the adult visual cortex can be recruited for nonvisual processing, reflecting cross-modal plasticity. Here, we investigated whether cross-modal plasticity can occur at short timescales in the typical adult brain by comparing the interaction between vision and touch during binocular rivalry before and after a brief period of monocular deprivation, which strongly alters ocular balance favoring the deprived eye. While viewing dichoptically two gratings of orthogonal orientation, participants were asked to actively explore a haptic grating congruent in orientation to one of the two rivalrous stimuli. We repeated this procedure before and after 150 min of monocular deprivation. We first confirmed that haptic stimulation interacted with vision during rivalry promoting dominance of the congruent visuo-haptic stimulus and that monocular deprivation increased the deprived eye and decreased the nondeprived eye dominance. Interestingly, after deprivation, we found that the effect of touch did not change for the nondeprived eye, whereas it disappeared for the deprived eye, which was potentiated after deprivation. The absence of visuo-haptic interaction for the deprived eye lasted for over 1 hr and was not attributable to a masking induced by the stronger response of the deprived eye as confirmed by a control experiment. Taken together, our results demonstrate that the adult human visual cortex retains a high degree of cross-modal plasticity, which can occur even at very short timescales.
Affiliation(s)
- Luca Lo Verde
- University of Florence; Institute of Neuroscience, Consiglio Nazionale delle Ricerche, Pisa
- Claudia Lunghi
- Institute of Neuroscience, Consiglio Nazionale delle Ricerche, Pisa; University of Pisa
16. DeLucia PR, Preddy D, Oberfeld D. Audiovisual Integration of Time-to-Contact Information for Approaching Objects. Multisens Res 2016;29:365-395. DOI: 10.1163/22134808-00002520.
Abstract
Previous studies of time-to-collision (TTC) judgments of approaching objects focused on effectiveness of visual TTC information in the optical expansion pattern (e.g., visual tau, disparity). Fewer studies examined effectiveness of auditory TTC information in the pattern of increasing intensity (auditory tau), or measured integration of auditory and visual TTC information. Here, participants judged TTC of an approaching object presented in the visual or auditory modality, or both concurrently. TTC information provided by the modalities was jittered slightly against each other, so that auditory and visual TTC were not perfectly correlated. A psychophysical reverse correlation approach was used to estimate the influence of auditory and visual cues on TTC estimates. TTC estimates were shorter in the auditory than the visual condition. On average, TTC judgments in the audiovisual condition were not significantly different from judgments in the visual condition. However, multiple regression analyses showed that TTC estimates were based on both auditory and visual information. Although heuristic cues (final sound pressure level, final optical size) and more reliable information (relative rate of change in acoustic intensity, optical expansion) contributed to auditory and visual judgments, the effect of heuristics was greater in the auditory condition. Although auditory and visual information influenced judgments, concurrent presentation of both did not result in lower response variability compared to presentation of either one alone; there was no multimodal advantage. The relative weightings of heuristics and more reliable information differed between auditory and visual TTC judgments, and when both were available, visual information was weighted more heavily.
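Visual tau approximates TTC as the optical angle divided by its rate of expansion, and the acoustic analogue follows from intensity growing with the inverse square of distance (giving TTC ≈ 2I / (dI/dt)). A worked sketch of these textbook quantities, with illustrative numbers rather than the study's stimuli:

```python
def visual_tau(theta, d_theta_dt):
    """TTC estimate from optics: optical angle / its rate of expansion."""
    return theta / d_theta_dt

def auditory_tau(intensity, d_intensity_dt):
    """TTC estimate from acoustics: for a point source, intensity grows
    as 1/distance^2, so TTC ~ 2 * I / (dI/dt)."""
    return 2.0 * intensity / d_intensity_dt

# Object 0.5 m wide, 10 m away, approaching at 2 m/s (true TTC = 5 s):
theta = 0.5 / 10.0                 # optical angle ~ size / distance (small-angle)
theta_dot = 0.5 * 2.0 / 10.0 ** 2  # d(theta)/dt ~ size * speed / distance^2
ttc_vis = visual_tau(theta, theta_dot)

intensity = 1.0 / 10.0 ** 2        # intensity ~ 1 / distance^2 (unit source)
di_dt = 2.0 * 2.0 / 10.0 ** 3      # dI/dt = 2 * speed / distance^3
ttc_aud = auditory_tau(intensity, di_dt)
```

Both estimates recover the true 5 s TTC from purely relative quantities, which is why observers need neither absolute distance nor speed; the heuristic cues mentioned in the abstract (final sound level, final optical size) are what observers fall back on when they do not exploit these ratios.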
Affiliation(s)
- Patricia R. DeLucia
- Department of Psychological Sciences, MS 2051, Texas Tech University, Lubbock, TX 79409-2051, USA
- Doug Preddy
- Department of Psychological Sciences, MS 2051, Texas Tech University, Lubbock, TX 79409-2051, USA
- Daniel Oberfeld
- Department of Psychology, Johannes Gutenberg-Universität, 55099 Mainz, Germany
17
Moors P, Huygelier H, Wagemans J, de-Wit L, van Ee R. Suppressed visual looming stimuli are not integrated with auditory looming signals: Evidence from continuous flash suppression. Iperception 2015; 6:48-62. [PMID: 26034573] [PMCID: PMC4441023] [DOI: 10.1068/i0678]
Abstract
Previous studies using binocular rivalry have shown that signals in a modality other than the visual can bias dominance durations depending on their congruency with the rivaling stimuli. More recently, studies using continuous flash suppression (CFS) have reported that multisensory integration influences how long visual stimuli remain suppressed. In this study, using CFS, we examined whether the contrast thresholds for detecting visual looming stimuli are influenced by a congruent auditory stimulus. In Experiment 1, we show that a looming visual stimulus can result in lower detection thresholds compared to a static concentric grating, but that auditory tone pips congruent with the looming stimulus did not lower suppression thresholds any further. In Experiments 2, 3, and 4, we again observed no advantage for congruent multisensory stimuli. These results add to our understanding of the conditions under which multisensory integration is possible, and suggest that certain forms of multisensory integration are not evident when the visual stimulus is suppressed from awareness using CFS.
Affiliation(s)
- Pieter Moors
- Laboratory of Experimental Psychology, University of Leuven (KU Leuven), Leuven, Belgium
- Hanne Huygelier
- Laboratory of Experimental Psychology, University of Leuven (KU Leuven), Leuven, Belgium
- Johan Wagemans
- Laboratory of Experimental Psychology, University of Leuven (KU Leuven), Leuven, Belgium
- Lee de-Wit
- Laboratory of Experimental Psychology, University of Leuven (KU Leuven), Leuven, Belgium
- Raymond van Ee
- Laboratory of Experimental Psychology, University of Leuven (KU Leuven), Leuven, Belgium; Department of Biophysics, Donders Institute, Radboud University, Nijmegen, The Netherlands; Department of Brain, Body, & Behavior, Philips Research Laboratories, Eindhoven, The Netherlands
18
Aller M, Giani A, Conrad V, Watanabe M, Noppeney U. A spatially collocated sound thrusts a flash into awareness. Front Integr Neurosci 2015; 9:16. [PMID: 25774126] [PMCID: PMC4343005] [DOI: 10.3389/fnint.2015.00016]
Abstract
To interact effectively with the environment the brain integrates signals from multiple senses. It is currently unclear to what extent spatial information can be integrated across different senses in the absence of awareness. Combining dynamic continuous flash suppression (CFS) and spatial audiovisual stimulation, the current study investigated whether a sound facilitates a concurrent visual flash to elude flash suppression and enter perceptual awareness depending on audiovisual spatial congruency. Our results demonstrate that a concurrent sound boosts unaware visual signals into perceptual awareness. Critically, this process depended on the spatial congruency of the auditory and visual signals pointing towards low level mechanisms of audiovisual integration. Moreover, the concurrent sound biased the reported location of the flash as a function of flash visibility. The spatial bias of sounds on reported flash location was strongest for flashes that were judged invisible. Our results suggest that multisensory integration is a critical mechanism that enables signals to enter conscious perception.
Affiliation(s)
- Máté Aller
- Computational Cognitive Neuroimaging Laboratory, Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
- Anette Giani
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Verena Conrad
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Uta Noppeney
- Computational Cognitive Neuroimaging Laboratory, Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK; Max Planck Institute for Biological Cybernetics, Tübingen, Germany
19
Adam R, Noppeney U. A phonologically congruent sound boosts a visual target into perceptual awareness. Front Integr Neurosci 2014; 8:70. [PMID: 25309357] [PMCID: PMC4160974] [DOI: 10.3389/fnint.2014.00070]
Abstract
Capacity limitations of attentional resources allow only a fraction of sensory inputs to enter our awareness. Most prominently, in the attentional blink the observer often fails to detect the second of two rapidly successive targets that are presented in a sequence of distractor items. To investigate how auditory inputs enable a visual target to escape the attentional blink, this study presented the visual letter targets T1 and T2 together with phonologically congruent or incongruent spoken letter names. First, a congruent relative to an incongruent sound at T2 rendered visual T2 more visible. Second, this T2 congruency effect was amplified when the sound was congruent at T1 as indicated by a T1 congruency × T2 congruency interaction. Critically, these effects were observed both when the sounds were presented in synchrony with and prior to the visual target letters suggesting that the sounds may increase visual target identification via multiple mechanisms such as audiovisual priming or decisional interactions. Our results demonstrate that a sound around the time of T2 increases subjects' awareness of the visual target as a function of T1 and T2 congruency. Consistent with Bayesian causal inference, the brain may thus combine (1) prior congruency expectations based on T1 congruency and (2) phonological congruency cues provided by the audiovisual inputs at T2 to infer whether auditory and visual signals emanate from a common source and should hence be integrated for perceptual decisions.
Affiliation(s)
- Ruth Adam
- Cognitive Neuroimaging Group, Max Planck Institute for Biological Cybernetics, Tuebingen, Germany; Department of General Psychiatry, Center of Psychosocial Medicine, University of Heidelberg, Heidelberg, Germany; Institute for Stroke and Dementia Research, Ludwig-Maximilian-University, Munich, Germany
- Uta Noppeney
- Cognitive Neuroimaging Group, Max Planck Institute for Biological Cybernetics, Tuebingen, Germany; Department of Psychology, Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
20
Abstract
This psychophysics study investigated whether prior auditory conditioning influences how a sound interacts with visual perception. In the conditioning phase, subjects were presented with three pure tones (= conditioned stimuli, CS) that were paired with positive, negative or neutral unconditioned stimuli. As unconditioned reinforcers we employed pictures (highly pleasant, unpleasant and neutral) or monetary outcomes (+50 euro cents, −50 cents, 0 cents). In the subsequent visual selective attention paradigm, subjects were presented with near-threshold Gabors displayed in their left or right hemifield. Critically, the Gabors were presented in synchrony with one of the conditioned sounds. Subjects discriminated whether the Gabors were presented in their left or right hemifields. Participants determined the location more accurately when the Gabors were presented in synchrony with positive relative to neutral sounds irrespective of reinforcer type. Thus, previously rewarded relative to neutral sounds increased the bottom-up salience of the visual Gabors. Our results are the first demonstration that prior auditory conditioning is a potent mechanism to modulate the effect of sounds on visual perception.
21
Deroy O, Chen YC, Spence C. Multisensory constraints on awareness. Philos Trans R Soc Lond B Biol Sci 2014; 369:20130207. [PMID: 24639579] [DOI: 10.1098/rstb.2013.0207]
Abstract
Given that multiple senses are often stimulated at the same time, perceptual awareness is most likely to take place in multisensory situations. However, theories of awareness are based on studies and models established for a single sense (mostly vision). Here, we consider the methodological and theoretical challenges raised by taking a multisensory perspective on perceptual awareness. First, we consider how well tasks designed to study unisensory awareness perform when used in multisensory settings, stressing that studies using binocular rivalry, bistable figure perception, continuous flash suppression, the attentional blink, repetition blindness and backward masking can demonstrate multisensory influences on unisensory awareness, but fall short of tackling multisensory awareness directly. Studies interested in the latter phenomenon rely on a method of subjective contrast and can, at best, delineate conditions under which individuals report experiencing a multisensory object or two unisensory objects. As there is not a perfect match between these conditions and those in which multisensory integration and binding occur, the link between awareness and binding advocated for visual information processing needs to be revised for multisensory cases. These challenges point at the need to question the very idea of multisensory awareness.
Affiliation(s)
- Ophelia Deroy
- Centre for the Study of the Senses and Institute of Philosophy, School of Advanced Study, University of London, London, UK