51
Lewkowicz D. Development of Multisensory Temporal Perception. Front Neurosci 2011. [DOI: 10.1201/9781439812174-22]
52
Weisswange TH, Rothkopf CA, Rodemann T, Triesch J. Bayesian cue integration as a developmental outcome of reward mediated learning. PLoS One 2011; 6:e21575. [PMID: 21750717 PMCID: PMC3130032 DOI: 10.1371/journal.pone.0021575]
Abstract
Average human behavior in cue combination tasks is well predicted by Bayesian inference models. As this capability is acquired over developmental timescales, the question arises of how it is learned. Here we investigated whether reward-dependent learning, which is well established at the computational, behavioral, and neuronal levels, could contribute to this development. It is shown that a model-free reinforcement learning algorithm can indeed learn to do cue integration, i.e., weight uncertain cues according to their respective reliabilities, and can even do so when the reliabilities are changing. We also consider the case of causal inference, where multimodal signals can originate from one or from multiple separate objects and should not always be integrated. In this case, the learner is shown to develop a behavior that is closest to Bayesian model averaging. We conclude that reward-mediated learning could be a driving force for the development of cue integration and causal inference.
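The normative benchmark that the reinforcement learner is shown to approximate — weighting each cue by its reliability — can be sketched as follows. This is a minimal illustration of the inverse-variance rule for independent Gaussian cues, not the authors' reinforcement-learning model; the function name and example numbers are ours.

```python
import numpy as np

def integrate_cues(means, sigmas):
    """Bayesian-optimal combination of independent Gaussian cues:
    each cue is weighted by its reliability 1/sigma^2, and the
    combined estimate is more reliable than either cue alone."""
    means = np.asarray(means, dtype=float)
    reliabilities = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    weights = reliabilities / reliabilities.sum()
    combined_mean = float(np.dot(weights, means))
    combined_sigma = float(np.sqrt(1.0 / reliabilities.sum()))
    return combined_mean, combined_sigma

# A visual cue at 0.0 (sigma = 1.0) and an auditory cue at 2.0 (sigma = 2.0):
# the combined estimate is pulled toward the more reliable visual cue.
mu, sigma = integrate_cues([0.0, 2.0], [1.0, 2.0])  # mu = 0.4, sigma ~ 0.894
```

Changing the sigmas online reproduces the "changing reliabilities" condition: the weights track whichever cue is currently more reliable.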
53
Stein BE, Rowland BA. Organization and plasticity in multisensory integration: early and late experience affects its governing principles. Prog Brain Res 2011; 191:145-63. [PMID: 21741550 PMCID: PMC3245961 DOI: 10.1016/b978-0-444-53752-2.00007-2]
Abstract
Neurons in the midbrain superior colliculus (SC) have the ability to integrate information from different senses to profoundly increase their sensitivity to external events. This not only enhances an organism's ability to detect and localize these events but also to program appropriate motor responses to them. The survival value of this process of multisensory integration is self-evident, and its physiological and behavioral manifestations have been studied extensively in adult and developing cats and monkeys. These studies have revealed that, contrary to expectations based on some developmental theories, this process is not present in the newborn's brain. The data show that it is acquired only gradually during postnatal life as a consequence of at least two factors: the maturation of cooperative interactions between association cortex and the SC, and extensive experience with cross-modal cues. Using these factors, the brain is able to craft the underlying neural circuits and the fundamental principles that govern multisensory integration so that they are adapted to the ecological circumstances in which they will be used.
Affiliation(s)
- Barry E Stein
- Department of Neurobiology and Anatomy, Wake Forest School of Medicine, Winston-Salem, North Carolina, USA.
54
Gentile G, Petkova VI, Ehrsson HH. Integration of visual and tactile signals from the hand in the human brain: an fMRI study. J Neurophysiol 2010; 105:910-22. [PMID: 21148091 DOI: 10.1152/jn.00840.2010]
Abstract
In the non-human primate brain, a number of multisensory areas have been described where individual neurons respond to visual, tactile and bimodal visuotactile stimulation of the upper limb. It has been shown that such bimodal neurons can integrate sensory inputs in a linear or nonlinear fashion. In humans, activity in a similar set of brain regions has been associated with visuotactile stimulation of the hand. However, little is known about how these areas integrate visual and tactile information. In this functional magnetic resonance imaging experiment, we employed tactile, visual, and visuotactile stimulation of the right hand in an ecologically valid setup where participants were looking directly at their upper limb. We identified brain regions that were activated by both visual and tactile stimuli as well as areas exhibiting greater activity in the visuotactile condition than in both unisensory ones. The posterior and inferior parietal, dorsal, and ventral premotor cortices, as well as the cerebellum, all showed evidence of multisensory linear (additive) responses. Nonlinear, superadditive responses were observed in the cortex lining the left anterior intraparietal sulcus, the insula, dorsal premotor cortex, and, subcortically, the putamen. These results identify a set of candidate frontal, parietal and subcortical regions that integrate visual and tactile information for the multisensory perception of one's own hand.
Affiliation(s)
- Giovanni Gentile
- Brain, Body and Self Laboratory, Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden.
55
Stein BE, Burr D, Constantinidis C, Laurienti PJ, Alex Meredith M, Perrault TJ, Ramachandran R, Röder B, Rowland BA, Sathian K, Schroeder CE, Shams L, Stanford TR, Wallace MT, Yu L, Lewkowicz DJ. Semantic confusion regarding the development of multisensory integration: a practical solution. Eur J Neurosci 2010; 31:1713-20. [PMID: 20584174 PMCID: PMC3055172 DOI: 10.1111/j.1460-9568.2010.07206.x]
Abstract
There is now a good deal of data from neurophysiological studies in animals and behavioral studies in human infants regarding the development of multisensory processing capabilities. Although the conclusions drawn from these different datasets sometimes appear to conflict, many of the differences are due to the use of different terms to mean the same thing and, more problematic, the use of similar terms to mean different things. Semantic issues are pervasive in the field and complicate communication among groups using different methods to study similar issues. Achieving clarity of communication among different investigative groups is essential for each to make full use of the findings of others, and an important step in this direction is to identify areas of semantic confusion. In this way investigators can be encouraged to use terms whose meaning and underlying assumptions are unambiguous because they are commonly accepted. Although this issue is of obvious importance to the large and very rapidly growing number of researchers working on multisensory processes, it is perhaps even more important to the non-cognoscenti. Those who wish to benefit from the scholarship in this field but are unfamiliar with the issues identified here are most likely to be confused by semantic inconsistencies. The current discussion attempts to document some of the more problematic of these, begin a discussion about the nature of the confusion and suggest some possible solutions.
Affiliation(s)
- Barry E Stein
- Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, NC 27157-1010, USA.
56
Barutchu A, Danaher J, Crewther SG, Innes-Brown H, Shivdasani MN, Paolini AG. Audiovisual integration in noise by children and adults. J Exp Child Psychol 2010; 105:38-50. [DOI: 10.1016/j.jecp.2009.08.005]
57
Pagel B, Heed T, Röder B. Change of reference frame for tactile localization during child development. Dev Sci 2009; 12:929-37. [PMID: 19840048 DOI: 10.1111/j.1467-7687.2009.00845.x]
Abstract
Temporal order judgements (TOJs) for two tactile stimuli, one presented to the left and one to the right hand, are less precise when the hands are crossed over the midline than when the hands are uncrossed. This 'crossed hand' effect has been considered evidence for a remapping of tactile input into an external reference frame. Since late, but not early, blind individuals show such remapping, it has been hypothesized that the use of an external reference frame develops during childhood. Five- to 10-year-old children were therefore tested with the tactile TOJ task, both with uncrossed and crossed hands. Overall performance in the TOJ task improved with age. While children older than 5 1/2 years displayed a crossed hand effect, younger children did not. The use of an external reference frame for tactile, and possibly multisensory, localization therefore seems to be acquired at around 5 1/2 years of age.
Affiliation(s)
- Birthe Pagel
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
58
Stein BE, Stanford TR, Rowland BA. The neural basis of multisensory integration in the midbrain: its organization and maturation. Hear Res 2009; 258:4-15. [PMID: 19345256 PMCID: PMC2787841 DOI: 10.1016/j.heares.2009.03.012]
Abstract
Multisensory integration describes a process by which information from different sensory systems is combined to influence perception, decisions, and overt behavior. Despite a widespread appreciation of its utility in the adult, its developmental antecedents have received relatively little attention. Here we review what is known about the development of multisensory integration, with a focus on the circuitry and experiential antecedents of its development in the model system of the multisensory (i.e., deep) layers of the superior colliculus. Of particular interest here are two sets of experimental observations: (1) cortical influences appear essential for multisensory integration in the SC, and (2) postnatal experience guides its maturation. The current belief is that the experience normally gained during early life is instantiated in the cortico-SC projection, and that this is the primary route by which ecological pressures adapt SC multisensory integration to the particular environment in which it will be used.
Affiliation(s)
- Barry E Stein
- Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Medical Center Blvd., Winston-Salem, NC 27157-1010, USA.
59
Abstract
Studies of children treated for dense cataract shed light on the extent to which pattern stimulation drives normal visual development and whether there are sensitive periods during which an abnormal visual environment is especially detrimental. Here, we summarize the findings to date into five general principles: (1) At least for low-level vision, aspects of vision that develop the earliest are the least likely to be adversely affected by abnormal visual input whereas those that develop later are affected more severely. (2) Early visual input is necessary to preserve the neural infrastructure for later visual learning, even for visual capabilities that will not appear until later in development. (3) The development of both the dorsal and ventral streams depends on normal visual input. (4) After monocular deprivation has been treated by surgical removal of the cataractous lens, the interactions between the aphakic and phakic eyes are competitive for low-level vision but are complementary for high-level vision. (5) There are multiple sensitive periods during which experience can influence visual development. The studies described here have important implications for understanding normal development. They indicate that patterned visual input immediately after birth plays a vital role in the construction and preservation of the neural architecture that will later mediate sensitivity to both basic and higher level aspects of vision. The period during which patterned visual input is necessary for normal visual development varies widely across different aspects of vision and can range from only a few months after birth to more than the first 10 years of life. The results point to new research questions on why early visual deprivation can cause later deficits, what limits adult plasticity, and whether effective rehabilitation in other areas can provide new clues for the treatment of amblyopia.
60
Barutchu A, Crewther DP, Crewther SG. The race that precedes coactivation: development of multisensory facilitation in children. Dev Sci 2009; 12:464-73. [PMID: 19371371 DOI: 10.1111/j.1467-7687.2008.00782.x]
Abstract
RATIONALE The facilitating effect of multisensory integration on motor responses in adults is much larger than predicted by race-models and is in accordance with the idea of coactivation. However, the development of multisensory facilitation of endogenously driven motor processes and its relationship to the development of complex cognitive skills in school-age children is largely unexplored. METHOD Twenty adults and 95 children were allocated into six age groups: 6, 7, 8, 9, 10-11 and adults. Participants' motor reaction times (MRTs) and accuracy in response to the detection of auditory, visual and audiovisual stimuli were recorded. Children's reading accuracy and nonverbal IQ were also assessed. RESULTS In general, MRTs of children were significantly slower with greater variability than those of adults. Although the average level of multisensory facilitation was similar for all age groups, mean cumulative density functions (CDFs) showed that multisensory facilitation in 6 and 10-11-year-olds is within the predictive limits of race-models. Where coactivation was seen in the CDF of individual children, it was not as strong or as consistent as that in adults. The degree of multisensory facilitation did not correlate with age, reading accuracy or IQ. CONCLUSION The average level of multisensory facilitation of endogenously driven motor responses does not change gradually with age, nor is it related to intelligence or reading accuracy. In general, multisensory integration remains immature until 10-11 years of age and lies within the predicted confines of race-models.
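The race-model bound against which such CDFs are compared is usually tested with Miller's inequality over empirical reaction-time distributions. A minimal sketch, with a hypothetical function name and made-up reaction times (not the study's data):

```python
import numpy as np

def race_model_test(rt_a, rt_v, rt_av, t_grid):
    """Miller's race-model inequality: a race (probability-summation)
    model requires P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t).
    Returns the pointwise difference; positive values indicate
    coactivation, i.e. facilitation beyond the race-model bound."""
    def ecdf(samples, t):
        # Fraction of sampled RTs at or below each time point t.
        return np.mean(np.asarray(samples, dtype=float)[:, None] <= t, axis=0)

    bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
    return ecdf(rt_av, t_grid) - bound

# Hypothetical RTs (ms): every bimodal response is faster than both
# unimodal sets, so the inequality is violated at early time points.
t = np.arange(150.0, 400.0, 10.0)
violation = race_model_test([300, 320, 340], [310, 330, 350], [200, 220, 240], t)
```

In practice the difference is evaluated per participant over RT quantiles; any reliably positive region rules out the whole class of race models.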
Affiliation(s)
- Ayla Barutchu
- School of Psychological Science, La Trobe University, Bundoora, Australia
61
Röder B, Föcker J, Hötting K, Spence C. Spatial coordinate systems for tactile spatial attention depend on developmental vision: evidence from event-related potentials in sighted and congenitally blind adult humans. Eur J Neurosci 2008; 28:475-83. [DOI: 10.1111/j.1460-9568.2008.06352.x]
62
Gori M, Del Viva M, Sandini G, Burr DC. Young Children Do Not Integrate Visual and Haptic Form Information. Curr Biol 2008; 18:694-8. [PMID: 18450446 DOI: 10.1016/j.cub.2008.04.036]
Affiliation(s)
- Monica Gori
- Istituto Italiano di Tecnologia, Genoa, Italy
63
Batterson VG, Rose SA, Yonas A, Grant KS, Sackett GP. The effect of experience on the development of tactual-visual transfer in pigtailed macaque monkeys. Dev Psychobiol 2008; 50:88-96. [PMID: 18085561 DOI: 10.1002/dev.20256]
Abstract
The study described here is the first to experimentally demonstrate the effects of experience on the development of tactual-visual transfer. Infant pigtailed macaque monkeys (Macaca nemestrina) were reared from birth to 2 months of age in special cages that allowed the separation of tactual and visual experience. When assessed on a battery of measures at the end of the 2-month period, animals reared without the opportunity to integrate information across the two sensory modalities performed at chance levels on a paired-comparison measure of tactual-visual transfer and performed worse than controls in a visually guided reaching task. After living in the standard laboratory environment for 2 additional months, they were reassessed. While their visually guided reaching now no longer differed from that of controls, they continued to perform at chance on the tactual-visual transfer assessment, and their performance on this task was significantly worse than that of the control group. Performance on visual acuity and visual recognition memory measures did not differ between groups at either age, suggesting that the deficit was limited to tactual-visual functioning. The results are discussed in terms of a possible sensitive period during which specific environmental input is required for the development of normal tactual-visual cross-modal processing.
Affiliation(s)
- Virginia Gunderson Batterson
- Department of Comparative Medicine, Center on Human Development and Disability, Washington National Primate Research Center, University of Washington, WA, USA
64
Stein BE, Stanford TR. Multisensory integration: current issues from the perspective of the single neuron. Nat Rev Neurosci 2008; 9:255-66. [PMID: 18354398 DOI: 10.1038/nrn2331]
Abstract
For thousands of years science philosophers have been impressed by how effectively the senses work together to enhance the salience of biologically meaningful events. However, they really had no idea how this was accomplished. Recent insights into the underlying physiological mechanisms reveal that, in at least one circuit, this ability depends on an intimate dialogue among neurons at multiple levels of the neuraxis; this dialogue cannot take place until long after birth and might require a specific kind of experience. Understanding the acquisition and usage of multisensory integration in the midbrain and cerebral cortex of mammals has been aided by a multiplicity of approaches. Here we examine some of the fundamental advances that have been made and some of the challenging questions that remain.
Affiliation(s)
- Barry E Stein
- Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, North Carolina 27157, USA.
65
Shore SE, Koehler S, Oldakowski M, Hughes LF, Syed S. Dorsal cochlear nucleus responses to somatosensory stimulation are enhanced after noise-induced hearing loss. Eur J Neurosci 2008; 27:155-68. [PMID: 18184319 PMCID: PMC2614620 DOI: 10.1111/j.1460-9568.2007.05983.x]
Abstract
Multisensory neurons in the dorsal cochlear nucleus (DCN) achieve their bimodal response properties [Shore (2005) Eur. J. Neurosci., 21, 3334-3348] by integrating auditory input via VIIIth nerve fibers with somatosensory input via the axons of cochlear nucleus granule cells [Shore et al. (2000) J. Comp. Neurol., 419, 271-285; Zhou & Shore (2004) J. Neurosci. Res., 78, 901-907]. A unique feature of multisensory neurons is their propensity for receiving cross-modal compensation following sensory deprivation. Thus, we investigated the possibility that reduction of VIIIth nerve input to the cochlear nucleus results in trigeminal system compensation for the loss of auditory inputs. Responses of DCN neurons to trigeminal and bimodal (trigeminal plus acoustic) stimulation were compared in normal and noise-damaged guinea pigs. The guinea pigs with noise-induced hearing loss had significantly lower thresholds, shorter latencies and durations, and increased amplitudes of response to trigeminal stimulation than normal animals. Noise-damaged animals also showed a greater proportion of inhibitory and a smaller proportion of excitatory responses compared with normal. The number of cells exhibiting bimodal integration, as well as the degree of integration, was enhanced after noise damage. In accordance with the greater proportion of inhibitory responses, bimodal integration was entirely suppressive in the noise-damaged animals with no indication of the bimodal enhancement observed in a sub-set of normal DCN neurons. These results suggest that projections from the trigeminal system to the cochlear nucleus are increased and/or redistributed after hearing loss. Furthermore, the finding that only neurons activated by trigeminal stimulation showed increased spontaneous rates after cochlear damage suggests that somatosensory neurons may play a role in the pathogenesis of tinnitus.
Affiliation(s)
- S E Shore
- Department of Otolaryngology, Kresge Hearing Research Institute, University of Michigan Medical School, Ann Arbor, MI 48109, USA.
66
Carriere BN, Royal DW, Perrault TJ, Morrison SP, Vaughan JW, Stein BE, Wallace MT. Visual deprivation alters the development of cortical multisensory integration. J Neurophysiol 2007; 98:2858-67. [PMID: 17728386 DOI: 10.1152/jn.00587.2007]
Abstract
It has recently been demonstrated that the maturation of normal multisensory circuits in the cortex of the cat takes place over an extended period of postnatal life. Such a finding suggests that the sensory experiences received during this time may play an important role in this developmental process. To test the necessity of sensory experience for normal cortical multisensory development, cats were raised in the absence of visual experience from birth until adulthood, effectively precluding all visual and visual-nonvisual multisensory experiences. As adults, semichronic single-unit recording experiments targeting the anterior ectosylvian sulcus (AES), a well-defined multisensory cortical area in the cat, were initiated and continued at weekly intervals in anesthetized animals. Despite having very little impact on the overall sensory representations in AES, dark-rearing had a substantial impact on the integrative capabilities of multisensory AES neurons. A significant increase was seen in the proportion of multisensory neurons that were modulated by, rather than driven by, a second sensory modality. More important, perhaps, there was a dramatic shift in the percentage of these modulated neurons in which the pairing of weakly effective and spatially and temporally coincident stimuli resulted in response depressions. In normally reared animals such combinations typically give rise to robust response enhancements. These results illustrate the important role sensory experience plays in shaping the development of mature multisensory cortical circuits and suggest that dark-rearing shifts the relative balance of excitation and inhibition in these circuits.
Affiliation(s)
- Brian N Carriere
- Kennedy Center for Research on Human Development, Vanderbilt University, Nashville, Tennessee, USA.
67
Gondan M, Vorberg D, Greenlee MW. Modality shift effects mimic multisensory interactions: an event-related potential study. Exp Brain Res 2007; 182:199-214. [PMID: 17562033 DOI: 10.1007/s00221-007-0982-4]
Abstract
A frequent approach to study interactions of the auditory and the visual system is to measure event-related potentials (ERPs) to auditory, visual, and auditory-visual stimuli (A, V, AV). A nonzero result of the AV - (A + V) comparison indicates that the sensory systems interact at a specific processing stage. Two possible biases weaken the conclusions drawn by this approach: first, subtracting two ERPs from one requires that A, V, and AV do not share any common activity. We have shown before (Gondan and Röder in Brain Res 1073-1074:389-397, 2006) that the problem of common activity can be avoided using an additional tactile stimulus (T) and evaluating the ERP difference (T + TAV) - (TA + TV). A second possible confound is the modality shift effect (MSE): for example, the auditory N1 is increased if an auditory stimulus follows a visual stimulus, whereas it is smaller if the modality is unchanged (ipsimodal stimulus). Bimodal stimuli might be affected less by MSEs because at least one component always matches the preceding trial. Consequently, an apparent amplitude modulation of the N1 would be observed in AV. We tested the influence of MSEs on auditory-visual interactions by comparing the results of AV - (A + V) using (a) all stimuli and using (b) only ipsimodal stimuli. (a) and (b) differed around 150 ms; this indicates that AV - (A + V) is indeed affected by the MSE. We then formally and empirically demonstrate that (T + TAV) - (TA + TV) is robust against possible biases due to the MSE.
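The common-activity confound described here can be made concrete with toy waveforms (illustrative numbers only, not the study's data): when every condition shares a component, AV - (A + V) fails to cancel it, while the balanced contrast (T + TAV) - (TA + TV) removes it exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                                # time points in the ERP epoch
common = rng.normal(size=n)            # activity shared by every condition
                                       # (e.g. anticipatory potentials)

# Toy ERPs built purely additively, i.e. with no true multisensory interaction.
erp_a   = common + 1.0                 # auditory
erp_v   = common + 0.5                 # visual
erp_t   = common + 0.8                 # tactile
erp_av  = common + 1.0 + 0.5
erp_ta  = common + 0.8 + 1.0
erp_tv  = common + 0.8 + 0.5
erp_tav = common + 0.8 + 1.0 + 0.5

# AV - (A + V): 'common' appears once on the left but twice on the right,
# so the difference is contaminated even though nothing interacts.
classic = erp_av - (erp_a + erp_v)

# (T + TAV) - (TA + TV): two ERPs on each side, so 'common' cancels exactly.
balanced = (erp_t + erp_tav) - (erp_ta + erp_tv)
```

With real data, a nonzero `balanced` contrast can therefore be attributed to a genuine auditory-visual interaction rather than to shared baseline activity.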
Affiliation(s)
- Matthias Gondan
- Department of Psychology, University of Regensburg, 93050 Regensburg, Germany.
68
Abstract
Congruent information conveyed over different sensory modalities often facilitates a variety of cognitive processes, including speech perception (Sumby & Pollack, 1954). Since auditory processing is substantially faster than visual processing, auditory-visual integration can occur over a surprisingly wide temporal window (Stein, 1998). We investigated the processing architecture mediating the integration of acoustic digit names with corresponding symbolic visual forms. The digits "1" or "2" were presented in auditory, visual, or bimodal format at several stimulus onset asynchronies (SOAs; 0, 75, 150, and 225 msec). The reaction times (RTs) for echoing unimodal auditory stimuli were approximately 100 msec faster than the RTs for naming their visual forms. Correspondingly, bimodal facilitation violated race model predictions, but only at SOA values greater than 75 msec. These results indicate that the acoustic and visual information are pooled prior to verbal response programming. However, full expression of this bimodal summation is dependent on the central coincidence of the visual and auditory inputs. These results are considered in the context of studies demonstrating multimodal activation of regions involved in speech production.
69
Röder B, Kusmierek A, Spence C, Schicke T. Developmental vision determines the reference frame for the multisensory control of action. Proc Natl Acad Sci U S A 2007; 104:4753-8. [PMID: 17360596 PMCID: PMC1838672 DOI: 10.1073/pnas.0607158104]
Abstract
Both animal and human studies suggest that action goals are defined in external coordinates regardless of their sensory modality. The present study used an auditory-manual task to test whether the default use of such an external reference frame is innately determined or instead acquired during development because of the increasing dominance of vision over manual control. In Experiment I, congenitally blind, late blind, and age-matched sighted adults had to press a left or right response key depending on the bandwidth of pink noise bursts presented from either the left or right loudspeaker. Although the spatial location of the sounds was entirely task-irrelevant, all groups responded more efficiently with uncrossed hands when the sound was presented from the same side as the responding hand ("Simon effect"). This effect reversed with crossed hands only in the congenitally blind: They responded faster with the hand that was located contralateral to the sound source. In Experiment II, the instruction to the participants was changed: They now had to respond with the hand located next to the sound source. In contrast to Experiment I ("Simon-task"), this task required an explicit matching of the sound's location with the position of the responding hand. In Experiment II, the congenitally blind participants showed a significantly larger crossing deficit than both the sighted and late blind adults. This pattern of results implies that developmental vision induces the default use of an external coordinate frame for multisensory action control; this facilitates not only visual but also auditory-manual control.
Affiliation(s)
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, D-20146 Hamburg, Germany
- Anna Kusmierek
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, D-20146 Hamburg, Germany
- Charles Spence
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford OX1 3UD, England
- Tobias Schicke
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, D-20146 Hamburg, Germany
70
Abstract
Multisensory integration refers to the process by which the brain synthesizes information from different senses to enhance sensitivity to external events. In the present experiments, animals were reared in an altered sensory environment in which visual and auditory stimuli were temporally coupled but originated from different locations. Neurons in the superior colliculus developed a seemingly anomalous form of multisensory integration in which spatially disparate visual-auditory stimuli were integrated in the same way that neurons in normally reared animals integrated visual-auditory stimuli from the same location. The data suggest that the principles governing multisensory integration are highly plastic and that there is no a priori spatial relationship between stimuli from different senses that is required for their integration. Rather, these principles appear to be established early in life based on the specific features of an animal's environment to best adapt it to deal with that environment later in life.
Affiliation(s)
- Mark T Wallace
- Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, North Carolina, USA.
71
Neil PA, Chee-Ruiter C, Scheier C, Lewkowicz DJ, Shimojo S. Development of multisensory spatial integration and perception in humans. Dev Sci 2006; 9:454-64. [PMID: 16911447] [DOI: 10.1111/j.1467-7687.2006.00512.x]
Abstract
Previous studies have shown that adults respond faster and more reliably to bimodal compared to unimodal localization cues. The current study investigated for the first time the development of audiovisual (A-V) integration in spatial localization behavior in infants between 1 and 10 months of age. We observed infants' head and eye movements in response to auditory, visual, or both kinds of stimuli presented either 25 degrees or 45 degrees to the right or left of midline. Infants under 8 months of age intermittently showed response latencies significantly faster toward audiovisual targets than toward either auditory or visual targets alone. They did so, however, without exhibiting a reliable violation of the Race Model, suggesting that probability summation alone could explain the faster bimodal response. In contrast, infants between 8 and 10 months of age exhibited bimodal response latencies significantly faster than unimodal latencies for both eccentricity conditions, and their latencies violated the Race Model at 25 degrees eccentricity. In addition to this main finding, we found age-dependent eccentricity and modality effects on response latencies. Together, these findings suggest that audiovisual integration emerges late in the first year of life and are consistent with neurophysiological findings from multisensory sites in the superior colliculus of infant monkeys showing that multisensory enhancement of responsiveness is not present at birth but emerges later in life.
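The Race Model referenced in this abstract is commonly tested with Miller's race-model inequality, which bounds the bimodal reaction-time distribution by the sum of the unimodal ones: F_AV(t) ≤ F_A(t) + F_V(t). A violation means the bimodal speedup exceeds what probability summation alone can produce. The sketch below is a generic illustration of that bound, not the authors' exact analysis; all names and sample values are illustrative.

```python
import numpy as np

def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times at time t."""
    return float(np.mean(np.asarray(rts) <= t))

def race_model_violations(rt_av, rt_a, rt_v, times):
    """Return the time points at which Miller's race-model inequality
    F_AV(t) <= F_A(t) + F_V(t) is violated, i.e. where the bimodal
    response-time distribution is faster than probability summation
    over the two unimodal distributions can explain."""
    return [t for t in times
            if ecdf(rt_av, t) > ecdf(rt_a, t) + ecdf(rt_v, t)]
```

In practice the inequality is evaluated at a set of quantiles of the pooled distributions; any time point where the bimodal CDF exceeds the unimodal sum counts as evidence against a simple race between independent channels.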
Affiliation(s)
- Patricia A Neil
- Computation and Neural Systems Department, California Institute of Technology, USA.
72
Lewkowicz DJ, Ghazanfar AA. The decline of cross-species intersensory perception in human infants. Proc Natl Acad Sci U S A 2006; 103:6771-4. [PMID: 16618919] [PMCID: PMC1458955] [DOI: 10.1073/pnas.0602027103]
Abstract
Between 6 and 10 months of age, infants become better at discriminating among native voices and human faces and worse at discriminating among nonnative voices and other species' faces. We tested whether these unisensory perceptual narrowing effects reflect a general ontogenetic feature of perceptual systems by testing across sensory modalities. We showed pairs of monkey faces producing two different vocalizations to 4-, 6-, 8-, and 10-month-old infants and asked whether they would prefer to look at the corresponding face when they heard one of the two vocalizations. Only the two youngest groups exhibited intersensory matching, indicating that perceptual narrowing is pan-sensory and a fundamental feature of perceptual development.
Affiliation(s)
- David J Lewkowicz
- Department of Psychology, Florida Atlantic University, 777 Glades Road, Boca Raton, FL 33431, USA.
73
Wu CWH, Bichot NP, Kaas JH. Somatosensory areas S2 and PV project to the superior colliculus of a prosimian primate, Galago garnetti. Somatosens Mot Res 2006; 22:221-31. [PMID: 16338830] [DOI: 10.1080/08990220500262661]
Abstract
As part of an effort to describe the connections of the somatosensory system in Galago garnetti, a small prosimian primate, injections of tracers into cortex revealed that two somatosensory areas, the second somatosensory area (S2) and the parietal ventral somatosensory area (PV), project densely to the ipsilateral superior colliculus, while the primary somatosensory area (S1 or area 3b) does not. The three cortical areas were defined in microelectrode mapping experiments and recordings were used to identify appropriate injection sites in the same cases. Injections of wheat germ agglutinin conjugated with horseradish peroxidase (WGA-HRP) were placed in S1 in different mediolateral locations representing body regions from toes to face in five galagos, and none of these injections labeled projections to the superior colliculus. In contrast, each of the two injections in the face representation of S2 in two galagos and three injections in face and forelimb representations of PV in three galagos produced dense patches of labeled terminations and axons in the intermediate gray (layer IV) over the full extent of the superior colliculus. The results suggest that the higher-order somatosensory areas, PV and S2, are directly involved in the visuomotor functions of the superior colliculus in prosimian primates, while S1 is not. The somatosensory inputs appear to be too widespread to contribute to a detailed somatotopic representation in the superior colliculus, but they may be a source of somatosensory modulation of retinotopically guided oculomotor instructions.
Affiliation(s)
- Carolyn W-H Wu
- Department of Psychology, Vanderbilt University, Nashville, TN 37240, USA
74
Gondan M, Röder B. A new method for detecting interactions between the senses in event-related potentials. Brain Res 2006; 1073-1074:389-97. [PMID: 16427613] [DOI: 10.1016/j.brainres.2005.12.050]
Abstract
Event-related potentials (ERPs) can be used in multisensory research to determine the point in time when different senses start to interact, for example, the auditory and the visual system. For this purpose, the ERP to bimodal stimuli (AV) is often compared to the sum of the ERPs to auditory (A) and visual (V) stimuli: AV - (A + V). If the result is non-zero, this is interpreted as an indicator for multisensory interactions. Using this method, several studies have demonstrated auditory-visual interactions as early as 50 ms after stimulus onset. The subtraction requires that A, V, and AV do not contain common activity: This activity would be subtracted twice from one ERP and would, therefore, contaminate the result. In the present study, ERPs to unimodal, bimodal, and trimodal auditory, visual, and tactile stimuli (T) were recorded. We demonstrate that (T + TAV) - (TA + TV) is equivalent to AV - (A + V), but common activity is eliminated because two ERPs are subtracted from two others. With this new comparison technique, the first auditory-visual interaction starts around 80 ms after stimulus onset for the present experimental setting. It is possible to apply the new comparison method to other brain imaging techniques, as well, e.g. functional magnetic resonance imaging.
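The algebra behind this entry's balanced contrast can be checked numerically: if every recorded ERP contains some activity common to all conditions (e.g. anticipatory potentials), AV - (A + V) retains that common term, while (T + TAV) - (TA + TV) cancels it because two ERPs are subtracted from two others. A minimal simulation with made-up signal vectors, not real ERP data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100                                  # number of time points
common = rng.normal(size=n)              # activity common to every ERP
interaction = np.zeros(n)                # assume no true AV interaction
a, v, t = rng.normal(size=(3, n))        # modality-specific responses

# Each measured ERP = modality-specific activity + common activity.
erp_a, erp_v, erp_t = a + common, v + common, t + common
erp_av = a + v + interaction + common
erp_ta, erp_tv = t + a + common, t + v + common
erp_tav = t + a + v + interaction + common

# Classic contrast: the common term is subtracted twice, so -common remains.
classic = erp_av - (erp_a + erp_v)
# Balanced contrast: two ERPs minus two ERPs, so the common term cancels.
balanced = (erp_t + erp_tav) - (erp_ta + erp_tv)
```

Here `balanced` recovers exactly the (zero) interaction term, while `classic` is contaminated by `-common`.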
Affiliation(s)
- Matthias Gondan
- Department of Experimental Psychology, University of Regensburg, D-93050 Regensburg, Germany.
75
Tsurudome K, Li X, Matsumoto N. Intracellular and current source density analyses of somatosensory input to the optic tectum of the frog. Brain Res 2005; 1064:32-41. [PMID: 16289401] [DOI: 10.1016/j.brainres.2005.09.064]
Abstract
This is the first report of current source density (CSD) and intracellular analyses of non-optic processing in the frog optic tectum. Sciatic nerve stimulation was used to test for somatosensory input to the optic tectum. To map the distribution of somatosensory input, field potentials were recorded from the whole surface of both tecta. Two components were observed: an early component was found over the whole area, but a late component was detected only in medial and caudal regions of the contralateral tectum. The effects of varying stimulus intensity suggested that the optic tectum receives mainly tactile input via fast-conducting, low-threshold afferents from the sciatic nerve. The CSD analysis suggests that somatosensory afferents terminate on tectal neurons with vertically extending dendrites at the medial site of the contralateral optic tectum, where the late component was found. Intracellular recordings demonstrated postsynaptic potentials in the middle and deeper layers, which is consistent with results from the mammalian superior colliculus in earlier studies. Additional stimulation of the optic tract demonstrated that some somatosensory neurons had bimodal responses. Based upon previous CSD analysis of the tectum using optic tract stimulation, the bimodal neurons in the middle layers appeared to participate in avoidance behavior. All somatosensory responses elicited in these neurons were IPSPs. The findings imply that somatosensory input to the optic tectum suppresses avoidance behavior. A somatosensory effect on prey-catching behavior could not be detected in the present small sample of intracellular recordings.
Affiliation(s)
- Kazuya Tsurudome
- Kyushu Institute of Technology, Graduate School of Life Science and Systems Engineering, Department of Brain Science and Engineering, Hibikino 2-4, Kitakyushu, Fukuoka 808-0196, Japan
76
Senkowski D, Talsma D, Herrmann CS, Woldorff MG. Multisensory processing and oscillatory gamma responses: effects of spatial selective attention. Exp Brain Res 2005; 166:411-26. [PMID: 16151775] [DOI: 10.1007/s00221-005-2381-z]
Abstract
Here we describe an EEG study investigating the interactions between multisensory (audio-visual) integration and spatial attention, using oscillatory gamma-band responses (GBRs). The results include a comparison with previously reported event-related potential (ERP) findings from the same paradigm. Unisensory-auditory (A), unisensory-visual (V), and multisensory (AV) stimuli were presented to the left and right hemispaces while subjects attended to a designated side to detect deviant target stimuli in either sensory modality. For attended multisensory stimuli we observed larger evoked GBRs approximately 40-50 ms post-stimulus over medial-frontal brain areas compared with those same multisensory stimuli when unattended. Further analysis indicated that the integration effect and its attentional enhancement may be caused in part by a stimulus-triggered phase resetting of ongoing gamma-band responses. Interestingly, no such early interaction effects (<90 ms) could be found in the ERP waveforms, suggesting that oscillatory GBRs may be more sensitive than ERPs to these early latency attention effects. Moreover, no GBR attention effects could be found for the unisensory auditory or unisensory visual stimuli, suggesting that attention particularly affects the integrative processing of audiovisual stimuli at these early latencies.
Affiliation(s)
- Daniel Senkowski
- Center for Cognitive Neuroscience, Duke University, Box 90999, Durham, NC 27708-0999, USA
77
Jones SS. Exploration or imitation? The effect of music on 4-week-old infants' tongue protrusions. Infant Behav Dev 2005; 29:126-30. [PMID: 17138267] [DOI: 10.1016/j.infbeh.2005.08.004]
Abstract
In a newborn imitation paradigm, an auditory stimulus--music--replaced the standard adult behavioral model. Alternating intervals of music and silence affected 4-week-old infants' rates of tongue protruding--evidence that tongue protruding is a general response to interesting distal stimuli.
Affiliation(s)
- Susan S Jones
- Department of Psychology, Indiana University, 1101 E. 10th Street, Bloomington, IN 47401, USA.
78
Meyer GF, Wuerger SM, Röhrbein F, Zetzsche C. Low-level integration of auditory and visual motion signals requires spatial co-localisation. Exp Brain Res 2005; 166:538-47. [PMID: 16143858] [DOI: 10.1007/s00221-005-2394-7]
Abstract
It is well known that the detection thresholds for stationary auditory and visual signals are lower if the signals are presented bimodally rather than unimodally, provided the signals coincide in time and space. Recent work on auditory-visual motion detection suggests that the facilitation seen for stationary signals is not seen for motion signals. We investigate the conditions under which motion perception also benefits from the integration of auditory and visual signals. We show that the integration of cross-modal local motion signals that are matched in position and speed is consistent with thresholds predicted by a neural summation model. If the signals are presented in different hemi-fields, move in different directions, or both, then behavioural thresholds are predicted by a probability-summation model. We conclude that cross-modal signals have to be co-localised and co-incident for effective motion integration. We also argue that facilitation is only seen if the signals contain all localisation cues that would be produced by physical objects.
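The probability-summation benchmark used in this abstract has a standard closed form: if two independent detectors have hit probabilities p_a and p_v, and a response occurs when either fires, the predicted detection probability is given below. This is the textbook formula offered for orientation, not the paper's full threshold model.

```python
def probability_summation(p_a, p_v):
    """Detection probability predicted when independent auditory and
    visual channels are monitored and either detection suffices:
    P = 1 - (1 - p_a) * (1 - p_v)."""
    return 1.0 - (1.0 - p_a) * (1.0 - p_v)
```

Thresholds lower than this prediction, as found here for co-localised and co-incident motion signals, indicate genuine neural summation rather than independent-channel statistics.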
Affiliation(s)
- Georg F Meyer
- Centre for Cognitive Neuroscience, School of Psychology, University of Liverpool, Eleanor Rathbone Bldg., Bedford Street South, Liverpool, L69 7AZ, UK.
79
Dharani NE. The role of vestibular system and the cerebellum in adapting to gravitoinertial, spatial orientation and postural challenges of REM sleep. Med Hypotheses 2005; 65:83-9. [PMID: 15893123] [DOI: 10.1016/j.mehy.2005.01.033]
Abstract
The underlying reasons for, and mechanisms of rapid eye movement (REM) sleep events remain a mystery. The mystery has arisen from interpreting REM sleep events as occurring in 'isolation' from the world at large, and phylogenetically ancient brain areas using 'primal' gravity-dependent coordinates, reflexes and stimuli parameters to relay and process information about self and environment. This paper views REM sleep as a phylogenetically older form of wakefulness, wherein the brain uses a gravitoinertial-centred reference frame and an internal self-object model to evaluate and integrate inputs from several sensory systems and to adapt to spatial-temporal disintegration and malignant cholinergic-induced vasodepressor/ventilatory threat. The integration of vestibular and non-vestibular sensory graviceptor signals enables estimation and control of centre of the body mass, position and spatial relationship of body parts, gaze, head and whole-body tilt, spatial orientation and autonomic functions relative to gravity. The vestibulocerebellum and vermis, via vestibular and fastigial nucleus, coordinate inputs and outputs from several sensory systems and modulate the amplitude and duration of 'fight-or-flight' vestibulo-orienting and autonomic 'burst' responses to overcome the ongoing challenges. Resolving multisystem conflicts during the unique stresses (gravitoinertial, hypoxic, thermal, immobilisation, etc.) of REM sleep enables learning, cross-modal plasticity, higher-order integration and multidimensional spatial updating of sensory-motor-cognitive components. This paper aims to generate discussion, reinterpretation and creative testing of this novel hypothesis, which, if experimentally confirmed, has major implications across medicine, bioscience and space physiology, from developmental, clinical, research and theoretical perspectives.
Affiliation(s)
- Nataraj E Dharani
- Royal Australian and New Zealand College of Psychiatrists, 309 La Trobe Street, Melbourne, Victoria 3000, Australia.
80
Talsma D, Woldorff MG. Selective attention and multisensory integration: multiple phases of effects on the evoked brain activity. J Cogn Neurosci 2005; 17:1098-114. [PMID: 16102239] [DOI: 10.1162/0898929054475172]
Abstract
We used event-related potentials (ERPs) to evaluate the role of attention in the integration of visual and auditory features of multisensory objects. This was done by contrasting the ERPs to multisensory stimuli (AV) to the sum of the ERPs to the corresponding auditory-only (A) and visual-only (V) stimuli [i.e., AV vs. (A + V)]. A, V, and AV stimuli were presented in random order to the left and right hemispaces. Subjects attended to a designated side to detect infrequent target stimuli in either modality there. The focus of this report is on the ERPs to the standard (i.e., nontarget) stimuli. We used rapid variable stimulus onset asynchronies (350-650 msec) to mitigate anticipatory activity and included “no-stim” trials to estimate and remove ERP overlap from residual anticipatory processes and from adjacent stimuli in the sequence. Spatial attention effects on the processing of the unisensory stimuli consisted of a modulation of visual P1 and N1 components (at 90-130 msec and 160-200 msec, respectively) and of the auditory N1 and processing negativity (100-200 msec). Attended versus unattended multisensory ERPs elicited a combination of these effects. Multisensory integration effects consisted of an initial frontal positivity around 100 msec that was larger for attended stimuli. This was followed by three phases of centro-medially distributed effects of integration and/or attention beginning at around 160 msec, and peaking at 190 (scalp positivity), 250 (negativity), and 300-500 msec (positivity) after stimulus onset. These integration effects were larger in amplitude for attended than for unattended stimuli, providing neural evidence that attention can modulate multisensory-integration processes at multiple stages.
81
Menning H, Ackermann H, Hertrich I, Mathiak K. Spatial auditory attention is modulated by tactile priming. Exp Brain Res 2005; 164:41-7. [PMID: 15726341] [DOI: 10.1007/s00221-004-2212-7]
Abstract
Previous studies have shown that cross-modal processing affects perception at a variety of neuronal levels. In this study, event-related brain responses were recorded via whole-head magnetoencephalography (MEG). Spatial auditory attention was directed via tactile pre-cues (primes) to one of four locations in the peripersonal space (left and right hand versus face). Auditory stimuli were white noise bursts, convolved with head-related transfer functions, which ensured spatial perception of the four locations. Tactile primes (200-300 ms prior to acoustic onset) were applied randomly to one of these locations. Attentional load was controlled by three different visual distraction tasks. The auditory P50m (about 50 ms after stimulus onset) showed a significant "proximity" effect (larger responses to face stimulation) as well as a "contralaterality" effect between the side of stimulation and the hemisphere. The tactile primes essentially reduced both the P50m and N100m components. However, facial tactile pre-stimulation yielded an enhanced ipsilateral N100m. These results show that earlier responses are mainly governed by exogenous stimulus properties, whereas cross-sensory interaction is spatially selective at a later (endogenous) processing stage.
Affiliation(s)
- Hans Menning
- MEG Center, Center for Neurology, University of Tübingen, Otfried-Müller-Str. 47, 72076, Tübingen, Germany
82
Skaliora I, Doubell TP, Holmes NP, Nodal FR, King AJ. Functional topography of converging visual and auditory inputs to neurons in the rat superior colliculus. J Neurophysiol 2004; 92:2933-46. [PMID: 15229210] [DOI: 10.1152/jn.00450.2004]
Abstract
We have used a slice preparation of the infant rat midbrain to examine converging inputs onto neurons in the deeper multisensory layers of the superior colliculus (dSC). Electrical stimulation of the superficial visual layers (sSC) and of the auditory nucleus of the brachium of the inferior colliculus (nBIC) evoked robust monosynaptic responses in dSC cells. Furthermore, the inputs from the sSC were found to be topographically organized as early as the second postnatal week and thus before opening of the eyes and ear canals. This precocious topography was found to be sculpted by GABAA-mediated inhibition of a more widespread set of connections. Tracer injections in the nBIC, both in coronal slices as well as in hemisected brains, confirmed a robust projection originating in the nBIC with distinct terminals in the proximity of the cell bodies of dSC neurons. Combined stimulation of the sSC and nBIC sites revealed that the presumptive visual and auditory inputs are summed linearly. Finally, whereas either input on its own could manifest a significant degree of paired-pulse facilitation, temporally offset stimulation of the two sites revealed no synaptic interactions, indicating again that the two inputs function independently. Taken together, these data provide the first detailed intracellular analysis of convergent sensory inputs onto dSC neurons and form the basis for further exploration of multisensory integration and developmental plasticity.
Affiliation(s)
- Irini Skaliora
- University Laboratory of Physiology, University of Oxford, Oxford OX1 3PT, UK.
83
Hötting K, Rösler F, Röder B. Altered auditory-tactile interactions in congenitally blind humans: an event-related potential study. Exp Brain Res 2004; 159:370-81. [PMID: 15241575] [DOI: 10.1007/s00221-004-1965-3]
Abstract
It has been shown that stimuli of a task-irrelevant modality receive enhanced processing when they are presented at an attended location in space (crossmodal attention). The present study investigated the effects of visual deprivation on the interaction of the intact sensory systems. Random streams of tactile and auditory stimuli were presented at the left or right index finger of congenitally blind participants. They had to attend to one modality (auditory or tactile) of one side (left or right) and had to respond to deviant stimuli of the attended modality and side. While in a group of sighted participants, early event-related potentials (ERPs) were negatively displaced to stimuli presented at the attended position, compared to the unattended, for both the task-relevant and the task-irrelevant modality, starting as early as 80 ms after stimulus onset (unimodal and crossmodal spatial attention effects, respectively), corresponding crossmodal effects could not be detected in the blind. In the sighted, spatial attention effects after 200 ms were only significant for the task-relevant modality, whereas a crossmodal effect for this late time window was observed in the blind. This positive rather than negative effect possibly indicates an active suppression of task-irrelevant stimuli at an attended location in space. The present data suggest that developmental visual input is essential for the use of space to integrate input of the non-visual modalities, possibly because of its high spatial resolution. Alternatively, enhanced perceptual skills of the blind within the intact modalities may result in reduced multisensory interactions ("inverse effectiveness of multisensory integration").
Affiliation(s)
- Kirsten Hötting
- Department of Psychology, Philipps-University Marburg, Gutenbergstrasse 18, 35032 Marburg, Germany.
84
85
86
Anastasio TJ, Patton PE. A two-stage unsupervised learning algorithm reproduces multisensory enhancement in a neural network model of the corticotectal system. J Neurosci 2003; 23:6713-27. [PMID: 12890764] [PMCID: PMC6740726]
Abstract
Multisensory enhancement (MSE) is the augmentation of the response to sensory stimulation of one modality by stimulation of a different modality. It has been described for multisensory neurons in the deep superior colliculus (DSC) of mammals, which function to detect, and direct orienting movements toward, the sources of stimulation (targets). MSE would seem to improve the ability of DSC neurons to detect targets, but many mammalian DSC neurons are unimodal. MSE requires descending input to DSC from certain regions of parietal cortex. Paradoxically, the descending projections necessary for MSE originate from unimodal cortical neurons. MSE, and the puzzling findings associated with it, can be simulated using a model of the corticotectal system. In the model, a network of DSC units receives primary sensory input that can be augmented by modulatory cortical input. Connection weights from primary and modulatory inputs are trained in stages one (Hebb) and two (Hebb-anti-Hebb), respectively, of an unsupervised two-stage algorithm. Two-stage training causes DSC units to extract information concerning simulated targets from their inputs. It also causes the DSC to develop a mixture of unimodal and multisensory units. The percentage of DSC multisensory units is determined by the proportion of cross-modal targets and by primary input ambiguity. Multisensory DSC units develop MSE, which depends on unimodal modulatory connections. Removal of the modulatory influence greatly reduces MSE but has little effect on DSC unit responses to stimuli of a single modality. The correspondence between model and data suggests that two-stage training captures important features of self-organization in the real corticotectal system.
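The two update rules combined in this entry's two-stage algorithm can be caricatured in a few lines. This is a toy illustration of Hebbian versus anti-Hebbian learning, not the authors' corticotectal model; the weights, inputs, and learning rate are made up.

```python
import numpy as np

def hebb_step(w, x, y, eta=0.1):
    """Hebbian update: strengthen each weight in proportion to the
    correlation between its input and the unit's output."""
    return w + eta * y * x

def anti_hebb_step(w, x, y, eta=0.1):
    """Anti-Hebbian update: weaken correlated weights, useful in a
    second stage to decorrelate or prune connections."""
    return w - eta * y * x

# Repeatedly presenting one input pattern during the Hebbian stage
# selectively strengthens the weights that carry it.
w = np.array([0.1, 0.1])
x = np.array([1.0, 0.0])      # only the first input channel is active
for _ in range(10):
    y = float(w @ x)          # linear unit response
    w = hebb_step(w, x, y)
```

After this loop only the weight on the active channel has grown; an anti-Hebbian stage applied to a second, modulatory weight set would instead shrink correlated weights, which is the flavor of the Hebb-anti-Hebb stage described above.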
Affiliation(s)
- Thomas J Anastasio
- Department of Molecular and Integrative Physiology, University of Illinois at Urbana/Champaign, Urbana, Illinois 61801, USA.
87
Doubell TP, Skaliora I, Baron J, King AJ. Functional connectivity between the superficial and deeper layers of the superior colliculus: an anatomical substrate for sensorimotor integration. J Neurosci 2003; 23:6596-607. [PMID: 12878701] [PMCID: PMC6740636]
Abstract
The superior colliculus (SC) transforms both visual and nonvisual sensory signals into motor commands that control orienting behavior. Although the afferent and efferent connections of this midbrain nucleus have been well characterized, little is known about the intrinsic circuitry involved in sensorimotor integration. Transmission of visual signals from the superficial (sSC) to the deeper layers (dSC) of the SC has been implicated in both the triggering of orienting movements and the activity-dependent processes that align maps of different sensory modalities during development. However, evidence for the synaptic connectivity appropriate for these functions is lacking. In this study, we used a variety of anatomical and physiological methods to examine the functional organization of the sSC-dSC pathway in juvenile and adult ferrets. Axonal tracing in adult ferrets showed that, as in other species, sSC neurons project topographically to the dSC, providing a route for the transmission of visual signals to the multisensory output layers of the SC. We found that sSC axons terminate on dSC neurons that stain prominently for the NR1 subunit of the NMDA receptor, a subpopulation of which were identified as tectoreticulospinal projection neurons. We also show that the sSC-dSC pathway is topographically organized and mediated by monosynaptic excitatory synapses even before eye opening in young ferrets, suggesting that visual signals routed via the sSC may influence the activity of dSC neurons before the emergence of their multisensory response properties. These findings indicate that superficial- to deep-layer projections provide spatially ordered visual signals, both during development and into adulthood, directly to SC neurons that are involved in coordinating sensory inputs with motor outputs.
Affiliation(s)
- Timothy P Doubell
- University Laboratory of Physiology, University of Oxford, Oxford OX1 3PT, United Kingdom.