1
Jertberg RM, Begeer S, Geurts HM, Chakrabarti B, Van der Burg E. Age, not autism, influences multisensory integration of speech stimuli among adults in a McGurk/MacDonald paradigm. Eur J Neurosci 2024; 59:2979-2994. [PMID: 38570828] [DOI: 10.1111/ejn.16319]
Abstract
Differences between autistic and non-autistic individuals in perception of the temporal relationships between sights and sounds are theorized to underlie difficulties in integrating relevant sensory information. These, in turn, are thought to contribute to problems with speech perception and higher-level social behaviour. However, the literature establishing this connection often involves limited sample sizes and focuses almost entirely on children. To determine whether these differences persist into adulthood, we compared 496 autistic and 373 non-autistic adults (aged 17 to 75 years). Participants completed an online version of the McGurk/MacDonald paradigm, a multisensory illusion indicative of the ability to integrate audiovisual speech stimuli. Audiovisual asynchrony was manipulated, and participants responded both to the syllable they perceived (revealing their susceptibility to the illusion) and to whether or not the audio and video were synchronized (allowing insight into temporal processing). In contrast with prior research with smaller, younger samples, we detected no evidence of impaired temporal or multisensory processing in autistic adults. Instead, we found that in both groups, multisensory integration correlated strongly with age. This contradicts prior presumptions that differences in multisensory perception persist and even increase in magnitude over the lifespan of autistic individuals. It also suggests that the compensatory role multisensory integration may play as the individual senses decline with age is intact. These findings challenge existing theories and provide an optimistic perspective on autistic development. They also underline the importance of expanding autism research to better reflect the age range of the autistic population.
Affiliation(s)
- Robert M Jertberg
- Department of Clinical and Developmental Psychology, Vrije Universiteit Amsterdam, The Netherlands and Amsterdam Public Health Research Institute, Amsterdam, Netherlands
- Sander Begeer
- Department of Clinical and Developmental Psychology, Vrije Universiteit Amsterdam, The Netherlands and Amsterdam Public Health Research Institute, Amsterdam, Netherlands
- Hilde M Geurts
- Dutch Autism and ADHD Research Center (d'Arc), Brain & Cognition, Department of Psychology, Universiteit van Amsterdam, Amsterdam, The Netherlands
- Leo Kannerhuis (Youz/Parnassiagroup), Den Haag, The Netherlands
- Bhismadev Chakrabarti
- Centre for Autism, School of Psychology and Clinical Language Sciences, University of Reading, Reading, UK
- India Autism Center, Kolkata, India
- Department of Psychology, Ashoka University, Sonipat, India
- Erik Van der Burg
- Dutch Autism and ADHD Research Center (d'Arc), Brain & Cognition, Department of Psychology, Universiteit van Amsterdam, Amsterdam, The Netherlands
2
Basharat A, Mehrabi S, Muñoz JE, Middleton LE, Cao S, Boger J, Barnett-Cowan M. Virtual reality as a tool to explore multisensory processing before and after engagement in physical activity. Front Aging Neurosci 2023; 15:1207651. [PMID: 38020766] [PMCID: PMC10652573] [DOI: 10.3389/fnagi.2023.1207651]
Abstract
Introduction: This pilot study employed a non-randomized controlled trial design to explore the impact of physical activity within a virtual reality (VR) environment on multisensory processing among community-dwelling older adults.
Methods: The investigation compared both chronic (over 6 weeks) and acute effects of VR-based physical activity with a reading control group. The evaluation metrics for multisensory processing included audiovisual response time (RT), simultaneity judgments (SJ), the sound-induced flash illusion (SIFI), and temporal order judgments (TOJ). A total of 13 older adults were provided with VR headsets featuring custom-designed games, while another 14 older adults were assigned to a reading-based control group.
Results: Acute engagement in physical activity led to higher accuracy in the SIFI task (experimental group: 85.6%; control group: 78.2%; p = 0.037). Additionally, both chronic and acute physical activity resulted in quicker response times (chronic: experimental group = 336.92, control group = 381.31, p = 0.012; acute: experimental group = 333.38, control group = 383.09, p = 0.006). Although the reading group showed a non-significant trend toward greater improvement in mean RT, covariate analyses revealed that this discrepancy was attributable to the older age of the reading group.
Discussion: The findings suggest that immersive VR has potential utility for enhancing multisensory processing in older adults. However, future studies must rigorously control for participant variables like age and sex to ensure more accurate comparisons between experimental and control conditions.
Affiliation(s)
- Aysha Basharat
- Department of Kinesiology and Health Sciences, Faculty of Health, University of Waterloo, Waterloo, ON, Canada
- Samira Mehrabi
- Department of Kinesiology and Health Sciences, Faculty of Health, University of Waterloo, Waterloo, ON, Canada
- John E. Muñoz
- Department of Systems Design Engineering, Faculty of Engineering, University of Waterloo, Waterloo, ON, Canada
- Shi Cao
- Department of Systems Design Engineering, Faculty of Engineering, University of Waterloo, Waterloo, ON, Canada
- Michael Barnett-Cowan
- Department of Kinesiology and Health Sciences, Faculty of Health, University of Waterloo, Waterloo, ON, Canada
3
Pepper JL, Nuttall HE. Age-Related Changes to Multisensory Integration and Audiovisual Speech Perception. Brain Sci 2023; 13:1126. [PMID: 37626483] [PMCID: PMC10452685] [DOI: 10.3390/brainsci13081126]
Abstract
Multisensory integration is essential for the quick and accurate perception of our environment, particularly in everyday tasks like speech perception. Research has highlighted the importance of investigating bottom-up and top-down contributions to multisensory integration and how these change as a function of ageing. Specifically, perceptual factors like the temporal binding window and cognitive factors like attention and inhibition appear to be fundamental in the integration of visual and auditory information, an integration that may become less efficient as we age. These factors have been linked to brain areas like the superior temporal sulcus, with neural oscillations in the alpha-band frequency also being implicated in multisensory processing. Age-related changes in multisensory integration may have significant consequences for the well-being of our increasingly ageing population, affecting their ability to communicate with others and safely move through their environment; it is crucial that the evidence surrounding this subject continues to be carefully investigated. This review will discuss research into age-related changes in the perceptual and cognitive mechanisms of multisensory integration and the impact that these changes have on speech perception and fall risk. The role of oscillatory alpha activity is of particular interest, as it may be key in the modulation of multisensory integration.
Affiliation(s)
- Helen E. Nuttall
- Department of Psychology, Lancaster University, Bailrigg LA1 4YF, UK
4
Basharat A, Thayanithy A, Barnett-Cowan M. A Scoping Review of Audiovisual Integration Methodology: Screening for Auditory and Visual Impairment in Younger and Older Adults. Front Aging Neurosci 2022; 13:772112. [PMID: 35153716] [PMCID: PMC8829696] [DOI: 10.3389/fnagi.2021.772112]
Abstract
With the rise of the aging population, many scientists studying multisensory integration have turned toward understanding how this process may change with age. This scoping review was conducted to understand and describe the scope and rigor with which researchers studying audiovisual sensory integration screen for hearing and vision impairment. A structured search in three licensed databases (Scopus, PubMed, and PsycINFO) using the key concepts of multisensory integration, audiovisual modality, and aging revealed 2,462 articles, which were screened for inclusion by two reviewers. Articles were included if they (1) tested healthy older adults (minimum mean or median age of 60) with younger adults as a comparison (mean or median age between 18 and 35), (2) measured auditory and visual integration, (3) were written in English, and (4) reported behavioral outcomes. Articles were excluded if they (1) tested taste exclusively, (2) tested olfaction exclusively, (3) tested somatosensation exclusively, (4) tested emotion perception, (5) were not written in English, (6) were clinical commentaries, editorials, interviews, letters, newspaper articles, abstracts only, or non-peer-reviewed literature (e.g., theses), or (7) focused on neuroimaging without a behavioral component. Data pertaining to the details of each study (e.g., country and year of publication) were extracted; of greater importance to our research question, data pertaining to the screening measures used for hearing and vision impairment (e.g., the type of test used, whether hearing and visual aids were worn, and the thresholds applied) were extracted, collated, and summarized. Our search revealed that only 64% of studies screened for age-abnormal hearing impairment and 51% for age-abnormal vision impairment, and that consistent definitions of normal or abnormal vision and hearing were not used among the studies that screened for sensory abilities.
A total of 1,624 younger adults and 4,778 older participants were included in the scoping review, with males composing approximately 44% and females 56% of the total sample; most of the data were obtained from only four countries. We recommend that studies investigating the effects of aging on multisensory integration screen for normal vision and hearing using the World Health Organization's (WHO) hearing loss and visual impairment cut-off scores in order to maintain consistency across aging research. Because mild cognitive impairment (MCI) has been defined as a “transitional” or “transitory” stage between normal aging and dementia, and because approximately 3–5% of the aging population will develop MCI each year, researchers who aim to study a healthy aging population should also screen appropriately for MCI. One of our secondary aims was to determine how often researchers screened for cognitive impairment and which tests they used to do so. Our results revealed that only 55 of 72 studies tested for neurological and cognitive function, and only a subset used standardized tests. Additionally, among the studies that used standardized tests, the cut-off scores were not always adequate for screening out mild cognitive impairment. A further secondary aim of this scoping review was to assess the scope of this problem and to determine whether a meta-analysis could feasibly be conducted in the future to evaluate the results quantitatively (i.e., whether findings from studies using self-reported vision and hearing screening differ significantly from those measuring vision and hearing impairment in the lab). We found that it may not be feasible to conduct a meta-analysis with the entire dataset of this scoping review; however, a meta-analysis could be conducted if stricter parameters are used (e.g., focusing on accuracy or response time data only). Systematic Review Registration: https://doi.org/10.17605/OSF.IO/GTUHD.
Collapse
|
5
Dias JW, McClaskey CM, Harris KC. Audiovisual speech is more than the sum of its parts: Auditory-visual superadditivity compensates for age-related declines in audible and lipread speech intelligibility. Psychol Aging 2021; 36:520-530. [PMID: 34124922] [PMCID: PMC8427734] [DOI: 10.1037/pag0000613]
Abstract
Multisensory input can improve perception of ambiguous unisensory information. For example, speech heard in noise can be more accurately identified when listeners see a speaker's articulating face. Importantly, these multisensory effects can be superadditive to listeners' ability to process unisensory speech, such that audiovisual speech identification is better than the sum of auditory-only and visual-only speech identification. Age-related declines in auditory and visual speech perception have been hypothesized to be concomitant with stronger cross-sensory influences on audiovisual speech identification, but little evidence exists to support this. Currently, studies do not account for the multisensory superadditive benefit of auditory-visual input in their metrics of the auditory or visual influence on audiovisual speech perception. Here we treat multisensory superadditivity as independent from unisensory auditory and visual processing. In the current investigation, older and younger adults identified auditory, visual, and audiovisual speech in noisy listening conditions. Performance across these conditions was used to compute conventional metrics of the auditory and visual influence on audiovisual speech identification and a metric of auditory-visual superadditivity. Consistent with past work, auditory and visual speech identification declined with age, audiovisual speech identification was preserved, and no age-related differences in the auditory or visual influence on audiovisual speech identification were observed. However, we found that auditory-visual superadditivity improved with age. The novel findings suggest that multisensory superadditivity is independent of unisensory processing. As auditory and visual speech identification decline with age, compensatory changes in multisensory superadditivity may preserve audiovisual speech identification in older adults.
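The contrast between a conventional enhancement metric and superadditivity can be made concrete with a small sketch. The formulas and numbers below are illustrative textbook definitions, not necessarily the exact metrics computed in the study:

```python
# Illustrative sketch (assumed formulas, not the study's exact metrics):
# conventional visual enhancement vs. audiovisual superadditivity.

def superadditivity(acc_a: float, acc_v: float, acc_av: float) -> float:
    """Bimodal accuracy beyond the sum of the unisensory accuracies;
    positive values mean AV speech is 'more than the sum of its parts'."""
    return acc_av - (acc_a + acc_v)

def visual_enhancement(acc_a: float, acc_av: float) -> float:
    """Conventional visual-influence metric: gain over auditory-only
    performance, normalized by the room left for improvement."""
    return (acc_av - acc_a) / (1.0 - acc_a)

# Toy values: 30% auditory-only, 20% visual-only, 65% audiovisual correct.
print(superadditivity(0.30, 0.20, 0.65))   # ≈ 0.15, i.e., superadditive
print(visual_enhancement(0.30, 0.65))      # ≈ 0.50
```

The point of the abstract is that the first quantity can change with age even when the second does not, which is why the authors treat them as independent measures.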
Affiliation(s)
- James W Dias
- Department of Otolaryngology-Head and Neck Surgery
6
Kenney DM, Jabbari Y, von Mohrenschildt M, Shedden JM. Visual-vestibular integration is preserved with healthy aging in a simple acceleration detection task. Neurobiol Aging 2021; 104:71-81. [PMID: 33975121] [DOI: 10.1016/j.neurobiolaging.2021.03.017]
Abstract
Aging is associated with a gradual decline in the sensory systems and noisier sensory information. Some research has found that older adults compensate for this with enhanced multisensory integration. However, less is known about how aging influences visual-vestibular integration, an ability that underlies self-motion perception. We examined how visual-vestibular integration changes across the lifespan (18-79 years old) with a simple reaction time task. Participants were instructed to respond to visual (optic flow) and vestibular (inertial motion) acceleration cues, presented either alone or separated by a stimulus onset asynchrony. We measured reaction times and computed the violation area relative to the race model inequality as a measure of visual-vestibular integration. Across all ages, the greatest visual-vestibular integration occurred when the vestibular cue was presented first. Age was associated with longer reaction times and a significantly lower detection rate in the vestibular-only condition, a finding consistent with an age-related increase in vestibular noise. Although the relationship between age and visual-vestibular integration was positive, the effect size was very small and did not reach statistical significance. Our results suggest that although age is associated with a significant increase in vestibular perceptual threshold, the relative amount of visual-vestibular integration remains largely intact.
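The "violation area relative to the race model inequality" used as the integration measure above can be sketched in a few lines. This is a generic Miller-style race model test; the function names, toy reaction times, and time grid are illustrative, not the study's code:

```python
# Hedged sketch of a race model inequality test: does the bimodal RT
# distribution beat the bound set by the two unisensory distributions?
from bisect import bisect_right

def ecdf(rts, t):
    """Proportion of reaction times <= t (empirical CDF at time t)."""
    srt = sorted(rts)
    return bisect_right(srt, t) / len(srt)

def race_violation_area(rt_a, rt_v, rt_av, grid):
    """Area (in ms) by which the bimodal CDF exceeds the race model
    bound min(1, F_A + F_V); a positive area is taken as evidence of
    genuine multisensory integration rather than statistical facilitation."""
    excess = []
    for t in grid:
        bound = min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
        excess.append(max(0.0, ecdf(rt_av, t) - bound))
    # trapezoidal integration of the violation over the time grid
    area = 0.0
    for i in range(1, len(grid)):
        area += (excess[i - 1] + excess[i]) / 2.0 * (grid[i] - grid[i - 1])
    return area

# Toy data: bimodal responses faster than either unisensory condition.
rt_vis, rt_vest, rt_both = [300, 400], [300, 400], [200, 250]
print(race_violation_area(rt_vis, rt_vest, rt_both,
                          [200, 225, 250, 275, 300]))  # → 68.75
```

A violation area of zero (or negative excess everywhere) is consistent with independent parallel processing of the two cues; only excess above the summed unisensory CDFs counts toward integration.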
Affiliation(s)
- Darren M Kenney
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada
- Yasaman Jabbari
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada
- Judith M Shedden
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada
7
Jones SA, Noppeney U. Ageing and multisensory integration: A review of the evidence, and a computational perspective. Cortex 2021; 138:1-23. [PMID: 33676086] [DOI: 10.1016/j.cortex.2021.02.001]
Abstract
The processing of multisensory signals is crucial for effective interaction with the environment, but our ability to perform this vital function changes as we age. In the first part of this review, we summarise existing research into the effects of healthy ageing on multisensory integration. We note that age differences vary substantially with the paradigms and stimuli used: older adults often receive at least as much benefit (to both accuracy and response times) as younger controls from congruent multisensory stimuli, but are also consistently more negatively impacted by the presence of intersensory conflict. In the second part, we outline a normative Bayesian framework that provides a principled and computationally informed perspective on the key ingredients involved in multisensory perception, and how these are affected by ageing. Applying this framework to the existing literature, we conclude that changes to sensory reliability, prior expectations (together with attentional control), and decisional strategies all contribute to the age differences observed. However, we find no compelling evidence of any age-related changes to the basic inference mechanisms involved in multisensory perception.
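The reliability-weighted combination at the core of the normative Bayesian framework described above can be written out explicitly. This is a textbook forced-fusion sketch with made-up numbers, not the review's own model or code:

```python
# Minimal sketch of reliability-weighted (forced-fusion) cue combination
# (assumed textbook formulation; values are illustrative).

def fused_estimate(s_a, var_a, s_v, var_v):
    """Bayes-optimal fused location: each cue is weighted by its
    reliability (inverse variance), and the fused variance is lower
    than that of either cue alone."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)
    s_hat = w_a * s_a + (1.0 - w_a) * s_v
    var_hat = 1.0 / (1.0 / var_a + 1.0 / var_v)
    return s_hat, var_hat

# Reliable vision (variance 1) dominates noisy audition (variance 4):
# the fused estimate sits near the visual cue, with reduced uncertainty.
print(fused_estimate(10.0, 4.0, 6.0, 1.0))   # ≈ (6.8, 0.8)

# An ageing-style scenario: if auditory variance grows (4 -> 9), the
# visual weight grows further and the estimate shifts toward vision.
print(fused_estimate(10.0, 9.0, 6.0, 1.0))
```

This illustrates the review's point that age-related changes in sensory reliability alone can shift multisensory behaviour, without any change to the underlying inference mechanism.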
Affiliation(s)
- Samuel A Jones
- The Staffordshire Centre for Psychological Research, Staffordshire University, Stoke-on-Trent, UK
- Uta Noppeney
- Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, the Netherlands
8
Park H, Nannt J, Kayser C. Sensory- and memory-related drivers for altered ventriloquism effects and aftereffects in older adults. Cortex 2021; 135:298-310. [PMID: 33422888] [PMCID: PMC7856550] [DOI: 10.1016/j.cortex.2020.12.001]
Abstract
The manner in which humans exploit multisensory information for subsequent decisions changes with age. Multiple causes for such age effects have been discussed, including reduced precision in peripheral sensory representations, changes in cognitive inference about causal relations between sensory cues, and a decline in memory contributing to altered sequential patterns of multisensory behaviour. To dissociate these putative contributions, we investigated how healthy young and older adults integrate audio-visual spatial information within trials (the ventriloquism effect) and between trials (the ventriloquism aftereffect). With both model-free and (Bayesian) model-based analyses, we found that both biases differed between groups. Our results attribute the age-related change in the ventriloquism bias to a decline in spatial hearing rather than to a change in cognitive processes. This decline in peripheral function, combined with a more prominent influence from preceding responses rather than preceding stimuli in the elderly, can also explain the observed age effect in the ventriloquism aftereffect. Our results suggest a transition, with age, from a sensory- to a behaviour-driven influence of past multisensory experience on perceptual decisions, due to reduced sensory precision and changes in memory capacity.
Affiliation(s)
- Hame Park
- Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Julia Nannt
- Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Christoph Kayser
- Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Bielefeld, Germany; Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
9
Michaelis K, Erickson LC, Fama ME, Skipper-Kallal LM, Xing S, Lacey EH, Anbari Z, Norato G, Rauschecker JP, Turkeltaub PE. Effects of age and left hemisphere lesions on audiovisual integration of speech. Brain Lang 2020; 206:104812. [PMID: 32447050] [PMCID: PMC7379161] [DOI: 10.1016/j.bandl.2020.104812]
Abstract
Neuroimaging studies have implicated left temporal lobe regions in audiovisual integration of speech and inferior parietal regions in temporal binding of incoming signals. However, it remains unclear which regions are necessary for audiovisual integration, especially when the auditory and visual signals are offset in time. Aging also influences integration, but the nature of this influence is unresolved. We used a McGurk task to test audiovisual integration and sensitivity to the timing of audiovisual signals in two older adult groups: left hemisphere stroke survivors and controls. We observed a positive relationship between age and audiovisual speech integration in both groups, and an interaction indicating that lesions reduce sensitivity to timing offsets between signals. Lesion-symptom mapping demonstrated that damage to the left supramarginal gyrus and planum temporale reduces temporal acuity in audiovisual speech perception. This suggests that a process mediated by these structures identifies asynchronous audiovisual signals that should not be integrated.
Affiliation(s)
- Kelly Michaelis
- Neurology Department and Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington DC, USA
- Laura C Erickson
- Neurology Department and Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington DC, USA; Neuroscience Department, Georgetown University Medical Center, Washington DC, USA
- Mackenzie E Fama
- Neurology Department and Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington DC, USA; Department of Speech-Language Pathology & Audiology, Towson University, Towson, MD, USA
- Laura M Skipper-Kallal
- Neurology Department and Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington DC, USA
- Shihui Xing
- Neurology Department and Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington DC, USA; Department of Neurology, First Affiliated Hospital of Sun Yat-Sen University, Guangzhou, China
- Elizabeth H Lacey
- Neurology Department and Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington DC, USA; Research Division, MedStar National Rehabilitation Hospital, Washington DC, USA
- Zainab Anbari
- Neurology Department and Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington DC, USA
- Gina Norato
- Clinical Trials Unit, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, MD, USA
- Josef P Rauschecker
- Neuroscience Department, Georgetown University Medical Center, Washington DC, USA
- Peter E Turkeltaub
- Neurology Department and Center for Brain Plasticity and Recovery, Georgetown University Medical Center, Washington DC, USA; Research Division, MedStar National Rehabilitation Hospital, Washington DC, USA
10
Wallace MT, Woynaroski TG, Stevenson RA. Multisensory Integration as a Window into Orderly and Disrupted Cognition and Communication. Annu Rev Psychol 2020; 71:193-219. [DOI: 10.1146/annurev-psych-010419-051112]
Abstract
During our everyday lives, we are confronted with a vast amount of information from several sensory modalities. This multisensory information needs to be appropriately integrated for us to effectively engage with and learn from our world. Research carried out over the last half century has provided new insights into the way such multisensory processing improves human performance and perception; the neurophysiological foundations of multisensory function; the time course for its development; how multisensory abilities differ in clinical populations; and, most recently, the links between multisensory processing and cognitive abilities. This review summarizes the extant literature on multisensory function in typical and atypical circumstances, discusses the implications of the work carried out to date for theory and research, and points toward next steps for advancing the field.
Affiliation(s)
- Mark T. Wallace
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee 37232, USA
- Departments of Psychology and Pharmacology, Vanderbilt University, Nashville, Tennessee 37232, USA
- Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, Tennessee 37232, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee 37232, USA
- Vanderbilt Kennedy Center, Nashville, Tennessee 37203, USA
- Tiffany G. Woynaroski
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee 37232, USA
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee 37232, USA
- Vanderbilt Kennedy Center, Nashville, Tennessee 37203, USA
- Ryan A. Stevenson
- Departments of Psychology and Psychiatry and Program in Neuroscience, University of Western Ontario, London, Ontario N6A 3K7, Canada
- Brain and Mind Institute, University of Western Ontario, London, Ontario N6A 3K7, Canada
11
Jones SA, Beierholm U, Meijer D, Noppeney U. Older adults sacrifice response speed to preserve multisensory integration performance. Neurobiol Aging 2019; 84:148-157. [PMID: 31586863] [DOI: 10.1016/j.neurobiolaging.2019.08.017]
Abstract
Aging has been shown to impact multisensory perception, but the underlying computational mechanisms are unclear. For effective interactions with the environment, observers should integrate signals that share a common source, weighted by their reliabilities, and segregate those from separate sources. Observers are thought to accumulate evidence about the world's causal structure over time until a decisional threshold is reached. Combining psychophysics and Bayesian modeling, we investigated how aging affects audiovisual perception of spatial signals. Older and younger adults were comparable in their final localization and common-source judgment responses under both speeded and unspeeded conditions, but were disproportionately slower for audiovisually incongruent trials. Bayesian modeling showed that aging did not affect the ability to arbitrate between integration and segregation under either unspeeded or speeded conditions. However, modeling the within-trial dynamics of evidence accumulation under speeded conditions revealed that older observers accumulate noisier auditory representations for longer, set higher decisional thresholds, and have impaired motor speed. Older observers preserve audiovisual localization performance, despite noisier sensory representations, by sacrificing response speed.
Affiliation(s)
- Samuel A Jones
- Computational Cognitive Neuroimaging Laboratory, Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK; The Staffordshire Centre for Psychological Research, Staffordshire University, Stoke-on-Trent, UK
- David Meijer
- Computational Cognitive Neuroimaging Laboratory, Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
- Uta Noppeney
- Computational Cognitive Neuroimaging Laboratory, Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
12
Brown VA, Hedayati M, Zanger A, Mayn S, Ray L, Dillman-Hasso N, Strand JF. What accounts for individual differences in susceptibility to the McGurk effect? PLoS One 2018; 13:e0207160. [PMID: 30418995] [PMCID: PMC6231656] [DOI: 10.1371/journal.pone.0207160]
Abstract
The McGurk effect is a classic audiovisual speech illusion in which discrepant auditory and visual syllables can lead to a fused percept (e.g., an auditory /bɑ/ paired with a visual /gɑ/ often leads to the perception of /dɑ/). The McGurk effect is robust and easily replicated in pooled group data, but there is tremendous variability in the extent to which individual participants are susceptible to it. In some studies, the rate at which individuals report fusion responses ranges from 0% to 100%. Despite its widespread use in the audiovisual speech perception literature, the roots of the wide variability in McGurk susceptibility are largely unknown. This study evaluated whether several perceptual and cognitive traits are related to McGurk susceptibility through correlational analyses and mixed effects modeling. We found that an individual's susceptibility to the McGurk effect was related to their ability to extract place of articulation information from the visual signal (i.e., a more fine-grained analysis of lipreading ability), but not to scores on tasks measuring attentional control, processing speed, working memory capacity, or auditory perceptual gradiency. These results provide support for the claim that a small amount of the variability in susceptibility to the McGurk effect is attributable to lipreading skill. In contrast, cognitive and perceptual abilities that are commonly used predictors in individual differences studies do not appear to underlie susceptibility to the McGurk effect.
Affiliation(s)
- Violet A. Brown, Maryam Hedayati, Annie Zanger, Sasha Mayn, Lucia Ray, Naseem Dillman-Hasso, Julia F. Strand: Department of Psychology, Carleton College, Northfield, Minnesota, United States of America
13
Brooks CJ, Chan YM, Anderson AJ, McKendrick AM. Audiovisual Temporal Perception in Aging: The Role of Multisensory Integration and Age-Related Sensory Loss. Front Hum Neurosci 2018; 12:192. [PMID: 29867415 PMCID: PMC5954093 DOI: 10.3389/fnhum.2018.00192]
Abstract
Within each sensory modality, age-related deficits in temporal perception contribute to the difficulties older adults experience when performing everyday tasks. Since perceptual experience is inherently multisensory, older adults also face the added challenge of appropriately integrating or segregating the auditory and visual cues present in our dynamic environment into coherent representations of distinct objects. As such, many studies have investigated how older adults perform when integrating temporal information across audition and vision. This review covers both direct judgments about temporal information (the sound-induced flash illusion, temporal order, perceived synchrony, and temporal rate discrimination) and judgments regarding stimuli containing temporal information (the audiovisual bounce effect and speech perception). Although an age-related increase in integration has been demonstrated on a variety of tasks, research specifically investigating the ability of older adults to integrate temporal auditory and visual cues has produced disparate results. In this short review, we explore what factors could underlie these divergent findings. We conclude that both task-specific differences and age-related sensory loss play a role in the reported disparity in age-related effects on the integration of auditory and visual temporal information.
Affiliation(s)
- Cassandra J Brooks, Yu Man Chan, Andrew J Anderson, Allison M McKendrick: Department of Optometry and Vision Sciences, The University of Melbourne, Melbourne, VIC, Australia
14
Basharat A, Adams MS, Staines WR, Barnett-Cowan M. Simultaneity and Temporal Order Judgments Are Coded Differently and Change With Age: An Event-Related Potential Study. Front Integr Neurosci 2018; 12:15. [PMID: 29755327 PMCID: PMC5932149 DOI: 10.3389/fnint.2018.00015]
Abstract
Multisensory integration is required for a number of daily living tasks where the inability to accurately identify simultaneity and temporality of multisensory events results in errors in judgment leading to poor decision-making and dangerous behavior. Previously, our lab discovered that older adults exhibited impaired timing of audiovisual events, particularly when making temporal order judgments (TOJs). Simultaneity judgments (SJs), however, were preserved across the lifespan. Here, we investigate the difference between the TOJ and SJ tasks in younger and older adults to assess neural processing differences between these two tasks and across the lifespan. Event-related potentials (ERPs) were studied to determine between-task and between-age differences. Results revealed task specific differences in perceiving simultaneity and temporal order, suggesting that each task may be subserved via different neural mechanisms. Here, auditory N1 and visual P1 ERP amplitudes confirmed that unisensory processing of audiovisual stimuli did not differ between the two tasks within both younger and older groups, indicating that performance differences between tasks arise either from multisensory integration or higher-level decision-making. Compared to younger adults, older adults showed a sustained higher auditory N1 ERP amplitude response across SOAs, suggestive of broader response properties from an extended temporal binding window. Our work provides compelling evidence that different neural mechanisms subserve the SJ and TOJ tasks and that simultaneity and temporal order perception are coded differently and change with age.
Affiliation(s)
- Aysha Basharat, Meaghan S Adams, William R Staines: Department of Kinesiology, University of Waterloo, Waterloo, ON, Canada
15
Billino J, Drewing K. Age Effects on Visuo-Haptic Length Discrimination: Evidence for Optimal Integration of Senses in Senior Adults. Multisens Res 2018; 31:273-300. [PMID: 31264626 DOI: 10.1163/22134808-00002601]
Abstract
Demographic changes in most developed societies have fostered research on functional aging. While cognitive changes have been characterized elaborately, understanding of perceptual aging lags behind. We investigated age effects on the mechanisms by which multiple sources of sensory information are merged into a common percept. We studied visuo-haptic integration in a length discrimination task. A total of 24 young (20-25 years) and 27 senior (69-77 years) adults compared standard stimuli to appropriate sets of comparison stimuli. Standard stimuli were explored under visual, haptic, or visuo-haptic conditions. The task procedure allowed us to introduce an intersensory conflict using anamorphic lenses. Comparison stimuli were exclusively explored haptically. We derived psychometric functions for each condition, determining points of subjective equality and discrimination thresholds. We notably evaluated visuo-haptic perception by different models of multisensory processing, i.e., the Maximum-Likelihood-Estimate (MLE) model of optimal cue integration, a suboptimal integration model, and a cue switching model. Our results support robust visuo-haptic integration across the adult lifespan. We found suboptimal weighted averaging of sensory sources in young adults; senior adults, however, exploited differential sensory reliabilities more efficiently to optimize thresholds. Indeed, evaluation of the MLE model indicates that young adults underweighted visual cues by more than 30%; in contrast, visual weights of senior adults deviated only by about 3% from predictions. We suggest that close to optimal multisensory integration might contribute to successful compensation for age-related sensory losses and provides a critical resource. Differentiation between multisensory integration during healthy aging and age-related pathological challenges to the sensory systems awaits further exploration.
Affiliation(s)
- Jutta Billino, Knut Drewing: Department of Psychology, Justus-Liebig-Universität, Otto-Behaghel-Str. 10F, 35394 Giessen, Germany
16
Neural Mechanisms Underlying Cross-Modal Phonetic Encoding. J Neurosci 2017; 38:1835-1849. [PMID: 29263241 DOI: 10.1523/jneurosci.1566-17.2017]
Abstract
Audiovisual (AV) integration is essential for speech comprehension, especially in adverse listening situations. Divergent, but not mutually exclusive, theories have been proposed to explain the neural mechanisms underlying AV integration. One theory advocates that this process occurs via interactions between the auditory and visual cortices, as opposed to fusion of AV percepts in a multisensory integrator. Building upon this idea, we proposed that AV integration in spoken language reflects visually induced weighting of phonetic representations at the auditory cortex. EEG was recorded while male and female human subjects watched and listened to videos of a speaker uttering consonant vowel (CV) syllables /ba/ and /fa/, presented in Auditory-only, AV congruent or incongruent contexts. Subjects reported whether they heard /ba/ or /fa/. We hypothesized that vision alters phonetic encoding by dynamically weighting which phonetic representation in the auditory cortex is strengthened or weakened. That is, when subjects are presented with visual /fa/ and acoustic /ba/ and hear /fa/ (illusion-fa), the visual input strengthens the weighting of the phone /f/ representation. When subjects are presented with visual /ba/ and acoustic /fa/ and hear /ba/ (illusion-ba), the visual input weakens the weighting of the phone /f/ representation. Indeed, we found an enlarged N1 auditory evoked potential when subjects perceived illusion-ba, and a reduced N1 when they perceived illusion-fa, mirroring the N1 behavior for /ba/ and /fa/ in Auditory-only settings. These effects were especially pronounced in individuals with more robust illusory perception. These findings provide evidence that visual speech modifies phonetic encoding at the auditory cortex.
SIGNIFICANCE STATEMENT: The current study presents evidence that audiovisual integration in spoken language occurs when one modality (vision) acts on representations of a second modality (audition). Using the McGurk illusion, we show that visual context primes phonetic representations at the auditory cortex, altering the auditory percept, evidenced by changes in the N1 auditory evoked potential. This finding reinforces the theory that audiovisual integration occurs via visual networks influencing phonetic representations in the auditory cortex. We believe that this will lead to the generation of new hypotheses regarding cross-modal mapping, particularly whether it occurs via direct or indirect routes (e.g., via a multisensory mediator).
17
Abstract
Purpose of Review: The integration of information across sensory modalities into unified percepts is a fundamental sensory process upon which a multitude of cognitive processes are based. We review the body of literature exploring aging-related changes in audiovisual integration published over the last five years. Specifically, we review the impact of changes in temporal processing, the influence of the effectiveness of sensory inputs, the role of working memory, and newer studies of intra-individual variability during these processes.
Recent Findings: Work in the last five years on bottom-up influences on sensory perception has garnered significant attention. Temporal processing, a driving factor of multisensory integration, has now been shown to decouple from multisensory integration in aging, despite their joint decline. The impact of stimulus effectiveness also changes with age: older adults show maximal benefit from multisensory gain at high signal-to-noise ratios. Following sensory decline, high working memory capacity has now been shown to be somewhat of a protective factor against age-related declines in audiovisual speech perception, particularly in noise. Finally, newer research is emerging that focuses on the general intra-individual variability observed with aging.
Summary: Overall, the studies of the past five years have replicated and expanded on previous work that highlights the role of bottom-up sensory changes with aging and their influence on audiovisual integration, as well as the top-down influence of working memory.
Affiliation(s)
- Sarah H Baum, Department of Psychology, University of Washington
- Ryan Stevenson, Department of Psychology, Western University; Brain and Mind Institute, Western University; Department of Psychiatry, Schulich School of Medicine and Dentistry, Western University; Program in Neuroscience, Schulich School of Medicine and Dentistry, Western University; Centre for Vision Research, York University
18
Brown DR, Cavanagh JF. The sound and the fury: Late positive potential is sensitive to sound affect. Psychophysiology 2017; 54:1812-1825. [PMID: 28726287 DOI: 10.1111/psyp.12959]
Abstract
Emotion is an emergent construct of multiple distinct neural processes. EEG is uniquely sensitive to real-time neural computations, and thus is a promising tool to study the construction of emotion. This series of studies aimed to probe the mechanistic contribution of the late positive potential (LPP) to multimodal emotion perception. Experiment 1 revealed that LPP amplitudes for visual images, sounds, and visual images paired with sounds were larger for negatively rated stimuli than for neutrally rated stimuli. Experiment 2 manipulated this audiovisual enhancement by altering the valence pairings with congruent (e.g., positive audio + positive visual) or conflicting emotional pairs (e.g., positive audio + negative visual). Negative visual stimuli evoked larger early LPP amplitudes than positive visual stimuli, regardless of sound pairing. However, time frequency analyses revealed significant midfrontal theta-band power differences for conflicting over congruent stimulus pairs, suggesting very early (∼500 ms) detection of thematic fidelity violations. Interestingly, late LPP modulations showed the opposite congruency pattern, with larger LPP amplitudes for congruent than for conflicting pairs. Together, these findings suggest that enhanced parietal activity for affective valence is modality independent and sensitive to complex affective processes. Furthermore, these findings suggest that altered neural activities for affective visual stimuli are enhanced by concurrent affective sounds, paving the way toward an understanding of the construction of multimodal affective experience.
Affiliation(s)
- Darin R Brown, James F Cavanagh: Department of Psychology, University of New Mexico, Albuquerque, New Mexico, USA
19
Festa EK, Katz AP, Ott BR, Tremont G, Heindel WC. Dissociable Effects of Aging and Mild Cognitive Impairment on Bottom-Up Audiovisual Integration. J Alzheimers Dis 2017; 59:155-167. [DOI: 10.3233/jad-161062]
Affiliation(s)
- Elena K. Festa, Department of Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA
- Andrew P. Katz, Department of Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA
- Brian R. Ott, Department of Neurology, Alpert Medical School, Brown University, Providence, RI, USA; Department of Neurology, Rhode Island Hospital, Providence, RI, USA
- Geoffrey Tremont, Department of Psychiatry and Human Behavior, Alpert Medical School, Brown University, Providence, RI, USA; Department of Psychiatry, Rhode Island Hospital, Providence, RI, USA
- William C. Heindel, Department of Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA
20
Skilled musicians are not subject to the McGurk effect. Sci Rep 2016; 6:30423. [PMID: 27453363 PMCID: PMC4958963 DOI: 10.1038/srep30423]
Abstract
The McGurk effect is a compelling illusion in which humans auditorily perceive mismatched audiovisual speech as a completely different syllable. This study provides evidence that professional musicians are not subject to the illusion, possibly because of their finer auditory or attentional abilities. Eighty healthy, age-matched graduate students volunteered for the study; 40 were musicians from the Luca Marenzio Conservatory of Music in Brescia with at least 8–13 years of academic musical training. The phonemes /la/, /da/, /ta/, /ga/, /ka/, /na/, /ba/, and /pa/ were presented to participants in audiovisual congruent and incongruent conditions, or in unimodal (visual-only or auditory-only) conditions, while they were engaged in syllable recognition tasks. Overall, musicians showed no significant McGurk effect for any of the phonemes. Controls showed a marked McGurk effect for several phonemes (including alveolar-nasal, velar-occlusive and bilabial ones). The results indicate that early, intensive musical training might affect the way the auditory cortex processes phonetic information.
21
Altieri N, Yang CT. Parallel linear dynamic models can mimic the McGurk effect in clinical populations. J Comput Neurosci 2016; 41:143-55. [PMID: 27272510 DOI: 10.1007/s10827-016-0610-z]
Abstract
One of the most common examples of audiovisual speech integration is the McGurk effect. As an example, an auditory syllable /ba/ recorded over incongruent lip movements that produce "ga" typically causes listeners to hear "da". This report hypothesizes reasons why certain clinical populations and listeners who are hard of hearing might be more susceptible to visual influence. Conversely, we also examine why other listeners appear less susceptible to the McGurk effect (i.e., they report hearing just the auditory stimulus without being influenced by the visual signal). These explanations are accompanied by mechanistic accounts of integration phenomena, including visual inhibition of auditory information and a slower rate of accumulation of inputs. First, simulations of a linear dynamic parallel interactive model were instantiated using inhibition and facilitation to examine potential mechanisms underlying integration. In a second set of simulations, we systematically manipulated the inhibition parameter values to model data obtained from listeners with autism spectrum disorder. In summary, we argue that cross-modal inhibition parameter values explain individual variability in McGurk perceptibility. Nonetheless, different mechanisms should continue to be explored in an effort to better understand current data patterns in the audiovisual integration literature.
Affiliation(s)
- Nicholas Altieri, Department of Communication Sciences and Disorders, Idaho State University, 921 S. 8th Ave. Stop 8116, Pocatello, ID, 83209, USA
- Cheng-Ta Yang, Department of Psychology, National Cheng Kung University, No. 1, Daxue Rd, East District, Tainan City, Taiwan, 701
22
Ramkhalawansingh R, Keshavarz B, Haycock B, Shahab S, Campos JL. Age Differences in Visual-Auditory Self-Motion Perception during a Simulated Driving Task. Front Psychol 2016; 7:595. [PMID: 27199829 PMCID: PMC4848465 DOI: 10.3389/fpsyg.2016.00595]
Abstract
Recent evidence suggests that visual-auditory cue integration may change as a function of age such that integration is heightened among older adults. Our goal was to determine whether these changes in multisensory integration are also observed in the context of self-motion perception under realistic task constraints. Thus, we developed a simulated driving paradigm in which we provided older and younger adults with visual motion cues (i.e., optic flow) and systematically manipulated the presence or absence of congruent auditory cues to self-motion (i.e., engine, tire, and wind sounds). Results demonstrated that the presence or absence of congruent auditory input had different effects on older and younger adults. Both age groups demonstrated a reduction in speed variability when auditory cues were present compared to when they were absent, but older adults demonstrated a proportionally greater reduction in speed variability under combined sensory conditions. These results are consistent with evidence indicating that multisensory integration is heightened in older adults. Importantly, this study is the first to provide evidence to suggest that age differences in multisensory integration may generalize from simple stimulus detection tasks to the integration of the more complex and dynamic visual and auditory cues that are experienced during self-motion.
Affiliation(s)
- Robert Ramkhalawansingh, Research/iDAPT, Toronto Rehabilitation Institute, Toronto, ON, Canada; Department of Psychology, University of Toronto, Toronto, ON, Canada
- Behrang Keshavarz, Research/iDAPT, Toronto Rehabilitation Institute, Toronto, ON, Canada
- Bruce Haycock, Research/iDAPT, Toronto Rehabilitation Institute, Toronto, ON, Canada
- Saba Shahab, Research/iDAPT, Toronto Rehabilitation Institute, Toronto, ON, Canada; Department of Psychology, University of Toronto, Toronto, ON, Canada; Institute of Medical Science, Faculty of Medicine, University of Toronto, Toronto, ON, Canada
- Jennifer L Campos, Research/iDAPT, Toronto Rehabilitation Institute, Toronto, ON, Canada; Department of Psychology, University of Toronto, Toronto, ON, Canada
23
Brooks CJ, Anderson AJ, Roach NW, McGraw PV, McKendrick AM. Age-related changes in auditory and visual interactions in temporal rate perception. J Vis 2016; 15:2. [PMID: 26624937 DOI: 10.1167/15.16.2]
Abstract
We investigated how aging affects the integration of temporal rate for auditory flutter (amplitude modulation) presented with visual flicker. Since older adults were poorer at detecting auditory amplitude modulation, modulation depth was individually adjusted so that temporal rate was equally discriminable for 10 Hz flutter and flicker, thereby balancing the reliability of rate information available to each sensory modality. With age-related sensory differences normalized in this way, rate asynchrony skewed both auditory and visual rate judgments to the same extent in younger and older adults. Therefore, reliability-based weighting of temporal rate is preserved in older adults. Concurrent presentation of synchronous 10 Hz flicker and flutter improved temporal rate discrimination consistent with statistically optimal integration in younger but not older adults. In a control experiment, younger adults were presented with the same physical auditory stimulus as older adults. This time, rate asynchrony skewed perceived rate with greater auditory weighting rather than balanced integration. Taken together, our results indicate that integration of discrepant auditory and visual rates is not altered due to the healthy aging process once sensory deficits are accounted for, but that aging does abolish the minor improvement in discrimination performance seen in younger observers when concordant rates are integrated.
24
White TP, Wigton RL, Joyce DW, Bobin T, Ferragamo C, Wasim N, Lisk S, Shergill SS. Eluding the illusion? Schizophrenia, dopamine and the McGurk effect. Front Hum Neurosci 2014; 8:565. [PMID: 25140138 PMCID: PMC4122162 DOI: 10.3389/fnhum.2014.00565]
Abstract
Perception is inherently probabilistic and can be manipulated to induce illusory experience by the presentation of ambiguous or improbable evidence under selective (spatio-temporal) constraints. Accordingly, perception of the McGurk effect, by which individuals misperceive specific incongruent visual and auditory vocal cues, rests upon effective probabilistic inference. Here, we report findings from a behavioral investigation of illusory perception and related metacognitive evaluation during audiovisual integration, conducted in individuals with schizophrenia (n = 30) and control subjects (n = 24) matched in terms of age, sex, handedness and parental occupation. Controls additionally performed the task after an oral dose of amisulpride (400 mg). Individuals with schizophrenia exhibited illusory perception less frequently than controls, despite non-significant differences in perceptual performance during control conditions. Furthermore, older individuals with schizophrenia exhibited reduced rates of illusory perception. Subsequent analysis revealed a robust inverse relationship between illness chronicity and the illusory perception rate in this group. Controls demonstrated non-significant modulation of perception by amisulpride; amisulpride was, however, found to elicit increases in subjective confidence in perceptual performance. Overall, these findings are consistent with the idea that impairments in probabilistic inference are exhibited in schizophrenia and exacerbated by illness chronicity. The latter suggests that the associated processes are a potentially worthwhile target for therapeutic intervention.
Affiliation(s)
- Thomas P White, Rebekah L Wigton, Dan W Joyce, Tracy Bobin, Christian Ferragamo, Nisha Wasim, Stephen Lisk, Sukhwinder S Shergill: Department of Psychosis Studies, Institute of Psychiatry, King's College London, London, UK
25
Affiliation(s)
- Kaisa Tiippana, Division of Cognitive Psychology and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
26
Sekiyama K, Soshi T, Sakamoto S. Enhanced audiovisual integration with aging in speech perception: a heightened McGurk effect in older adults. Front Psychol 2014; 5:323. [PMID: 24782815 PMCID: PMC3995044 DOI: 10.3389/fpsyg.2014.00323]
Abstract
Two experiments compared young and older adults in order to examine whether aging leads to a larger dependence on visual articulatory movements in auditory-visual speech perception. These experiments examined accuracy and response time in syllable identification for auditory-visual (AV) congruent and incongruent stimuli. There were also auditory-only (AO) and visual-only (VO) presentation modes. Data were analyzed only for participants with normal hearing. It was found that the older adults were more strongly influenced by visual speech than the younger ones for acoustically identical signal-to-noise ratios (SNRs) of auditory speech (Experiment 1). This was also confirmed when the SNRs of auditory speech were calibrated for the equivalent AO accuracy between the two age groups (Experiment 2). There were no aging-related differences in VO lipreading accuracy. Combined with response time data, this enhanced visual influence for the older adults was likely to be associated with an aging-related delay in auditory processing.
Affiliation(s)
- Kaoru Sekiyama, Division of Cognitive Psychology, Faculty of Letters, Kumamoto University, Kumamoto, Japan; Division of Cognitive Psychology, School of Systems Information Science, Future University Hakodate, Japan
- Takahiro Soshi, Division of Cognitive Psychology, Faculty of Letters, Kumamoto University, Kumamoto, Japan