1
Increased [18F]Fluorodeoxyglucose Uptake in the Left Pallidum in Military Veterans with Blast-Related Mild Traumatic Brain Injury: Potential as an Imaging Biomarker and Mediation with Executive Dysfunction and Cognitive Impairment. J Neurotrauma 2024. [PMID: 38661540 DOI: 10.1089/neu.2023.0429]
Abstract
Blast-related mild traumatic brain injury (blast-mTBI) can result in a spectrum of persistent symptoms leading to substantial functional impairment and reduced quality of life. Clinical evaluation and discernment from other conditions common to military service can be challenging and subject to patient recall bias and the limitations of available assessment measures. The need for objective biomarkers to facilitate accurate diagnosis, not just for symptom management and rehabilitation but also for prognostication and disability compensation purposes, is clear. Toward this end, we compared regional brain [18F]fluorodeoxyglucose-positron emission tomography ([18F]FDG-PET) intensity-scaled uptake measurements and motor, neuropsychological, and behavioral assessments between 79 combat Veterans with retrospectively recalled blast-mTBI and 41 control participants with no lifetime history of TBI. Using an agnostic and unbiased approach, we found significantly increased left pallidum [18F]FDG-uptake in Veterans with blast-mTBI versus control participants, p < 0.0001; q = 3.29 × 10⁻⁹ [Cohen's d, 1.38; 95% confidence interval (0.96, 1.79)]. The degree of left pallidum [18F]FDG-uptake correlated with the number of self-reported blast-mTBIs, r2 = 0.22; p < 0.0001. Greater [18F]FDG-uptake in the left pallidum provided excellent discrimination between Veterans with blast-mTBI and controls, with a receiver operating characteristic area under the curve of 0.859 (p < 0.0001) and a likelihood ratio of 21.19 (threshold: SUVR ≥ 0.895). Deficits in executive function, assessed using the Behavior Rating Inventory of Executive Function-Adult Global Executive Composite T-score, were identified in Veterans with blast-mTBI compared with controls, p < 0.0001. Regression-based mediation analyses determined that in Veterans with blast-mTBI, increased [18F]FDG-uptake in the left pallidum mediated executive function impairments, adjusted causal mediation estimate p = 0.021; total effect estimate p = 0.039.
Measures of working and prospective memory (Auditory Consonant Trigrams test and Memory for Intentions Test, respectively) were negatively correlated with left pallidum [18F]FDG-uptake, p < 0.0001, with mTBI as a covariate. Increased left pallidum [18F]FDG-uptake in Veterans with blast-mTBI compared with controls did not covary with dominant handedness or with motor activity assessed using the Unified Parkinson's Disease Rating Scale. Localized increased [18F]FDG-uptake in the left pallidum may reflect a compensatory response to functional deficits following blast-mTBI. Limited imaging resolution does not allow us to distinguish subregions of the pallidum; however, the significant correlation of our data with behavioral but not motor outcomes suggests involvement of the ventral pallidum, which is known to regulate motivation, behavior, and emotions through basal ganglia-thalamo-cortical circuits. Increased [18F]FDG-uptake in the left pallidum in blast-mTBI versus control participants was consistently identified using two different PET scanners, supporting the generalizability of this finding. Although confirmation of our results by single-subject-to-cohort analyses will be required before clinical deployment, this study provides proof of concept that [18F]FDG-PET bears promise as a readily available noninvasive biomarker for blast-mTBI. Further, our findings support a causative relationship between executive dysfunction and increased [18F]FDG-uptake in the left pallidum.
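The discrimination statistics reported above (the ROC area under the curve and the positive likelihood ratio at an SUVR cutoff) can be illustrated with a short, self-contained sketch. The SUVR values below are synthetic stand-ins generated for illustration, not the study data:

```python
import random

def roc_auc(cases, controls):
    # Mann-Whitney formulation of the ROC AUC: the probability that a
    # randomly chosen case has a higher value than a randomly chosen
    # control (ties count half).
    wins = sum((c > k) + 0.5 * (c == k) for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

def positive_likelihood_ratio(cases, controls, threshold):
    # LR+ = sensitivity / (1 - specificity) at the chosen cutoff.
    sensitivity = sum(c >= threshold for c in cases) / len(cases)
    false_pos_rate = sum(k >= threshold for k in controls) / len(controls)
    return sensitivity / false_pos_rate if false_pos_rate else float("inf")

random.seed(0)
# Synthetic intensity-scaled uptake values: 41 controls and 79 blast-mTBI
# cases, with the case distribution shifted upward as in the reported finding.
controls = [random.gauss(0.85, 0.04) for _ in range(41)]
cases = [random.gauss(0.95, 0.05) for _ in range(79)]

print(f"AUC = {roc_auc(cases, controls):.3f}")
print(f"LR+ at SUVR >= 0.895: {positive_likelihood_ratio(cases, controls, 0.895):.2f}")
```

In practice a library routine (e.g., scikit-learn's ROC utilities) would be used; the rank-based form above is shown only to make the two statistics concrete.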
2
Longitudinal Sleep Patterns and Cognitive Impairment in Older Adults. JAMA Netw Open 2023; 6:e2346006. [PMID: 38048131 PMCID: PMC10696486 DOI: 10.1001/jamanetworkopen.2023.46006]
Abstract
Importance Sleep disturbances and clinical sleep disorders are associated with all-cause dementia and neurodegenerative conditions, but it remains unclear how longitudinal changes in sleep impact the incidence of cognitive impairment. Objective To evaluate the association of longitudinal sleep patterns with age-related changes in cognitive function in healthy older adults. Design, Setting, and Participants This study is a retrospective longitudinal analysis of the Seattle Longitudinal Study (SLS), which evaluated self-reported sleep duration (1993-2012) and cognitive performance (1997-2020) in older adults. Participants in the SLS were enrolled as part of a community-based cohort from the Group Health Cooperative of Puget Sound and Health Maintenance Organization of Washington between 1956 and 2020. Data analysis was performed from September 2020 to May 2023. Main Outcomes and Measures The main outcome for this study was cognitive impairment, as defined by subthreshold performance on both the Mini-Mental State Examination and the Mattis Dementia Rating Scale. Sleep duration was defined by self-report of median nightly sleep duration over the last week and was assessed longitudinally over multiple time points. Median sleep duration, sleep phenotype (short sleep, median <7 hours; medium sleep, median = 7 hours; long sleep, median >7 hours), change in sleep duration (slope), and variability in sleep duration (SD of median sleep duration, or sleep variability) were evaluated. Results Of the participants enrolled in the SLS, 1104 who were administered both the Health Behavior Questionnaire and the neuropsychological battery were considered for analysis in this study. A total of 826 individuals (mean [SD] age, 76.3 [11.8] years; 468 women [56.7%]; 217 apolipoprotein E ε4 allele carriers [26.3%]) had complete demographic information and were included in the study.
Analysis using a Cox proportional hazards regression model (concordance, 0.76) showed that status as a short sleeper (hazard ratio, 3.67; 95% CI, 1.59-8.50) and higher sleep variability (hazard ratio, 3.06; 95% CI, 1.14-5.49) were significantly associated with the incidence of cognitive impairment. Conclusions and Relevance In this community-based longitudinal study of the association between sleep patterns and cognitive performance, the short sleep phenotype was significantly associated with impaired cognitive performance. Furthermore, high variability in longitudinal sleep duration was significantly associated with the incidence of cognitive impairment, highlighting the possibility that instability in sleep duration over long periods of time may impact cognitive decline in older adults.
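The three longitudinal sleep measures described above (median duration, slope of change over time, and sleep variability) reduce to simple per-participant summary statistics. A minimal sketch, using an invented participant rather than SLS data:

```python
import statistics

def sleep_features(years, hours):
    # Summarize one participant's longitudinal self-reported sleep:
    # median nightly duration, least-squares slope of duration over time,
    # and the SD of reported durations ("sleep variability").
    n = len(hours)
    mean_x = sum(years) / n
    mean_y = sum(hours) / n
    slope = (
        sum((x - mean_x) * (y - mean_y) for x, y in zip(years, hours))
        / sum((x - mean_x) ** 2 for x in years)
    )
    median = statistics.median(hours)
    # Phenotype per the abstract's grouping around a 7-hour median.
    phenotype = "short" if median < 7 else "long" if median > 7 else "medium"
    return {
        "median": median,
        "slope": slope,
        "variability": statistics.stdev(hours),
        "phenotype": phenotype,
    }

# Hypothetical participant reporting sleep at five study waves.
features = sleep_features([1993, 1997, 2002, 2007, 2012],
                          [7.5, 7.0, 6.5, 8.0, 6.0])
print(features)
```

These per-participant features would then enter a survival model (the Cox regression described above) as covariates.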
3
Instability in longitudinal sleep duration predicts cognitive impairment in aged participants of the Seattle Longitudinal Study. medRxiv 2023:2023.06.07.23291098. [PMID: 37398317 PMCID: PMC10312848 DOI: 10.1101/2023.06.07.23291098]
Abstract
Importance Sleep disturbances and clinical sleep disorders are associated with all-cause dementia and neurodegenerative conditions. It remains unclear how longitudinal changes in sleep impact the incidence of cognitive impairment. Objective To evaluate how longitudinal sleep patterns contribute to age-related changes in cognitive function in healthy adults. Design, Setting, and Participants This study utilizes retrospective longitudinal analyses of a community-based study within Seattle, evaluating self-reported sleep (1993-2012) and cognitive performance (1997-2020) in aged adults. Main Outcomes and Measures The main outcome is cognitive impairment as defined by sub-threshold performance on 2 of 4 neuropsychological batteries: the Mini-Mental State Examination (MMSE), Mattis Dementia Rating Scale, Trail Making Test, and Wechsler Adult Intelligence Scale (Revised). Sleep duration was defined through self-report of 'average nightly sleep duration over the last week' and assessed longitudinally. Median sleep duration, change in sleep duration (slope), variability in sleep duration (standard deviation, Sleep Variability), and sleep phenotype ("Short Sleep," median <7 hrs; "Medium Sleep," median = 7 hrs; "Long Sleep," median >7 hrs) were evaluated. Results A total of 822 individuals (mean [SD] age, 76.2 [11.8] years; 466 women [56.7%]; 216 APOE ε4 allele carriers [26.3%]) were included in the study. Analysis using a Cox proportional hazards regression model (concordance, 0.70) showed that increased Sleep Variability (95% CI [1.27, 3.86]) was significantly associated with the incidence of cognitive impairment. Further analysis using linear regression prediction analysis (R² = 0.201, F(10, 168) = 6.010, p = 2.67E-07) showed that high Sleep Variability (β = 0.3491; p = 0.048) was a significant predictor of cognitive impairment over a 10-year period.
Conclusions and Relevance High variability in longitudinal sleep duration was significantly associated with the incidence of cognitive impairment and predictive of decline in cognitive performance ten years later. These data highlight that instability in longitudinal sleep duration may contribute to age-related cognitive decline.
4
Timing matters: Sex differences in inflammatory and behavioral outcomes following repetitive blast mild traumatic brain injury. Brain Behav Immun 2023; 110:222-236. [PMID: 36907289 PMCID: PMC10106404 DOI: 10.1016/j.bbi.2023.03.003]
Abstract
BACKGROUND Repetitive blast-related mild traumatic brain injury (mTBI) caused by exposure to high explosives is increasingly common among warfighters as well as civilians. Although women have been serving in military positions with increased risk of blast exposure since 2016, there are few published reports examining sex as a biological variable in models of blast mTBI, greatly limiting diagnosis and treatment capabilities. As such, here we examined outcomes of repetitive blast trauma in female and male mice in relation to potential behavioral, inflammatory, microbiome, and vascular dysfunction at multiple timepoints. METHODS In this study we utilized a well-established blast overpressure model to induce repetitive (3×) blast-mTBI in both female and male mice. Acutely following repetitive exposure, we measured serum and brain cytokine levels, blood-brain barrier (BBB) disruption, fecal microbial abundance, and locomotion and anxiety-like behavior in the open field assay. At the one-month timepoint, we assessed in female and male mice the behavioral correlates of mTBI- and PTSD-related symptoms commonly reported by Veterans with a history of blast-mTBI, using the elevated zero maze, acoustic startle, and conditioned odorant aversion paradigms. RESULTS Repetitive blast exposure resulted in both similar (e.g., increased IL-6) and disparate (e.g., IL-10 increase only in females) patterns of acute serum and brain cytokine as well as gut microbiome changes in female and male mice. Acute BBB disruption following repetitive blast exposure was apparent in both sexes. While female and male blast mice both exhibited acute locomotor and anxiety-like deficits in the open field assay, only male mice exhibited adverse behavioral outcomes that lasted at least one month. DISCUSSION Representing a novel survey of potential sex differences following repetitive blast trauma, our results demonstrate similar yet divergent patterns of blast-induced dysfunction in female vs. male mice and highlight novel targets for future diagnosis and therapeutic development.
5
Poorer prospective memory performance is associated with reduced time monitoring among OEF/OIF/OND Veterans with a history of blast-related mild traumatic brain injury. Clin Neuropsychol 2023; 37:577-594. [PMID: 35689397 DOI: 10.1080/13854046.2022.2068455]
Abstract
Objective: Prospective memory (PM), or "remembering to remember," has been shown to be reduced in Veterans with histories of mild traumatic brain injury (mTBI), particularly on tasks with high strategic demands such as recalling time-based information in the absence of external cues. This study examined whether time monitoring during a PM task was reduced in Veterans with a history of mTBI and was associated with time-based PM performance. Method: Veterans with a history of mTBI (n = 49) and Veterans without a history of TBI (n = 16) completed the Memory for Intentions Screening Test (MIST) as a measure of PM, during which their time monitoring (i.e., number of clock checks) was recorded. Results: Adjusting for age, education, depression, and PTSD symptoms, negative binomial regression revealed that the mTBI group checked the clock less frequently compared to the control group (Cohen's d = 0.84, p = 0.005). Within the mTBI group, less frequent time monitoring across the entire MIST task was associated with poorer time-based MIST performance (rs = .57, p < 0.001), but not with event-based MIST performance (rs = .04, p = 0.768). Conclusions: Veterans with a history of mTBI evidenced significantly reduced time monitoring during a PM task compared to Veterans without a history of mTBI, which was associated with strategically demanding PM. Current findings suggest that mTBI-associated difficulties with strategic aspects of PM may be due to reduced time monitoring. Future studies are needed to determine if reduced time monitoring also contributes to mTBI-associated PM difficulties in the real world (e.g., medication non-adherence).
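The within-group associations above are Spearman rank correlations (rs), i.e., a Pearson correlation computed on ranks. A minimal pure-Python version; the clock-check counts and MIST scores below are invented for illustration, not study data:

```python
def rank(values):
    # Assign 1-based ranks, averaging ranks within tie groups.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Spearman's rho: Pearson correlation of the two rank vectors.
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical data: clock checks during the MIST vs. time-based PM score.
clock_checks = [2, 5, 8, 3, 9, 1, 6, 7]
mist_scores  = [4, 6, 9, 5, 10, 3, 8, 7]
print(f"rho = {spearman(clock_checks, mist_scores):.2f}")
```

Rank-based correlation is the natural choice here because clock-check counts are skewed and the relationship need only be monotonic, not linear.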
6
Sexually dimorphic development of the mesolimbic dopamine system is associated with nuanced sensitivity to adolescent alcohol use. Front Behav Neurosci 2023; 17:1124979. [PMID: 36910128 PMCID: PMC9992416 DOI: 10.3389/fnbeh.2023.1124979]
Abstract
Alcohol use remains a major public health concern and is especially prevalent during adolescence. Adolescent alcohol use has been linked to several behavioral abnormalities in later life, including increased risk taking and impulsivity. Accordingly, when modeled in animals, male rats that had moderate alcohol consumption during adolescence exhibit multiple effects in adulthood, including increased risk taking, altered incentive learning, and greater release of dopamine in the mesolimbic pathway. It has been proposed that alcohol arrests neural development, "locking in" adolescent physiological, and consequent behavioral, phenotypes. Here we examined the possibility that the elevated dopamine levels following adolescent alcohol exposure are a "locked in" phenotype by testing mesolimbic dopamine release across adolescent development. We found that in male rats, dopamine release peaks in late adolescence, returning to lower levels in adulthood, consistent with the notion that high dopamine levels in adolescence-alcohol-exposed adults were due to arrested development. Surprisingly, dopamine release in females was stable across the tested developmental window. This result raised a quandary: arrested dopamine levels would not differ from normal development in females and, therefore, may not contribute to pathological behavior. However, the aforementioned findings related to risk-based decision-making have only been obtained in male subjects. When we tested females that had undergone adolescent alcohol use, we found that neither risk attitude during probabilistic decision-making nor mesolimbic dopamine release was altered. These findings suggest that different developmental profiles of the mesolimbic dopamine system across sexes result in dimorphic susceptibility to alcohol-induced cognitive and motivational anomalies.
7
The dynorphin/kappa opioid receptor mediates adverse immunological and behavioral outcomes induced by repetitive blast trauma. J Neuroinflammation 2022; 19:288. [PMID: 36463243 PMCID: PMC9719647 DOI: 10.1186/s12974-022-02643-3]
Abstract
BACKGROUND Adverse pathophysiological and behavioral outcomes related to mild traumatic brain injury (mTBI), posttraumatic stress disorder (PTSD), and chronic pain are common following blast exposure and contribute to decreased quality of life, but underlying mechanisms and prophylactic/treatment options remain limited. The dynorphin/kappa opioid receptor (KOR) system helps regulate behavioral and inflammatory responses to stress and injury; however, it has yet to be investigated as a potential mechanism in either humans or animals exposed to blast. We hypothesized that blast-induced KOR activation mediates adverse outcomes related to inflammation and affective behavioral response. METHODS C57Bl/6 adult male mice were singly or repeatedly exposed to either sham (anesthesia only) or blast delivered by a pneumatic shock tube. The selective KOR antagonist norBNI or vehicle (saline) was administered 72 h prior to repetitive blast or sham exposure. Serum and brain were collected 10 min or 4 h post-exposure for dynorphin A-like immunoreactivity and cytokine measurements, respectively. At 1-month post-exposure, mice were tested in a series of behavioral assays related to adverse outcomes reported by humans with blast trauma. RESULTS Repetitive but not single blast exposure resulted in increased brain dynorphin A-like immunoreactivity. norBNI pretreatment blocked or significantly reduced blast-induced increase in serum and brain cytokines, including IL-6, at 4 h post exposure and aversive/anxiety-like behavioral dysfunction at 1-month post-exposure. CONCLUSIONS Our findings demonstrate a previously unreported role for the dynorphin/KOR system as a mediator of biochemical and behavioral dysfunction following repetitive blast exposure and highlight this system as a potential prophylactic/therapeutic treatment target.
8
Repetitive Blast Exposure Increases Appetitive Motivation and Behavioral Inflexibility in Male Mice. Front Behav Neurosci 2022; 15:792648. [PMID: 35002648 PMCID: PMC8727531 DOI: 10.3389/fnbeh.2021.792648]
Abstract
Blast exposure (via detonation of high explosives) represents a major potential trauma source for Servicemembers and Veterans, often resulting in mild traumatic brain injury (mTBI). Executive dysfunction (e.g., alterations in memory, deficits in mental flexibility, difficulty with adaptability) is commonly reported by Veterans with a history of blast-related mTBI, leading to impaired daily functioning and decreased quality of life, but underlying mechanisms are not fully understood and have not been well studied in animal models of blast. To investigate potential underlying behavioral mechanisms contributing to deficits in executive functioning post-blast mTBI, here we examined how a history of repetitive blast exposure in male mice affects anxiety/compulsivity-like outcomes and appetitive goal-directed behavior using an established mouse model of blast mTBI. We hypothesized that repetitive blast exposure in male mice would result in anxiety/compulsivity-like outcomes and corresponding performance deficits in operant-based reward learning and behavioral flexibility paradigms. Instead, results demonstrate an increase in reward-seeking and goal-directed behavior and a congruent decrease in behavioral flexibility. We also report chronic adverse behavioral changes related to anxiety, compulsivity, and hyperarousal. In combination, these data suggest that potential deficits in executive function following blast mTBI are at least in part related to enhanced compulsivity/hyperreactivity and behavioral inflexibility and not simply due to a lack of motivation or inability to acquire task parameters, with important implications for subsequent diagnosis and treatment management.
9
Catecholaminergic Innervation of the Lateral Nucleus of the Cerebellum Modulates Cognitive Behaviors. J Neurosci 2021; 41:3512-3530. [PMID: 33536201 PMCID: PMC8051686 DOI: 10.1523/jneurosci.2406-20.2021]
Abstract
The cerebellum processes neural signals related to rewarding and aversive stimuli, suggesting that the cerebellum supports nonmotor functions in cognitive and emotional domains. Catecholamines are a class of neuromodulatory neurotransmitters well known for encoding such salient stimuli. Catecholaminergic modulation of classical cerebellar functions has been demonstrated; however, a role for cerebellar catecholamines in modulating cerebellar nonmotor functions is unknown. Using biochemical methods in male mice, we comprehensively mapped TH+ fibers throughout the entire cerebellum and known precerebellar nuclei. Using electrochemical (fast-scan cyclic voltammetry) and viral/genetic methods to selectively delete Th in fibers innervating the lateral cerebellar nucleus (LCN), we interrogated the sources and functional roles of catecholamines innervating the LCN, which is known for its role in supporting cognition. The LCN has the most TH+ fibers in the cerebellum, as well as the most change in rostrocaudal expression among the cerebellar nuclei. Norepinephrine is the major catecholamine measured in the LCN. Distinct catecholaminergic projections to the LCN arise only from the locus coeruleus and from a subset of Purkinje cells positive for TH staining. LC stimulation was sufficient to produce catecholamine release in the LCN. Deletion of Th in fibers innervating the LCN (LCN-Th-cKO mice) resulted in impaired sensorimotor integration, associative fear learning, response inhibition, and working memory. Strikingly, selective inhibition of excitatory LCN output neurons with an inhibitory designer receptor exclusively activated by designer drugs (DREADD) led to facilitation of learning on the same working memory task impaired in LCN-Th-cKO mice. Collectively, these data demonstrate a role for LCN catecholamines in cognitive behaviors. SIGNIFICANCE STATEMENT Here, we report on interrogating the sources and functional roles of catecholamines innervating the lateral nucleus of the cerebellum (LCN). We map and quantify expression of TH, the rate-limiting enzyme in catecholamine synthesis, in the entire cerebellar system, including several precerebellar nuclei. We used cyclic voltammetry and pharmacology to demonstrate the sufficiency of LC stimulation to produce catecholamine release in the LCN. We used advanced viral techniques to map and selectively knock out catecholaminergic neurotransmission to the LCN, and characterized significant cognitive deficits related to this manipulation. Finally, we show that inhibition of excitatory LCN neurons with a DREADD, designed to mimic Gi-coupled catecholamine GPCR signaling, results in facilitation of a working memory task impaired in LCN-specific TH KO mice.
10
Repetitive blast mild traumatic brain injury increases ethanol sensitivity in male mice and risky drinking behavior in male combat veterans. Alcohol Clin Exp Res 2021; 45:1051-1064. [PMID: 33760264 DOI: 10.1111/acer.14605]
Abstract
BACKGROUND Mild traumatic brain injury (mTBI) is common in civilians and highly prevalent among military service members. mTBI can increase health risk behaviors (e.g., sensation seeking, impulsivity) and addiction risk (e.g., for alcohol use disorder (AUD)), but how mTBI and substance use might interact to promote addiction risk remains poorly understood. Likewise, potential differences in single vs. repetitive mTBI in relation to alcohol use/abuse have not been previously examined. METHODS Here, we examined how a history of single (1×) or repetitive (3×) blast exposure (blast-mTBI) affects ethanol (EtOH)-induced behavioral and physiological outcomes using an established mouse model of blast-mTBI. To investigate potential translational relevance, we also examined self-report responses to the Alcohol Use Disorders Identification Test-Consumption questions (AUDIT-C), a widely used measure to identify potential hazardous drinking and AUD, and used a novel unsupervised machine learning approach to investigate whether a history of blast-mTBI affected drinking behaviors in Iraq/Afghanistan Veterans. RESULTS Both single and repetitive blast-mTBI in mice increased the sedative properties of EtOH (with no change in tolerance or metabolism), but only repetitive blast potentiated EtOH-induced locomotor stimulation and shifted EtOH intake patterns. Specifically, mice exposed to repetitive blasts showed increased consumption "front-loading" (e.g., a higher rate of consumption during an initial 2-h acute phase of a 24-h alcohol access period and decreased total daily intake) during an intermittent 2-bottle choice condition. 
Examination of AUDIT-C scores in Iraq/Afghanistan Veterans revealed an optimal 3-cluster solution: "low" (low intake and low frequency), "frequent" (low intake and high frequency), and "risky" (high intake and high frequency), where Veterans with a history of blast-mTBI displayed a shift in cluster assignment from "frequent" to "risky," as compared to Veterans who were deployed to Iraq/Afghanistan but had no lifetime history of TBI. CONCLUSIONS Together, these results offer new insight into how blast-mTBI may increase AUD risk and highlight the increased potential for adverse health risk behaviors following repetitive blast-mTBI.
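The 3-cluster idea can be sketched with a plain k-means pass over (intake, frequency) pairs. The abstract does not specify the actual unsupervised method or the AUDIT-C scaling, so everything below (the data, the initial centroids, and k = 3 via k-means) is an illustrative assumption only:

```python
import random

def kmeans(points, init, iters=100):
    # Plain 2-D k-means: assign each point to its nearest centroid,
    # recompute centroids as cluster means, repeat until stable.
    centroids = list(init)
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            j = min(
                range(len(centroids)),
                key=lambda c: (p[0] - centroids[c][0]) ** 2
                + (p[1] - centroids[c][1]) ** 2,
            )
            clusters[j].append(p)
        updated = [
            tuple(sum(coord) / len(c) for coord in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
        if updated == centroids:
            break
        centroids = updated
    return centroids, clusters

rng = random.Random(0)
# Invented (intake, frequency) pairs loosely mimicking the three profiles.
low      = [(rng.gauss(1, 0.3), rng.gauss(1, 0.3)) for _ in range(20)]
frequent = [(rng.gauss(1, 0.3), rng.gauss(4, 0.3)) for _ in range(20)]
risky    = [(rng.gauss(4, 0.3), rng.gauss(4, 0.3)) for _ in range(20)]

# Seed with rough initial guesses so each profile gets its own centroid.
centroids, clusters = kmeans(low + frequent + risky, init=[(1, 1), (1, 4), (4, 4)])
for label, c, members in zip(["low", "frequent", "risky"], centroids, clusters):
    print(f"{label}: centroid ({c[0]:.1f}, {c[1]:.1f}), n = {len(members)}")
```

In a real analysis the number of clusters would be chosen by a criterion such as silhouette score rather than fixed in advance.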
11
Repetitive Blast Promotes Chronic Aversion to Neutral Cues Encountered in the Peri-Blast Environment. J Neurotrauma 2020; 38:940-948. [PMID: 33138684 DOI: 10.1089/neu.2020.7061]
Abstract
Repetitive mild traumatic brain injury (mTBI) has been called the "signature injury" of military service members in the Iraq and Afghanistan wars and is highly comorbid with post-traumatic stress disorder (PTSD). Correct attribution of adverse outcomes to blast-induced mTBI and/or PTSD remains challenging. Pre-clinical research using animal models can provide important insight into the mechanisms by which blast produces injury and dysfunction, but only to the degree to which such models reflect the human experience. Avoidance of trauma reminders is a hallmark of PTSD. Here, we sought to understand whether a mouse model of blast reproduces this phenomenon, in addition to blast-induced physical injuries. Drawing on well-established work from the chronic stress and Pavlovian conditioning literature, we hypothesized that even while an animal is anesthetized during blast exposure, environmental cues encountered in the peri-blast environment could be conditioned to evoke aversion/dysphoria and re-experiencing of traumatic stress. Using a pneumatic shock tube that recapitulates battlefield-relevant open-field blast forces, we provide direct evidence that stress is inherent to repetitive blast exposure, resulting in chronic aversive/dysphoric-like responses to previously blast-paired cues. The results in this report demonstrate that, although both single and repetitive blast exposures produce acute stress responses (weight loss, corticosterone increase), only repetitive blast exposure also results in co-occurring aversive/dysphoric-like stress responses. These results extend appreciation of the highly complex nature of repetitive blast exposure and lend further support for the potential translational relevance of animal modeling approaches currently used by multiple laboratories aimed at elucidating the mechanisms (both molecular and behavioral) of repetitive blast exposure.
12
Chronic elevation of plasma vascular endothelial growth factor-A (VEGF-A) is associated with a history of blast exposure. J Neurol Sci 2020; 417:117049. [PMID: 32758764 PMCID: PMC7492467 DOI: 10.1016/j.jns.2020.117049]
Abstract
Mounting evidence points to the significance of neurovascular-related dysfunction in veterans with blast-related mTBI, which is also associated with reduced [18F]-fluorodeoxyglucose (FDG) uptake. The goal of this study was to determine whether plasma VEGF-A is altered in veterans with blast-related mTBI and to address whether VEGF-A levels correlate with FDG uptake in the cerebellum, a brain region that is vulnerable to blast-related injury. Seventy-two veterans with blast-related mTBI (mTBI) and 24 deployed control (DC) veterans with no lifetime history of TBI were studied. Plasma VEGF-A was significantly elevated in mTBIs compared to DCs. Plasma VEGF-A levels in mTBIs were significantly negatively correlated with FDG uptake in the cerebellum. In addition, performance on a Stroop color/word interference task was inversely correlated with plasma VEGF-A levels in blast mTBI veterans. Finally, we observed aberrant perivascular VEGF-A immunoreactivity in postmortem cerebellar tissue, but not in cortical or hippocampal tissues, from blast mTBI veterans. These findings add to the limited number of plasma proteins that are chronically elevated in veterans with a history of blast exposure associated with mTBI. It is likely the elevated VEGF-A levels are from peripheral sources. Nonetheless, increasing plasma VEGF-A concentrations correlated with chronically decreased cerebellar glucose metabolism and poorer performance on tasks involving cognitive inhibition and set shifting. These results strengthen an emerging view that cognitive complaints and functional brain deficits caused by blast exposure are associated with chronic blood-brain barrier injury and prolonged recovery in affected regions.
|
13
|
Maladaptive Decision Making in Adults with a History of Adolescent Alcohol Use, in a Preclinical Model, Is Attributable to the Compromised Assignment of Incentive Value during Stimulus-Reward Learning. Front Behav Neurosci 2017; 11:134. [PMID: 28790900] [PMCID: PMC5524919] [DOI: 10.3389/fnbeh.2017.00134]
Abstract
According to recent WHO reports, alcohol remains the number one substance used and abused by adolescents, despite public health efforts to curb its use. Adolescence is a critical period of biological maturation where brain development, particularly the mesocorticolimbic dopamine system, undergoes substantial remodeling. These circuits are implicated in complex decision making, incentive learning and reinforcement during substance use and abuse. An appealing theoretical approach has been to suggest that alcohol alters the normal development of these processes to promote deficits in reinforcement learning and decision making, which together make individuals vulnerable to developing substance use disorders in adulthood. Previously we have used a preclinical model of voluntary alcohol intake in rats to show that use in adolescence promotes risky decision making in adulthood that is mirrored by selective perturbations in dopamine network dynamics. Further, we have demonstrated that incentive learning processes in adulthood are also altered by adolescent alcohol use, again mirrored by changes in cue-evoked dopamine signaling. Indeed, we have proposed that these two processes, risk-based decision making and incentive learning, are fundamentally linked through dysfunction of midbrain circuitry where inputs to the dopamine system are disrupted by adolescent alcohol use. Here, we test the behavioral predictions of this model in rats and present the findings in the context of the prevailing literature with reference to the long-term consequences of early-life substance use on the vulnerability to develop substance use disorders. We utilize an impulsive choice task to assess the selectivity of alcohol’s effect on decision-making profiles and conditioned reinforcement to parse out the effect of incentive value attribution, one mechanism of incentive learning. 
Finally, we use the differential reinforcement of low rates of responding (DRL) task to examine the degree to which behavioral disinhibition may contribute to an overall decision-making profile. The findings presented here support the proposition that early life alcohol use selectively alters risk-based choice behavior through modulation of incentive learning processes, both of which may be inexorably linked through perturbations in mesolimbic circuitry and may serve as fundamental vulnerabilities to the development of substance use disorders.
|
14
|
Blast-related disinhibition and risk seeking in mice and combat Veterans: Potential role for dysfunctional phasic dopamine release. Neurobiol Dis 2017; 106:23-34. [PMID: 28619545] [DOI: 10.1016/j.nbd.2017.06.004]
Abstract
Mild traumatic brain injury (mTBI) caused by exposure to high explosives has been called the "signature injury" of the wars in Iraq and Afghanistan. A wide array of chronic neurological and behavioral symptoms is associated with blast-induced mTBI; however, the underlying mechanisms are not well understood. Here we used a battlefield-relevant mouse model of blast-induced mTBI and in vivo fast-scan cyclic voltammetry (FSCV) to investigate whether the mesolimbic dopamine system contributes to the mechanisms underlying blast-induced behavioral dysfunction. In mice, blast exposure increased novelty seeking, a behavior closely associated with disinhibition and risk for subsequent maladaptive behaviors. In keeping with this, we found that veterans with blast-related mTBI reported greater disinhibition and risk taking on the Frontal Systems Behavior Scale (FrSBe). In addition, we report that in mice blast exposure potentiates evoked phasic dopamine release in the nucleus accumbens. Taken together, these findings suggest that blast-induced changes in the dopaminergic system may mediate aspects of the complex array of behavioral dysfunctions reported in blast-exposed veterans.
|
15
|
Reversal of Alcohol-Induced Dysregulation in Dopamine Network Dynamics May Rescue Maladaptive Decision-making. J Neurosci 2016; 36:3698-708. [PMID: 27030756] [PMCID: PMC4812130] [DOI: 10.1523/jneurosci.4394-15.2016]
Abstract
Alcohol is the most commonly abused substance among adolescents, promoting the development of substance use disorders and compromised decision-making in adulthood. We have previously demonstrated, with a preclinical model in rodents, that adolescent alcohol use results in adult risk-taking behavior that positively correlates with phasic dopamine transmission in response to risky options, but the underlying mechanisms remain unknown. Here, we show that adolescent alcohol use may produce maladaptive decision-making through a disruption in dopamine network dynamics via increased GABAergic transmission within the ventral tegmental area (VTA). Indeed, we find that increased phasic dopamine signaling after adolescent alcohol use is attributable to a midbrain circuit, including the input from the pedunculopontine tegmentum to the VTA. Moreover, we demonstrate that VTA dopamine neurons from adult rats exhibit enhanced IPSCs after adolescent alcohol exposure corresponding to decreased basal dopamine levels in adulthood that negatively correlate with risk-taking. Building on these findings, we develop a model where increased inhibitory tone on dopamine neurons leads to a persistent decrease in tonic dopamine levels and results in a potentiation of stimulus-evoked phasic dopamine release that may drive risky choice behavior. Based on this model, we take a pharmacological approach to the reversal of risk-taking behavior through normalization of this pattern in dopamine transmission. These results isolate the underlying circuitry involved in alcohol-induced maladaptive decision-making and identify a novel therapeutic target. SIGNIFICANCE STATEMENT One of the primary problems resulting from chronic alcohol use is persistent, maladaptive decision-making that is associated with ongoing addiction vulnerability and relapse. 
Indeed, studies with the Iowa Gambling Task, a standard measure of risk-based decision-making, have reliably shown that alcohol-dependent individuals make riskier, more maladaptive choices than nondependent individuals, even after periods of prolonged abstinence. Using a preclinical model, in the current work, we identify a selective disruption in dopamine network dynamics that may promote maladaptive decision-making after chronic adolescent alcohol use and demonstrate its pharmacological reversal in adulthood. Together, these results highlight a novel neural mechanism underlying heightened risk-taking behavior in alcohol-dependent individuals and provide a potential therapeutic target for further investigation.
|
16
|
Chronic alcohol intake during adolescence, but not adulthood, promotes persistent deficits in risk-based decision making. Alcohol Clin Exp Res 2014; 38:1622-9. [PMID: 24689661] [DOI: 10.1111/acer.12404]
Abstract
BACKGROUND Adolescent alcohol use is a major public health concern and is strongly correlated with the development of alcohol abuse problems in adulthood. Adolescence is characterized by maturation and remodeling of brain regions implicated in decision making and therefore may be uniquely vulnerable to environmental insults such as alcohol exposure. We have previously demonstrated that voluntary alcohol consumption in adolescence results in maladaptive risk-based decision making in adulthood. However, it is unclear whether this effect on risk-based decision making can be attributed to chronic alcohol use in general or to a selective effect of alcohol use during the adolescent period. METHODS Ethanol (EtOH) was presented to adolescent (postnatal day [PND] 30 to 49) and adult rats (PND 80 to 99) for 20 days, for either 24 or 1 h/d, in a gel matrix consisting of distilled water, gelatin, polycose (10%), and EtOH (10%). The 24-hour time course of EtOH intake was measured and compared between adolescent and adult animals. Following 20 days of withdrawal from EtOH, we assessed risk-based decision making with a concurrent instrumental probability-discounting task. Blood EtOH concentrations (BECs) were measured in trunk blood using the Analox micro-stat GM7 in separate groups of animals at different time points. RESULTS Unlike animals exposed to EtOH during adolescence, animals exposed to alcohol during adulthood did not display differences in risk preference compared to controls. Adolescent and adult rats displayed similar EtOH intake levels and patterns when given either 24- or 1-hour access per day. In addition, while both groups reached significant BEC levels, we failed to find a difference between adult and adolescent animals. CONCLUSIONS Here, we show that adolescent, but not adult, EtOH intake leads to a persistent increase in risk preference that cannot be attributed to differences in intake levels or BECs attained.
Our findings support previous work implicating adolescence as a time period of heightened susceptibility to the long-term negative effects of alcohol exposure.
|
17
|
Long-term effects of neonatal stress on adult conditioned place preference (CPP) and hippocampal neurogenesis. Behav Brain Res 2011; 227:7-11. [PMID: 22061798] [DOI: 10.1016/j.bbr.2011.10.033]
Abstract
Critically ill preterm infants are often exposed to stressors that may affect neurodevelopment and behavior. We previously reported that exposure of neonatal mice to stressors or morphine impaired adult morphine-rewarded conditioned place preference (CPP) and altered hippocampal gene expression. We now further this line of inquiry by examining both short- and long-term effects of neonatal stress and morphine treatment. Neonatal C57BL/6 mice were treated twice daily from postnatal day (P) 5 to P9 using different combinations of factors. Subsets received saline or morphine injections (2 mg/kg s.c.) or were exposed to our neonatal stress protocol (maternal separation 8 h/d for 5 days plus gavage feedings, with or without hypoxia/hyperoxia). Short-term measures examined on P9 were neuronal Fluoro-Jade B and bromodeoxyuridine staining, along with urine corticosterone concentrations. Long-term measures examined in adult mice (>P60) included cocaine-rewarded CPP learning (with or without injection of the kappa-opioid receptor (KOR) agonist U50,488) and adult hippocampal neurogenesis (PCNA immunolabeling). Neonatal stress (but not morphine) decreased the cocaine-CPP response, and this effect was reversed by KOR stimulation. Both neonatal stress and morphine treatment increased hippocampal neurogenesis in adult mice. We conclude that reduced learning and increased hippocampal neurogenesis are both indicators that neonatal stress desensitized mice and reduced their arousal and stress responsiveness during adult CPP testing. Reconciled with other findings, these data collectively support the stress inoculation hypothesis, whereby early-life stressors prepare animals to tolerate future stress.
|
18
|
Selective p38α MAPK deletion in serotonergic neurons produces stress resilience in models of depression and addiction. Neuron 2011; 71:498-511. [PMID: 21835346] [DOI: 10.1016/j.neuron.2011.06.011]
Abstract
Maladaptive responses to stress adversely affect human behavior, yet the signaling mechanisms underlying stress-responsive behaviors remain poorly understood. Using a conditional gene knockout approach, the α isoform of p38 mitogen-activated protein kinase (MAPK) was selectively inactivated by AAV1-Cre-recombinase infection in specific brain regions or by promoter-driven excision of p38α MAPK in serotonergic neurons (by Slc6a4-Cre or ePet1-Cre) or astrocytes (by Gfap-CreERT2). Social defeat stress produced social avoidance (a model of depression-like behaviors) and reinstatement of cocaine preference (a measure of addiction risk) in wild-type mice, but not in mice having p38α MAPK selectively deleted in serotonin-producing neurons of the dorsal raphe nucleus. Stress-induced activation of p38α MAPK translocated the serotonin transporter to the plasma membrane and increased the rate of transmitter uptake at serotonergic nerve terminals. These findings suggest that stress initiates a cascade of molecular and cellular events in which p38α MAPK induces a hyposerotonergic state underlying depression-like and drug-seeking behaviors.
|
19
|
Behavioral stress may increase the rewarding valence of cocaine-associated cues through a dynorphin/kappa-opioid receptor-mediated mechanism without affecting associative learning or memory retrieval mechanisms. Neuropsychopharmacology 2010; 35:1932-42. [PMID: 20445500] [PMCID: PMC2904851] [DOI: 10.1038/npp.2010.67]
Abstract
Stress exposure increases the risk of addictive drug use in human and animal models of drug addiction by mechanisms that are not completely understood. Mice subjected to repeated forced swim stress (FSS) before cocaine develop significantly greater conditioned place preference (CPP) for the drug-paired chamber than unstressed mice. Analysis of the dose dependency showed that FSS increased both the maximal CPP response and sensitivity to cocaine. To determine whether FSS potentiated CPP by enhancing associative learning mechanisms, mice were conditioned with cocaine in the absence of stress, then challenged after association was complete with the kappa-opioid receptor (KOR) agonist U50,488 or repeated FSS, before preference testing. Mice challenged with U50,488 60 min before CPP preference testing expressed significantly greater cocaine-CPP than saline-challenged mice. Potentiation by U50,488 was dose and time dependent and blocked by the KOR antagonist norbinaltorphimine (norBNI). Similarly, mice subjected to repeated FSS before the final preference test expressed significantly greater cocaine-CPP than unstressed controls, and FSS-induced potentiation was blocked by norBNI. Novel object recognition (NOR) performance was not affected by U50,488 given 60 min before assay, but was impaired when given 15 min before NOR assay, suggesting that KOR activation did not potentiate CPP by facilitating memory retrieval or expression. The results from this study show that the potentiation of cocaine-CPP by KOR activation does not result from an enhancement of associative learning mechanisms and that stress may instead enhance the rewarding valence of cocaine-associated cues by a dynorphin-dependent mechanism.
|