1. In-Person Versus Remote 6-Minute Walk and Incremental Shuttle Walk Distances in Advanced Lung Disease. Respir Care 2024; 69:557-565. [PMID: 38649272 DOI: 10.4187/respcare.11417]
Abstract
BACKGROUND Field-based walk tests conducted remotely may provide an alternative to facility-based assessment of exercise capacity for people with advanced lung disease. This prospective study evaluated the level of agreement in distance walked between a 6-min walk test (6MWT) and an incremental shuttle walk test performed using standard in-person procedures and under varied test conditions and settings. METHODS Adults with advanced lung disease underwent 4 study visits: (i) one in-person standard 6MWT (30-m corridor) and one in-person treadmill 6MWT, (ii) a remote 6MWT in a home setting (10-m corridor), (iii) 2 in-person standard incremental shuttle walk tests (10-m corridor), and (iv) a remote incremental shuttle walk test in a home setting (10-m corridor). A medical-grade oximeter measured heart rate and oxygen saturation before, during, and for 2 min after the tests. RESULTS Twenty-eight participants were included: 23 men (82%); aged 64 (57-67) y; 19 with interstitial lung disease (68%) and 9 with COPD (32%); 26 (93%) used supplemental oxygen (exertional FIO2 0.46 ± 0.1). There was no agreement between the tests. Greater walking distances were achieved with standard testing procedures: in-person 6MWT versus treadmill 6MWT (355 ± 68 m vs 296 ± 97 m; P = .001; n = 28), in-person 6MWT versus remote 6MWT (349 ± 68 m vs 293 ± 84 m; P = .001; n = 24), and in-person incremental shuttle walk test versus remote incremental shuttle walk test (216 ± 62 m vs 195 ± 63 m; P = .03; n = 22). CONCLUSIONS Differences in the distance walked may have resulted from different track lengths, widths, and walking surfaces. This should be considered in test interpretation if tests are repeated under different conditions.
2. Impact of intraoperative therapeutic plasma exchange on bleeding in lung transplantation. J Heart Lung Transplant 2024; 43:414-419. [PMID: 37813131 DOI: 10.1016/j.healun.2023.10.003]
Abstract
BACKGROUND Our program uses a desensitization protocol that includes intraoperative therapeutic plasma exchange (iTPE) for crossmatch-positive lung transplants, which improves access to lung transplant for sensitized candidates while mitigating immunologic risk. Although we have reported excellent outcomes for sensitized patients with this protocol, concern about perioperative bleeding appears to have hindered its broader adoption at other programs. We conducted a retrospective cohort study to quantify the impact of iTPE on perioperative bleeding in lung transplantation. METHODS All first-time lung transplant recipients from 2014 to 2019 who received iTPE were compared to those who did not. Multivariable logistic regression was used to determine the association between iTPE and large-volume perioperative transfusion requirements (≥5 packed red blood cell [pRBC] units within 24 hours of transplant start), adjusted for disease type, transplant type, and extracorporeal membrane oxygenation or cardiopulmonary bypass use. The incidence of hemothorax (requiring reoperation within 7 days of lung transplant) and 30-day posttransplant mortality were compared between the 2 groups using the chi-square test. RESULTS One hundred forty-two patients (16%) received iTPE, and 755 patients (84%) did not. The mean number of perioperative pRBC transfusions was 4.2 among patients who received iTPE and 2.9 among those who did not. iTPE was associated with increased odds of requiring large-volume perioperative transfusion (odds ratio 1.9; 95% confidence interval: 1.2-2.9; p = 0.007) but was not associated with an increased incidence of hemothorax (5% in both groups, p = 0.99) or 30-day posttransplant mortality (3.5% among patients who received iTPE vs 2.1% among those who did not, p = 0.31). CONCLUSIONS This study demonstrates that the use of iTPE in lung transplantation may increase perioperative bleeding, but not to a degree that impacts important posttransplant outcomes.
3. Clinical implications of frailty assessed in hospitalized patients with acute-exacerbation of interstitial lung disease. Chron Respir Dis 2024; 21:14799731241240786. [PMID: 38515270 PMCID: PMC10958799 DOI: 10.1177/14799731241240786]
Abstract
BACKGROUND Approximately 50% of patients with interstitial lung disease (ILD) experience frailty, which remains unexplored in acute exacerbations of ILD (AE-ILD). A better understanding may help with prognostication and resource planning. We evaluated the association of frailty with clinical characteristics, physical function, hospital outcomes, and post-AE-ILD recovery. METHODS Retrospective cohort study of patients admitted with AE-ILD (01/2015-10/2019), with frailty defined as a deficit proportion ≥0.25 on a 30-item cumulative-deficits index. Frail and non-frail patients were compared for pre- and post-hospitalization clinical characteristics, adjusted for age, sex, and ILD diagnosis. One-year mortality, considering transplantation as a competing risk, was analysed adjusting for age, frailty, and Charlson Comorbidity Index (CCI). RESULTS Eighty-nine patients with AE-ILD were admitted (median age: 67 years; 63% idiopathic pulmonary fibrosis). Thirty-one (35%) were frail; frailty was associated with older age, greater CCI, lower 6-min walk distance, and decreased independence pre-hospitalization. Frail patients had more major complications (32% vs 10%, p = .01) and required more multidisciplinary support during hospitalization. Frailty was not associated with 1-year mortality (HR: 0.97, 95% CI: 0.45-2.10) with transplantation as a competing risk. CONCLUSIONS Frailty was associated with reduced exercise capacity, increased comorbidities, and hospital complications. Identifying frailty may highlight those requiring additional multidisciplinary support, but further study is needed to explore whether frailty is modifiable in AE-ILD.
4. Outcomes after flow cytometry crossmatch-positive lung transplants managed with perioperative desensitization. Am J Transplant 2023; 23:1733-1739. [PMID: 37172694 DOI: 10.1016/j.ajt.2023.04.033]
Abstract
Our program previously reported successful outcomes following virtual crossmatch (VXM)-positive lung transplants managed with perioperative desensitization, but our ability to stratify their immunologic risk was limited without flow cytometry crossmatch (FCXM) data before 2014. The aim of this study was to determine allograft and chronic lung allograft dysfunction (CLAD)-free survival following VXM-positive/FCXM-positive lung transplants, which are performed at only a minority of programs because of the high immunologic risk and lack of outcome data. All first-time lung transplant recipients between January 2014 and December 2019 were divided into 3 cohorts: VXM-negative (n = 764), VXM-positive/FCXM-negative (n = 64), and VXM-positive/FCXM-positive (n = 74). Allograft and CLAD-free survival were compared using Kaplan-Meier and multivariable Cox proportional hazards models. Five-year allograft survival was 53% in the VXM-negative cohort, 64% in the VXM-positive/FCXM-negative cohort, and 57% in the VXM-positive/FCXM-positive cohort (P = .7171). Five-year CLAD-free survival was 53% in the VXM-negative cohort, 60% in the VXM-positive/FCXM-negative cohort, and 63% in the VXM-positive/FCXM-positive cohort (P = .8509). This study confirms that allograft and CLAD-free survival of patients who undergo VXM-positive/FCXM-positive lung transplants with the use of our protocol do not differ from those of other lung transplant recipients. Our protocol for VXM-positive lung transplants improves access to transplant for sensitized candidates and mitigates even high immunologic risk.
5. Oesophageal stasis is a risk factor for chronic lung allograft dysfunction and allograft failure in lung transplant recipients. ERJ Open Res 2023; 9:00222-2023. [PMID: 37817870 PMCID: PMC10561084 DOI: 10.1183/23120541.00222-2023]
Abstract
Background Morbidity and mortality in lung transplant recipients are often triggered by recurrent aspiration events, potentiated by oesophageal and gastric disorders. Previous small studies have shown conflicting associations between oesophageal function and the development of chronic lung allograft dysfunction (CLAD). Herein, we sought to investigate the relationship between oesophageal motility disorders and long-term outcomes in a large retrospective cohort of lung transplant recipients. Methods All lung transplant recipients at the Toronto Lung Transplant Program from 2012 to 2018 with available oesophageal manometry testing within the first 7 months post-transplant were included in this study. Patients were categorised according to the Chicago Classification of oesophageal disorders (v3.0). Associations of oesophageal motility disorders with the development of CLAD and allograft failure (defined as death or re-transplantation) were assessed. Results Of 487 patients, 57 (12%) had oesophagogastric junction outflow obstruction (OGJOO) and 47 (10%) had a disorder of peristalsis (eight major, 39 minor). In a multivariable analysis, OGJOO was associated with an increased risk of CLAD (HR 1.71, 95% CI 1.15-2.55, p=0.008) and allograft failure (HR 1.69, 95% CI 1.13-2.53, p=0.01). Major disorders of peristalsis were associated with an increased risk of CLAD (HR 1.55, 95% CI 1.01-2.37, p=0.04) and allograft failure (HR 3.33, 95% CI 1.53-7.25, p=0.002). Minor disorders of peristalsis were not significantly associated with CLAD or allograft failure. Conclusion Lung transplant recipients with oesophageal stasis characterised by OGJOO or major disorders of peristalsis were at an increased risk of adverse long-term outcomes. These findings will help with risk stratification of lung transplant recipients and personalisation of treatment for aspiration prevention.
6. Pulmonary epithelial markers in phenotypes of chronic lung allograft dysfunction. J Heart Lung Transplant 2023; 42:1152-1160. [PMID: 36963446 DOI: 10.1016/j.healun.2023.03.009]
Abstract
BACKGROUND Airway epithelial injury is thought to be a key event in the pathogenesis of chronic lung allograft dysfunction (CLAD). We investigated whether markers of epithelial activity and injury in bronchoalveolar lavage fluid (BAL) correlate with CLAD diagnosis and major CLAD phenotypes: bronchiolitis obliterans syndrome (BOS) vs restrictive allograft syndrome (RAS)-related phenotypes (including RAS, mixed phenotype, and all other patients with RAS-like opacities). METHODS CLAD status and phenotypes were retrospectively determined in a cohort of all consecutive adult, first, bilateral lung transplants performed 2010-2015 with available BAL samples. All patients with RAS-related phenotypes were included and 1:1 matched with BOS patients based on the time from transplant to CLAD onset. Subjects who were CLAD-free for a minimum of 3 years post-transplant were 1:1 matched to CLAD patients and included as controls. Proteins that maintain the barrier function of the airway epithelial mucosa (club cell secretory protein, surfactant protein-D, and the epithelial mucins MUC1, MUC5AC, MUC5B, and MUC16), as well as epithelial cell death markers (M30 and M65, representing epithelial cell apoptosis and overall cell death, respectively), were measured in BAL obtained within 6 months of CLAD onset using a double-sandwich ELISA or a multiplex bead assay. Protein levels were compared using the Mann-Whitney U test. Association between protein levels and graft survival was assessed using Cox proportional hazards models, adjusted for CMV serology mismatch status and CLAD phenotype. RESULTS Fifty-four CLAD patients (27 BOS, 11 RAS, 7 mixed, 9 others with RAS-like opacities) and 23 CLAD-free controls were included. Median BAL levels were significantly higher in patients with CLAD compared to CLAD-free controls for M30 (124.5 vs 88.7 U/L), MUC1 (6.8 vs 3.2 pg/mL), and MUC16 (121.0 vs 30.1 pg/mL). When comparing CLAD phenotypes, M30 was significantly higher in patients with RAS-related phenotypes than BOS (160.9 vs 114.6 U/L). In multivariable models, higher M30 and MUC5B levels were associated with decreased allograft survival after CLAD onset, independent of phenotype (p < 0.05 for all). CONCLUSIONS Airway epithelial mucins and cell death markers are elevated in the BAL of patients with CLAD and can assist in differentiating between CLAD phenotypes and post-CLAD outcomes. Abnormal airway mucin expression and epithelial cell death may be involved in the pathogenesis of CLAD, and their detection may therefore aid in future selection of targeted therapies.
7. Predicting outcomes in lung transplantation: From tea leaves to ChatGPT. J Heart Lung Transplant 2023; 42:905-907. [PMID: 37028775 DOI: 10.1016/j.healun.2023.03.019]
8. Incidence of post-transplant cytomegalovirus viremia in patients receiving lungs after ex vivo lung perfusion. JTCVS OPEN 2023; 14:590-601. [PMID: 37425481 PMCID: PMC10328819 DOI: 10.1016/j.xjon.2023.02.008]
Abstract
Objectives Cytomegalovirus infection after lung transplant is associated with increased morbidity and mortality. Inflammation, infection, and longer ischemic times are important risk factors for cytomegalovirus infection. Ex vivo lung perfusion has helped to successfully increase the use of high-risk donors over the last decade. However, the impact of ex vivo lung perfusion on post-transplant cytomegalovirus infection is unknown. Methods We performed a retrospective analysis of all adult lung transplant recipients from 2010 to 2020. The primary end point was comparison of cytomegalovirus viremia between patients who received ex vivo lung perfusion donor lungs and patients who received non-ex vivo lung perfusion donor lungs. Cytomegalovirus viremia was defined as cytomegalovirus viral load greater than 1000 IU/mL within 2 years post-transplant. Secondary end points were the time from lung transplant to cytomegalovirus viremia, peak cytomegalovirus viral load, and survival. Outcomes were also compared between the different donor recipient cytomegalovirus serostatus matching groups. Results Included were 902 recipients of non-ex vivo lung perfusion lungs and 403 recipients of ex vivo lung perfusion lungs. There was no significant difference in the distribution of the cytomegalovirus serostatus matching groups. A total of 34.6% of patients in the non-ex vivo lung perfusion group developed cytomegalovirus viremia, as did 30.8% in the ex vivo lung perfusion group (P = .17). There was no difference in time to viremia, peak viral loads, or survival when comparing both groups. Likewise, all outcomes were comparable in the non-ex vivo lung perfusion and ex vivo lung perfusion groups within each serostatus matching group. Conclusions The practice of using more injured donor organs via ex vivo lung perfusion has not affected cytomegalovirus viremia rates and severity in lung transplant recipients in our center.
9. Reflux Surgery in Lung Transplantation: A Multicenter Retrospective Study. Ann Thorac Surg 2023; 115:1024-1032. [PMID: 36216086 DOI: 10.1016/j.athoracsur.2022.09.037]
Abstract
BACKGROUND Aspiration has been associated with graft dysfunction after lung transplantation, leading some to advocate for selective use of fundoplication despite minimal data supporting this practice. METHODS We performed a multicenter retrospective study at 4 academic lung transplant centers to determine the association of gastroesophageal reflux disease and fundoplication with bronchiolitis obliterans syndrome and survival using Cox multivariable regression. RESULTS Of 542 patients, 136 (25.1%) underwent fundoplication; 99 (18%) were found to have reflux disease without undergoing fundoplication. Blanking the first year after transplantation, fundoplication was not associated with a benefit in freedom from bronchiolitis obliterans syndrome (hazard ratio [HR], 0.93; 95% CI, 0.58-1.49) or death (HR, 0.97; 95% CI, 0.47-1.99) compared with reflux disease without fundoplication. However, a time-dependent adjusted analysis found a nonsignificant trend toward lower mortality (HR, 0.59; 95% CI, 0.28-1.23; P = .157), bronchiolitis obliterans syndrome (HR, 0.68; 95% CI, 0.42-1.11; P = .126), and combined bronchiolitis obliterans syndrome or death (HR, 0.66; 95% CI, 0.42-1.04; P = .073) in the fundoplication group compared with the gastroesophageal reflux disease group. CONCLUSIONS Although a statistically significant benefit from fundoplication was not demonstrated, likely because of limited sample size, follow-up, and potential for selection bias, a randomized, prospective study is still warranted.
10. Early posttransplant reductions in club cell secretory protein associate with future risk for chronic allograft dysfunction in lung recipients: results from a multicenter study. J Heart Lung Transplant 2023; 42:741-749. [PMID: 36941179 DOI: 10.1016/j.healun.2023.02.1495]
Abstract
BACKGROUND Chronic lung allograft dysfunction (CLAD) increases morbidity and mortality for lung transplant recipients. Club cell secretory protein (CCSP), produced by airway club cells, is reduced in the bronchoalveolar lavage fluid (BALF) of lung recipients with CLAD. We sought to understand the relationship between BALF CCSP and early posttransplant allograft injury and to determine whether early posttransplant BALF CCSP reductions indicate later CLAD risk. METHODS We quantified CCSP and total protein in 1606 BALF samples collected over the first posttransplant year from 392 adult lung recipients at 5 centers. Generalized estimating equation models were used to examine the correlation of allograft histology or infection events with protein-normalized BALF CCSP. We performed multivariable Cox regression to determine the association between a time-dependent binary indicator of normalized BALF CCSP level below the median in the first posttransplant year and development of probable CLAD. RESULTS Normalized BALF CCSP concentrations were 19% to 48% lower among samples corresponding to histological allograft injury as compared with healthy samples. Patients who experienced any occurrence of a normalized BALF CCSP level below the median over the first posttransplant year had a significant increase in probable CLAD risk independent of other factors previously linked to CLAD (adjusted hazard ratio 1.95; p = 0.035). CONCLUSIONS We discovered a threshold for reduced BALF CCSP that discriminates future CLAD risk, supporting the utility of BALF CCSP as a tool for early posttransplant risk stratification. Additionally, our finding that low CCSP is associated with future CLAD underscores a role for club cell injury in CLAD pathobiology.
11. Statin Use May Be Associated With a Lower Risk of Invasive Aspergillosis in Lung Transplant Recipients. Clin Infect Dis 2023; 76:e1379-e1384. [PMID: 35900334 DOI: 10.1093/cid/ciac551]
Abstract
BACKGROUND Statins are competitive inhibitors of 3-hydroxy-3-methylglutaryl coenzyme A (HMG-CoA) reductase, the enzyme that catalyses the conversion of HMG-CoA to mevalonate, a step in the synthesis of cholesterol in humans and ergosterol in fungi. The effect of statin use on the risk of invasive aspergillosis (IA) in lung transplant recipients (LTRs) is not well documented. METHODS This retrospective study included LTRs from 2010 to 2017 who were followed for 1 year post-transplant. Proven or probable IA was diagnosed per ISHLT criteria. We performed a multivariable Cox proportional hazards analysis of the association between IA and statin use (minimum of 2 weeks' duration prior to IA), adjusting for other known IA risk factors. RESULTS We identified 785 LTRs (44% female; mean age 53 years); the most common underlying disease was pulmonary fibrosis (23.8%). In total, 451 LTRs (57%) received statins post-transplant, with atorvastatin the most commonly used (68%). The mean duration of statin use post-transplant was 347 days (interquartile range [IQR]: 305 to 346). Fifty-five LTRs (7%) developed IA in the first year post-transplant; of these, 9 (16.3%) had received a statin before developing IA. In multivariable analysis, statin use was independently associated with a lower risk of IA (SHR 0.30; 95% confidence interval [CI]: 0.14-0.64; P = .002). Statin use was also associated with a lower incidence of post-transplant Aspergillus colonization: 114 (34%) in the no-statin group vs 123 (27%) in the statin group (P = .038). CONCLUSIONS Statin use for a minimum of 2 weeks during the first year post-transplant was associated with a 70% lower risk of IA in LTRs.
12. Evaluation of Frailty Measures and Short-term Outcomes After Lung Transplantation. Chest 2023:S0012-3692(23)00121-6. [PMID: 36681147 DOI: 10.1016/j.chest.2023.01.017]
Abstract
BACKGROUND Frailty, measured as a single construct, is variably associated with poor outcomes before and after lung transplantation. The usefulness of a comprehensive frailty assessment before transplantation is unknown. RESEARCH QUESTION How are multiple frailty constructs, including phenotypic and cumulative deficit models, muscle mass, exercise tolerance, and social vulnerabilities, measured before transplantation, associated with short-term outcomes after lung transplantation? STUDY DESIGN AND METHODS We conducted a retrospective cohort study of 515 lung recipients who underwent frailty assessments before transplantation, including the short physical performance battery (SPPB), transplant-specific frailty index (FI), 6-min walk distance (6MWD), thoracic sarcopenia, and social vulnerability indexes. We tested the association between frailty measures before transplantation and outcomes after transplantation using logistic regression to model 1-year survival and zero-inflated negative binomial regression to model hospital-free days (HFDs) in the first 90 days after transplantation. Adjustment covariates included age, sex, native lung disease, transplantation type, lung allocation score, BMI, and primary graft dysfunction. RESULTS Before transplantation, 51.3% of patients were frail by FI (FI ≥ 0.25) and no patients were frail by SPPB. In multivariate adjusted models that also included FI, SPPB, and 6MWD, greater frailty by FI, but not SPPB, was associated with fewer HFDs (-0.006 per 0.01 unit worsening; 95% CI, -0.01 to -0.002 per 0.01 unit worsening) among discharged patients. Greater SPPB deficits were associated with decreased odds of 1-year survival (OR, 0.51 per 1 unit worsening; 95% CI, 0.28-0.93 per 1 unit worsening). Correlation among frailty measurements overall was poor. No association was found between thoracic sarcopenia, 6MWD, or social vulnerability assessments and short-term outcomes after lung transplantation. INTERPRETATION Both phenotypic and cumulative deficit models measured before transplantation are associated with short-term outcomes after lung transplantation. Cumulative deficit measures of frailty may be more relevant in the first 90 days after transplantation, whereas phenotypic frailty may have a stronger association with 1-year survival.
13. Experiences and perceptions of receiving and prescribing rehabilitation in adults with cystic fibrosis undergoing lung transplantation. Chron Respir Dis 2023; 20:14799731221139293. [PMID: 36987977 PMCID: PMC10064169 DOI: 10.1177/14799731221139293]
Abstract
BACKGROUND Rehabilitation is prescribed to optimize fitness before lung transplantation (LTx) and facilitate post-transplant recovery. Individuals with cystic fibrosis (CF) may experience unique health issues that impact participation. METHODS Semi-structured interviews were conducted with patients and healthcare providers to explore perceptions and experiences of rehabilitation before and after LTx in adults with CF. Interviews were analyzed via inductive thematic analysis. RESULTS Eleven participants were interviewed between February and October 2021: five patients (median age 28 [IQR 27-29] years; one awaiting re-LTx, four following first or second LTx) and six healthcare providers. Rehabilitation was delivered both in-person and virtually using a remote monitoring App. Six key themes emerged: (i) structured exercise benefits both physical and mental health, (ii) CF-specific physiological impairments were a large barrier, (iii) supportive in-person or virtual relationships facilitated participation, (iv) CF-specific evidence and resources are needed, (v) tele-rehabilitation experiences during the COVID-19 pandemic resulted in preferences for a hybrid model, and (vi) virtual platforms and clinical workflows require further optimization. There was good engagement with remote data entry alongside satisfaction with virtual support. CONCLUSIONS Structured rehabilitation provided multiple benefits, and a hybrid model was preferred going forward. Future optimization of tele-rehabilitation processes and increased evidence to support exercise along the continuum of CF care are needed.
14. Prognostic implications of and clinical risk factors for acute lung injury and organizing pneumonia after lung transplantation: Data from a multicenter prospective cohort study. Am J Transplant 2022; 22:3002-3011. [PMID: 36031951 PMCID: PMC9925227 DOI: 10.1111/ajt.17183]
Abstract
We determined prognostic implications of acute lung injury (ALI) and organizing pneumonia (OP), including timing relative to transplantation, in a multicenter lung recipient cohort. We sought to understand the clinical risks that contribute to the development of ALI/OP. We analyzed prospective, histologic diagnoses of ALI and OP in 4786 lung biopsies from 803 adult lung recipients. Univariable Cox regression was used to evaluate the impact of early (≤90 days) or late (>90 days) posttransplant ALI or OP on risk for chronic lung allograft dysfunction (CLAD) or death/retransplantation. These analyses demonstrated that late ALI/OP conferred a two- to threefold increase in the hazards of CLAD or death/retransplantation; there was no association between early ALI/OP and these outcomes. To determine risk factors for late ALI/OP, we used univariable Cox models considering donor/recipient characteristics and posttransplant events as candidate risks. Grade 3 primary graft dysfunction, higher degree of donor/recipient human leukocyte antigen mismatch, bacterial or viral respiratory infection, and an early ALI/OP event were significantly associated with increased late ALI/OP risk. These data from a contemporary, multicenter cohort underscore the prognostic implications of ALI/OP on lung recipient outcomes, clarify the importance of the timing of these events, and identify clinical risks to target for ALI/OP prevention.
15. Plasma CXCL9 and CXCL10 at allograft injury predict chronic lung allograft dysfunction. Am J Transplant 2022; 22:2169-2179. [PMID: 35634722 PMCID: PMC9427677 DOI: 10.1111/ajt.17108]
Abstract
Histopathologic lung allograft injuries are putative harbingers for chronic lung allograft dysfunction (CLAD). However, the mechanisms responsible are not well understood. CXCL9 and CXCL10 are potent chemoattractants of mononuclear cells and potential propagators of allograft injury. We hypothesized that these chemokines would be quantifiable in plasma and would be associated with subsequent CLAD development. In this prospective multicenter study, we evaluated 721 plasma samples for CXCL9/CXCL10 levels from 184 participants at the time of transbronchial biopsies during their first year post-transplantation. We determined the association between plasma chemokines, histopathologic injury, and CLAD risk using Cox proportional hazards models. We also evaluated CXCL9/CXCL10 levels in bronchoalveolar lavage (BAL) fluid and compared plasma to BAL with respect to CLAD risk. Plasma CXCL9/CXCL10 levels were elevated during the injury patterns associated with CLAD, acute rejection, and acute lung injury, with a dose-response relationship between chemokine levels and CLAD risk. Importantly, there were strong interactions between injury and plasma CXCL9/CXCL10, where histopathologic injury was associated with CLAD only in the presence of elevated plasma chemokines. We observed similar associations and interactions with BAL CXCL9/CXCL10 levels. Elevated plasma CXCL9/CXCL10 during allograft injury may contribute to CLAD pathogenesis and has potential as a minimally invasive immune monitoring biomarker.
16. Utility of bile acids in large airway bronchial wash versus bronchoalveolar lavage as biomarkers of microaspiration in lung transplant recipients: a retrospective cohort study. Respir Res 2022;23:219. PMID: 36028826; PMCID: PMC9419323; DOI: 10.1186/s12931-022-02131-5.
Abstract
Background Bronchoalveolar lavage (BAL) is a key tool in respiratory medicine for sampling the distal airways. BAL bile acids are putative biomarkers of pulmonary microaspiration, which is associated with poor outcomes after lung transplantation. Compared to BAL, large airway bronchial wash (LABW) samples the tracheobronchial space, where bile acids may be measurable at more clinically relevant levels. We assessed whether LABW bile acids, compared to BAL bile acids, are more strongly associated with poor clinical outcomes in lung transplant recipients. Methods Concurrently obtained BAL and LABW samples at 3 months post-transplant from a retrospective cohort of 61 lung transplant recipients were analyzed for taurocholic acid (TCA), glycocholic acid (GCA), and cholic acid by mass spectrometry and for 10 inflammatory proteins by multiplex immunoassay. Associations of bile acids with inflammatory proteins and acute lung allograft dysfunction were assessed using Spearman correlation and logistic regression, respectively. Time to chronic lung allograft dysfunction and time to death were evaluated using multivariable Cox proportional hazards and Kaplan-Meier methods. Results Most bile acids and inflammatory proteins were higher in LABW than in BAL. LABW bile acids correlated with inflammatory proteins within and between sample types. LABW TCA and GCA were associated with acute lung allograft dysfunction (OR = 1.368; 95% CI = 1.036-1.806; P = 0.027 and OR = 1.064; 95% CI = 1.009-1.122; P = 0.022, respectively). No bile acids were associated with chronic lung allograft dysfunction. Adjusted for risk factors, LABW TCA and GCA predicted death (HR = 1.513; 95% CI = 1.014-2.256; P = 0.042 and HR = 1.597; 95% CI = 1.078-2.366; P = 0.020, respectively). Patients with LABW TCA in the highest tertile had worse survival compared to all others.
Conclusions LABW bile acids are more strongly associated than BAL bile acids with inflammation, acute lung allograft dysfunction, and death in lung transplant recipients. Collection of LABW may be useful in the evaluation of microaspiration in lung transplantation and other respiratory diseases.
17. Strongyloides hyper-infection in a lung transplant recipient: Case report and review of the literature. J Assoc Med Microbiol Infect Dis Can 2022;7:150-156. PMID: 36337355; PMCID: PMC9608110; DOI: 10.3138/jammi-2021-0034.
Abstract
CASE PRESENTATION A 63-year-old man with a left single lung transplant for end-stage combined restrictive and obstructive lung disease developed persistent pulmonary infiltrates and recurrent gram-negative bacteremia post-transplant. On day 50 post-transplant, bronchoalveolar lavage fluid revealed a nematode on Papanicolaou staining compatible with Strongyloides stercoralis larvae. Although Strongyloides serology performed post-transplant was negative, a retrospective review of the medical record revealed marked peripheral blood eosinophilia on several occasions before transplantation. Despite reduction in immunosuppression and treatment with albendazole and ivermectin, the patient developed another episode of Escherichia coli bacteremia. He died 3 months post-transplant from pulmonary and neurological complications. DIAGNOSIS Strongyloides hyper-infection. DISCUSSION Strongyloides hyper-infection syndrome is known to occur in immunocompromised patients, but it has only been reported once in a lung transplant recipient. This case illustrates the importance of screening for parasitic infections before transplantation in patients with marked eosinophilia, especially among immigrants from countries in which Strongyloides is endemic. Hyper-infection syndrome may appear years after infection in the context of immunosuppression or immunodeficiency. This case also highlights the association between Strongyloides hyper-infection and bacteremia with enteric organisms.
18. Feasibility of Virtual Assessment of Physical Frailty in Solid Organ Transplant Recipients – A Single Centre, Observational Study. Int J Telerehabil 2022;14:e6447. PMID: 35734387; PMCID: PMC9186907; DOI: 10.5195/ijt.2022.6447.
Abstract
Objectives: To describe the feasibility of virtual assessments of physical frailty in solid organ transplant (SOT) recipients using a modified Fried Frailty Index (mFFI) and the Short Physical Performance Battery (SPPB), and to describe the prevalence of frailty 12 months post-transplant using virtual assessment. Methods: Virtual assessments were performed using an e-questionnaire and a video call for functional tests. Feasibility variables included internet quality, video-call duration, presence of a companion, and adverse events. Results: 34 SOT recipients (median age 62 [46-67] years; 76% lung recipients; 47% female) were included. The video call had a median duration of 12 minutes (10-15 min), without adverse events. A companion was present in 23 (68%) video-call assessments. Fifteen SOT recipients (44%) were classified as pre-frail by the mFFI, and none were frail. Three participants (8.8%) were classified as frail using the SPPB. Conclusion: Virtual frailty assessments can be used as an alternative to in-person assessments in SOT recipients.
19. Ex vivo treatment of cytomegalovirus in human donor lungs using a novel chemokine-based immunotoxin. J Heart Lung Transplant 2022;41:287-297. PMID: 34802874; DOI: 10.1016/j.healun.2021.10.010.
Abstract
BACKGROUND Transmission of latent human cytomegalovirus (HCMV) via organ transplantation with post-transplant viral reactivation is extremely prevalent and results in substantial adverse impact on outcomes. Therapies targeting the latent reservoir within the allograft to mitigate viral transmission would represent a major advance. Here, we delivered an immunotoxin (F49A-FTP) that targets and kills latent HCMV, aiming to reduce the HCMV reservoir in donor lungs using ex-vivo lung perfusion (EVLP). METHODS HCMV-seropositive human lungs were placed on EVLP alone or EVLP + 1 mg/L of F49A-FTP for 6 hours (n = 6, each). CD14+ monocytes isolated from biopsies pre- and post-EVLP underwent an HCMV reactivation assay designed to evaluate viral reactivation capacity. Off-target effects of F49A-FTP were studied by evaluating cell death markers of CD34+ and CD14+ cells using flow cytometry. Lung function on EVLP and inflammatory cytokine production were evaluated as safety endpoints. RESULTS We demonstrate that lungs treated ex vivo with F49A-FTP had a significant reduction in HCMV reactivation compared to controls, suggesting successful targeting of latent virus (76% median reduction with F49A-FTP vs 15% increase in controls, p = 0.0087). Furthermore, cell death rates of the targeted cells were comparable between both groups, suggesting no off-target effects. Ex-vivo lung function was stable over 6 hours and no differences in key inflammatory cytokines were observed, demonstrating the safety of this novel treatment. CONCLUSIONS Ex-vivo F49A-FTP treatment of human lungs targets and kills latent HCMV, markedly attenuating HCMV reactivation. These are the first experiments targeting latent HCMV in a donor organ, with promising results towards clinical translation.
20. Significance of phenotype change after chronic lung allograft dysfunction onset. Transpl Int 2021;34:2620-2632. PMID: 34748217; DOI: 10.1111/tri.14157.
Abstract
Definitions for chronic lung allograft dysfunction (CLAD) phenotypes were recently revised (2019 ISHLT consensus). Post-CLAD-onset phenotype transition may occur as a result of change in obstruction, restriction, or RAS-like opacities (RLO). We aimed to assess the prevalence and prognostic implications of these transitions. This was a single-center, retrospective cohort study of bilateral lung transplants performed in 2009-2015. CLAD phenotypes were determined per ISHLT guidelines. CLAD phenotype transition was defined as a sustained change in obstruction, restriction, or RLO. We specifically focused on phenotype changes based on RLO emergence. The association of RLO development with time to death or retransplant was assessed using Kaplan-Meier and Cox proportional hazards models. Among 211 patients with CLAD, 47 (22.2%) experienced a phenotype transition, and 19 developed RLO. Development of the RLO phenotype after CLAD onset was associated with a shorter time to death/retransplant both in the entire CLAD patient cohort (HR = 4.00, CI 2.74-5.83, P < 0.001) and when the analysis was restricted to patients with a non-RLO phenotype at CLAD onset (HR 9.64, CI 5.52-16.84, P < 0.0001). CLAD phenotype change based on the emergence of RAS-like opacities implies a worse outcome. This highlights the clinical importance of imaging follow-up to monitor for phenotype transitions after CLAD onset.
21. Long-term outcomes of sensitized lung transplant recipients after peri-operative desensitization. Am J Transplant 2021;21:3444-3448. PMID: 34058795; DOI: 10.1111/ajt.16707.
Abstract
The Toronto Lung Transplant Program has used a peri-operative desensitization regimen of plasma exchange, intravenous immune globulin, and antithymocyte globulin since 2008 to safely accept donor-specific antibody (DSA)-positive lung transplants. There are no long-term data on the impact of this practice on allograft survival or the development of chronic lung allograft dysfunction (CLAD). We extended our prior study to include long-term follow-up of 340 patients who received lung transplants between January 1, 2008 and December 31, 2011. We compared allograft survival and CLAD-free survival among patients in three cohorts: DSA-positive, panel reactive antibody (PRA)-positive/DSA-negative, and unsensitized at the time of transplant. The median follow-up time in this extension study was 6.7 years. Among DSA-positive, PRA-positive/DSA-negative, and unsensitized patients, the median allograft survival was 8.4, 7.9, and 5.8 years, respectively (p = .5908), and the median CLAD-free survival was 6.8, 7.3, and 5.7 years, respectively (p = .5448). This follow-up study confirms that long-term allograft survival and CLAD-free survival of patients who undergo DSA-positive lung transplants with the use of our protocol do not differ from those of other lung transplant recipients. Use of protocols such as ours, therefore, may improve access to transplant for sensitized candidates.
22. Correlation between BAL CXCR3 chemokines and lung allograft histopathologies: A multicenter study. Am J Transplant 2021;21:3401-3410. PMID: 33840162; PMCID: PMC8502500; DOI: 10.1111/ajt.16601.
Abstract
The histopathologic diagnosis of acute allograft injury is prognostically important in lung transplantation, with evidence demonstrating a strong and consistent association between acute rejection (AR), acute lung injury (ALI), and the subsequent development of chronic lung allograft dysfunction (CLAD). The pathogenesis of these allograft injuries, however, remains poorly understood. CXCL9 and CXCL10 are CXC chemokines induced by interferon-γ that act as potent chemoattractants of mononuclear cells. We hypothesized that these chemokines are involved in the mononuclear cell recruitment associated with AR and ALI. We further hypothesized that the increased activity of these chemokines could be quantified as increased levels in the bronchoalveolar lavage fluid. In this prospective multicenter study, we evaluated the incidence of histopathologic allograft injury during the first year post-transplant and measured bronchoalveolar CXCL9 and CXCL10 levels at the time of the biopsy. In multivariable models, CXCL9 levels were 1.7-fold and 2.1-fold higher during AR and ALI, respectively, compared with "normal" biopsies without histopathology. Similarly, CXCL10 levels were 1.6-fold and 2.2-fold higher during these histopathologies, respectively. These findings support the association of CXCL9 and CXCL10 with episodes of AR and ALI and provide potential insight into the pathogenesis of these deleterious events.
23. The accuracy of forced vital capacity for diagnosing restrictive allograft syndrome and mixed phenotype of chronic lung allograft dysfunction. Eur Respir J 2021;58:2003387. PMID: 34172465; DOI: 10.1183/13993003.03387-2020.
24. Treatment outcomes of nontuberculous mycobacterial pulmonary disease in lung transplant recipients. Transpl Infect Dis 2021;23:e13679. PMID: 34184393; DOI: 10.1111/tid.13679.
Abstract
BACKGROUND Lung transplant (LTX) recipients are at risk of miscellaneous infections, among which the clinical significance of nontuberculous mycobacteria (NTM) is increasingly recognized. Despite anti-mycobacterial therapy becoming standardized worldwide, there is a lack of data on treatment outcomes in LTX recipients who develop NTM pulmonary disease (NTM-PD). We aimed to review the treatment outcomes of NTM-PD among LTX recipients at our center. METHODS Patients who underwent LTX from January 2013 to December 2014 were consecutively enrolled in the retrospective cohort, with follow-up data retrieved through December 2017. Clinical and radiological improvement and culture conversion after anti-mycobacterial therapy were reviewed in those who developed post-transplant NTM-PD. RESULTS Sixteen of 230 LTX recipients developed post-transplant NTM-PD. Ten of the 16 were treated with macrolide-containing anti-mycobacterial therapy, leading to clinical improvement in 5/10 (50%), radiological improvement in 5/10 (50%), and culture conversion in 6/10 (60%) patients. CONCLUSION Anti-mycobacterial therapy may relieve pulmonary symptoms and reduce microbial load among individuals with post-transplant NTM-PD.
25. Telerehabilitation for Lung Transplant Candidates and Recipients During the COVID-19 Pandemic: Program Evaluation. JMIR Mhealth Uhealth 2021;9:e28708. PMID: 34048354; PMCID: PMC8213059; DOI: 10.2196/28708.
Abstract
BACKGROUND The COVID-19 pandemic resulted in a rapid shift from center-based rehabilitation to telerehabilitation for chronic respiratory disease and lung transplantation due to infection control precautions. Clinical experience with this delivery model on a large scale has not been described. OBJECTIVE The aim of this study is to describe usage and satisfaction of providers and lung transplant (LTx) candidates and recipients and functional outcomes following the broad implementation of telerehabilitation with remote patient monitoring during the first wave of the COVID-19 pandemic. METHODS This study was a program evaluation of providers, LTx candidates, and early LTx recipients who used a web-based, remote monitoring app for at least four weeks between March 16 and September 1, 2020, to participate in telerehabilitation. Within-subjects analysis was performed for physical activity, Self-efficacy For Exercise (SEE) scale score, aerobic and resistance exercise volumes, 6-minute walk test results, and Short Physical Performance Battery (SPPB) results. RESULTS In total, 78 LTx candidates and 33 recipients were included (57 [51%] males, mean age 58 [SD 12] years, 58 [52%] with interstitial lung disease, 34 [31%] with chronic obstructive pulmonary disease). A total of 50 (64%) LTx candidates and 17 (51%) LTx recipients entered ≥10 prescribed exercise sessions into the app during the study time frame. In addition, 35/42 (83%) candidates agreed the app helped prepare them for surgery and 18/21 (85%) recipients found the app helpful in their self-recovery. The strongest barrier perceived by physiotherapists delivering the telerehabilitation was patient access to home exercise and monitoring equipment. Between the time of app registration and ≥4 weeks on the waiting list, 26 LTx candidates used a treadmill, with sessions increasing in mean duration (from 16 to 22 minutes, P=.002) but not speed (from 1.7 to 1.75 mph, P=.31). 
Quadriceps weight (pounds) for leg extension did not change (median 3.5, IQR 2.4-5 versus median 4.3, IQR 3-5; P=.08; n=37). On the Rapid Assessment of Physical Activity questionnaire (RAPA), 57% of LTx candidates scored as active, which improved to 87% (P=.02; n=23). There was a decrease in pretransplant 6-minute walk distance (6MWD) from 346 (SD 84) meters to 307 (SD 85) meters (P=.002; n=45) and no change in the SPPB result (12 [IQR 9.5-12] versus 12 [IQR 10-12]; P=.90; n=42). A total of 9 LTx recipients used a treadmill that increased in speed (from 1.9 to 2.7 mph; P=.003) between hospital discharge and three months posttransplant. Quadriceps weight increased (3 [IQR 0-3] pounds versus 5 [IQR 3.8-6.5] pounds; P<.001; n=15). At three months posttransplant, 76% of LTx recipients scored as active (n=17), with a high total SEE score of 74 (SD 11; n=12). In addition, three months posttransplant, 6MWD was 62% (SD 18%) predicted (n=8). CONCLUSIONS We were able to provide telerehabilitation despite challenges around exercise equipment. This early experience will inform the development of a robust and equitable telerehabilitation model beyond the COVID-19 pandemic.
26. Frailty and aging-associated syndromes in lung transplant candidates and recipients. Am J Transplant 2021;21:2018-2024. PMID: 33296550; PMCID: PMC8178173; DOI: 10.1111/ajt.16439.
Abstract
Many lung transplant candidates and recipients are older and frailer compared to previous eras. Older patients are at increased risk for pre- and posttransplant mortality, but this risk is not explained by numerical age alone. This manuscript represents the product of the American Society of Transplantation (AST) conference on frailty. Experts in the field reviewed the latest published research on assessment of elderly and frail lung transplant candidates. Physical frailty is often defined as slowness, weakness, low physical activity, shrinking, and exhaustion, and frailty evaluation is an important tool for assessing age-associated dysfunction. Another approach is assessment by cumulative deficits, and both types of frailty are common in lung transplant candidates. Frailty is associated with death or delisting before transplant and may be associated with posttransplant mortality. Sarcopenia, cognitive dysfunction, depression, and nutrition are other important components of patient evaluation. Aging-associated inflammation, telomere dysfunction, and adaptive immune system senescence may also contribute to frailty. Developing tools for frailty assessment and interventions holds promise for improving patient outcomes before and after lung transplantation.
27. Bronchoalveolar lavage cytokine-based risk stratification of minimal acute rejection in clinically stable lung transplant recipients. J Heart Lung Transplant 2021;40:1540-1549. PMID: 34215500; DOI: 10.1016/j.healun.2021.05.017.
Abstract
BACKGROUND Acute cellular rejection (ACR) remains the most significant risk factor for chronic lung allograft dysfunction (CLAD). While clinically significant or higher-grade (≥A2) ACR is generally treated with augmented immunosuppression (IS), the management of clinically stable grade A1 ACR remains controversial. At our center, patients with clinically stable grade A1 ACR are routinely not treated with augmented IS. While the overall outcomes in this group of patients at our center are equivalent to those of patients with stable A0 pathology, CLAD and death rates remain high overall. We hypothesized that a distinct cytokine signature at the time of an early minimal rejection state would be associated with worse outcomes. Specifically, we aimed to determine whether bronchoalveolar lavage (BAL) biomarkers at the time of first clinically stable grade A1 ACR (CSA1R) are predictive of subsequent CLAD or death. METHODS Among all adult, bilateral, first lung transplants performed 2010-2016, transbronchial biopsies obtained within the first year post-transplant were categorized as clinically stable or unstable based on the absence or presence of a ≥10% concurrent drop in forced expiratory volume in 1 second (FEV1). We assessed BAL samples obtained at the time of CSA1R episodes that were not preceded by another ACR (i.e., first episodes). Twenty-one proteins previously associated with ACR or CLAD were measured in the BAL using a multiplex bead assay. Associations between protein levels and subsequent CLAD or death were assessed using Cox proportional hazards models, adjusted for relevant peri-transplant clinical covariates. RESULTS We identified 75 patients with a first CSA1R occurring at a median time of 98 days (range 48.5-197) post-transplant. Median time from transplant to CLAD was 1247 days (756.5-1921.5) and to death was 1641 days (1024.5-2326.5).
In multivariable models, levels of MCP1/CCL2, S100A8, IL10, TNF-receptor 1, and pentraxin 3 (PTX3) were associated with both CLAD development and death (p < 0.05 for all). PTX3 remained significantly associated with both CLAD and death after adjusting for multiple comparisons. CONCLUSION Our data indicate that a focused BAL protein signature, with PTX3 having the strongest association, may be useful in identifying a subset of CSA1R patients who are at increased risk and may benefit from a more aggressive management strategy.
28. Lung transplantation for late-onset non-infectious chronic pulmonary complications of allogenic hematopoietic stem cell transplant. Respir Res 2021;22:101. PMID: 33827576; PMCID: PMC8025894; DOI: 10.1186/s12931-021-01699-8.
Abstract
Background Late onset non-infectious pulmonary complications (LONIPCs) following allogenic hematopoietic stem cell transplantation (allo-HSCT) confer a significant mortality risk. Lung transplantation (LTx) has the potential to provide survival benefit but the impact of prior allo-HSCT on post-LTx outcomes is not well studied.
Methods This retrospective, single-centre cohort study assessed the post-LTx outcomes of adults with LONIPCs of allo-HSCT. Outcomes of LTx for LONIPCs were compared to propensity-score matched LTx controls (n = 38, non-HSCT) and recipients of re-LTx (n = 70) for chronic lung allograft dysfunction (CLAD).
Results Nineteen patients underwent double lung transplantation (DLTx) for LONIPCs of allo-HSCT between 2003 and 2019. Post-LTx survival was 50% at 5 years. Survival to 1 year post-LTx was similar to that of matched controls (p = 0.473). Survival, conditional on 1-year survival, was lower in the allo-HSCT cohort (p = 0.034). An increased risk of death due to infection was identified in the allo-HSCT cohort compared to matched controls (p = 0.003). Compared to re-LTx recipients, the allo-HSCT cohort had superior survival to 1 year post-LTx (p = 0.034), but conditional 1-year survival was similar (p = 0.145).
Conclusion This study identifies an increased risk of post-LTx mortality in recipients with previous allo-HSCT, associated with infection. It supports the hypothesis that allo-HSCT LTx recipients are relatively more immunosuppressed than patients undergoing LTx for other indications. Optimisation of post-LTx immunosuppressive and antimicrobial strategies to account for this finding should be considered.
29. IL-6 receptor blockade for allograft dysfunction after lung transplantation in a patient with COPA syndrome. Clin Transl Immunology 2021;10:e1243. PMID: 33537146; PMCID: PMC7843402; DOI: 10.1002/cti2.1243.
Abstract
Objective COPA syndrome is a genetic disorder of retrograde cis-Golgi vesicle transport that leads to upregulation of pro-inflammatory cytokines (mainly IL-1β and IL-6) and the development of interstitial lung disease (ILD). The impact of COPA syndrome on post-lung transplant (LTx) outcome is unknown but potentially detrimental. In this case report, we describe progressive allograft dysfunction following LTx for COPA-ILD. Following the failure of standard immunosuppressive approaches, detailed cytokine analysis was performed with the intention of personalising therapy. Methods Multiplexed cytokine analysis was performed on serum and bronchoalveolar lavage (BAL) fluid obtained pre- and post-LTx. Peripheral blood mononuclear cells (PBMCs) obtained pre- and post-LTx were stimulated with PMA, LPS and anti-CD3/CD28 antibodies. Post-LTx endobronchial biopsies underwent microarray-based gene expression analysis. Results were compared to those of non-COPA LTx recipients and non-LTx healthy controls. Results Multiplexed cytokine analysis showed rising type I/II IFNs and IL-6 in BAL post-LTx that decreased following treatment of acute rejection but rebounded with further clinical deterioration. In vitro stimulation of PBMCs suggested that myeloid cells were driving deterioration through IL-6 signalling pathways. Tocilizumab (an IL-6 receptor antibody) administered for 3 months (4 mg/kg, monthly) effectively suppressed IL-6 levels in BAL. The mucosal gene expression profile following tocilizumab suggested greater similarity to normal. Conclusion Clinical effectiveness of IL-6 receptor blockade was not observed. However, we identified IL-6 upregulation associated with graft injury, effective IL-6 suppression with tocilizumab, and evidence of a beneficial effect on molecular transcripts. This mechanistic analysis suggests a role for IL-6 blockade in post-LTx care that should be investigated further.
30. Eosinophils in transbronchial biopsies: a predictor of chronic lung allograft dysfunction and reduced survival after lung transplantation - a retrospective single-center cohort study. Transpl Int 2020;34:62-75. PMID: 33025592; DOI: 10.1111/tri.13760.
Abstract
Long-term outcomes after lung transplantation remain inferior to those of other solid organ groups. The significance of eosinophils detected on transbronchial biopsies (TBBx) after lung transplantation and their relationship to long-term outcomes remain unknown. A retrospective single-center cohort study was performed of patients transplanted between January 1, 2001, and July 31, 2018, who had at least 1 TBBx with evaluable parenchymal tissue. Multivariable Cox proportional hazards models were used to assess the associations between eosinophil detection and two outcomes: all-cause mortality and chronic lung allograft dysfunction (CLAD). In total, 8887 TBBx reports from 1440 patients were reviewed for the mention of eosinophils in the pathology report, and 112 (7.8%) patients were identified with eosinophils on at least one TBBx. The median (95% CI) survival time for all patients was 8.28 (7.32-9.31) years. Multivariable analysis, adjusted for clinical variables known to affect post-transplant outcomes, showed that the detection of eosinophils was independently associated with an increased risk of death (HR 1.51, 95% CI 1.24-1.85, p < 0.01) and CLAD (HR 1.35, 95% CI 1.07-1.70, p = 0.01). Eosinophils detected in TBBx are associated with an increased risk of CLAD and death. There may be benefit in specifically reporting the presence of eosinophils in TBBx reports and incorporating their presence into clinical decision-making.
31. Pre-transplant short physical performance battery: Response to pre-habilitation and relationship to pre- and early post-lung-transplant outcomes. Clin Transplant 2020;34:e14095. PMID: 32970883; DOI: 10.1111/ctr.14095.
Abstract
PURPOSE To evaluate whether the short physical performance battery (SPPB) pre-lung transplant (LTx) was responsive to pre-habilitation and predicted pre- and early post-transplant outcomes. METHODS A retrospective study of LTx candidates accepted for transplant between 2016 and 2017. SPPB was categorized as frail/pre-frail (≤9/12) and non-frail (≥10/12). RESULTS 150 patients had LTx assessment SPPB data (53% male, 61 [52-67] years, 59% had interstitial lung disease (ILD), 26% frail/pre-frail). 131 (87%) underwent transplant by December 31, 2018. Adjusting for age, sex, diagnosis, and Canadian transplant listing urgency, frailty/pre-frailty at LTx assessment was associated with a lower 6MWD pre-transplant [-89 meters, 95% CI (-125 to -53), P < .0001]. 62 patients underwent six weeks of pre-habilitation. SPPB increased (11 [10-12] vs. 12 [11-12], P = .01), reflected in the chair stand component (11.4 ± 4.4 vs. 9.8 ± 2.8 seconds, P = .007), with larger improvements in the frail/pre-frail group. A frail/pre-frail SPPB closest to the time of transplant was associated with a lower 6MWD [-77 m, 95% CI (-128 to -25), P = .004] but not with hospital length of stay or gait aid use three months post-transplant. CONCLUSIONS Frailty/pre-frailty was associated with a decreased 6MWD pre- and post-transplant. The SPPB increased following pre-habilitation, which may reflect increased lower extremity strength.
|
32
|
Clinical outcomes associated with computed tomography‐based body composition measures in lung transplantation: a systematic review. Transpl Int 2020; 33:1610-1625. [DOI: 10.1111/tri.13749] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2020] [Revised: 04/24/2020] [Accepted: 09/13/2020] [Indexed: 12/16/2022]
|
33
|
Risk Factors for Acute Rejection in the First Year after Lung Transplant. A Multicenter Study. Am J Respir Crit Care Med 2020; 202:576-585. [PMID: 32379979 PMCID: PMC7427399 DOI: 10.1164/rccm.201910-1915oc] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2019] [Accepted: 05/07/2020] [Indexed: 11/16/2022] Open
Abstract
Rationale: Acute rejection, manifesting as lymphocytic inflammation in a perivascular (acute perivascular rejection [AR]) or peribronchiolar (lymphocytic bronchiolitis [LB]) distribution, is common in lung transplant recipients and increases the risk for chronic graft dysfunction. Objectives: To evaluate clinical factors associated with biopsy-proven acute rejection during the first post-transplant year in a present-day, five-center lung transplant cohort. Methods: We analyzed prospective diagnoses of AR and LB from over 2,000 lung biopsies in 400 newly transplanted adult lung recipients. Because LB without simultaneous AR was rare, our analyses focused on risk factors for AR. Multivariable Cox proportional hazards models were used to assess donor and recipient factors associated with the time to the first AR occurrence. Measurements and Main Results: During the first post-transplant year, 53.3% of patients experienced at least one AR episode. Multivariable proportional hazards analyses accounting for enrolling center effects identified four or more HLA mismatches (hazard ratio [HR], 2.06; P ≤ 0.01) as associated with increased AR hazards, whereas bilateral transplantation (HR, 0.57; P ≤ 0.01) was associated with protection from AR. In addition, Wilcoxon rank-sum analyses demonstrated that bilateral (vs. single) lung recipients and those with fewer than four (vs. four or more) HLA mismatches had reduced AR frequency and/or severity during the first post-transplant year. Conclusions: We found a high incidence of AR in a contemporary multicenter lung transplant cohort undergoing consistent biopsy sampling. Although not previously recognized, the finding of reduced AR in bilateral lung recipients is intriguing, warranting replication and mechanistic exploration.
|
34
|
Cell-Mediated Immune Responses After Influenza Vaccination of Solid Organ Transplant Recipients: Secondary Outcomes Analyses of a Randomized Controlled Trial. J Infect Dis 2020; 221:53-62. [PMID: 31550354 DOI: 10.1093/infdis/jiz471] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/05/2019] [Accepted: 09/12/2019] [Indexed: 01/04/2023] Open
Abstract
BACKGROUND Despite annual immunization, solid organ transplant (SOT) patients remain at increased risk for severe influenza infection because of suboptimal vaccine immunogenicity. We aimed to compare the CD4+ and CD8+ T-cell responses of the high-dose (HD) and the standard-dose (SD) trivalent inactivated vaccine. METHODS We collected peripheral blood mononuclear cells pre- and postimmunization from 60 patients enrolled in a randomized trial of HD versus SD vaccine (30 HD; 30 SD) during the 2016-2017 influenza season. RESULTS The HD vaccine elicited significantly greater monofunctional and polyfunctional CD4+ and CD8+ T-cell responses against influenza A/H1N1, A/H3N2, and B. For example, median vaccine-elicited influenza-specific polyfunctional CD4+ T cells were higher in recipients of the HD than SD vaccine after stimulation with influenza A/H1N1 (1193 vs 0 per 106 CD4+ T cells; P = .003), A/H3N2 (1154 vs 51; P = .008), and B (1102 vs 0; P = .001). Likewise, vaccine-elicited influenza-specific polyfunctional CD8+ T cells were higher in recipients of the HD than SD vaccine after stimulation with influenza B (367 vs 0; P = .002). CONCLUSIONS Our study provides novel evidence that the HD vaccine elicits greater cellular responses than the SD vaccine in SOT recipients, supporting preferential use of HD vaccination in this setting.
|
35
|
Risk assessment of chronic lung allograft dysfunction phenotypes: Validation and proposed refinement of the 2019 International Society for Heart and Lung Transplantation classification system. J Heart Lung Transplant 2020; 39:761-770. [DOI: 10.1016/j.healun.2020.04.012] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/12/2019] [Revised: 03/14/2020] [Accepted: 04/12/2020] [Indexed: 12/26/2022] Open
|
36
|
Identifying host microRNAs in bronchoalveolar lavage samples from lung transplant recipients infected with Aspergillus. J Heart Lung Transplant 2020; 39:1228-1237. [PMID: 32771440 DOI: 10.1016/j.healun.2020.07.014] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/09/2020] [Revised: 07/06/2020] [Accepted: 07/17/2020] [Indexed: 02/08/2023] Open
Abstract
BACKGROUND MicroRNAs (miRNAs) are small non-coding RNAs of ∼22 nucleotides that play a crucial role in post-transcriptional regulation of gene expression. Dysregulation of miRNA expression has been shown during microbial infections. We sought to identify miRNAs that distinguish invasive aspergillosis (IA) from non-IA in lung transplant recipients (LTRs). METHODS We used the NanoString nCounter Human miRNA (version 3) panel to measure miRNAs in bronchoalveolar lavage (BAL) samples from LTRs with Aspergillus colonization (ASP group) (n = 10), those with Aspergillus colonization and chronic lung allograft dysfunction (CLAD) (ASPCLAD group) (n = 7), those with IA without CLAD (IA group) (n = 10), those who developed IA with CLAD (IACLAD group) (n = 9), and control patients (controls) (n = 9). The miRNA profiles were compared using a permutation test of 100,000 trials for each of the comparisons. We used mirDIP to obtain their gene targets and pathDIP to determine pathway enrichment. RESULTS We performed pairwise comparisons between patient groups to identify differentially expressed miRNAs. A total of 5 miRNAs were found to be specific to IA, including 4 (miR-145-5p, miR-424-5p, miR-99b-5p, and miR-4488) that were upregulated and the pair (miR-4454 + miR-7975) that was downregulated in the IA group vs controls. The expression change for these miRNAs was specific to patients with IA; they were not significantly differentiated between the IACLAD and IA groups. Signaling pathways associated with an immunologic response to IA were found to be significantly enriched. CONCLUSIONS We report a set of 5 differentially expressed miRNAs in the BAL of LTRs with IA that might help in the development of diagnostic and prognostic tools for IA in LTRs. However, further investigation is needed in a larger cohort to validate the findings.
|
37
|
Chest computed tomography is a valid measure of body composition in individuals with advanced lung disease. Clin Physiol Funct Imaging 2020; 40:360-368. [PMID: 32544296 DOI: 10.1111/cpf.12652] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/26/2019] [Revised: 04/03/2020] [Accepted: 06/07/2020] [Indexed: 01/06/2023]
Abstract
There is growing interest in evaluating body composition using routine clinical computed tomography (CT) scans; however, the validity of this technique in lung transplant patients has not been described. The study objectives were to determine the reliability of measuring fat compartments from thoracic CT and evaluate the validity of muscle and fat cross-sectional area (CSA) from thoracic CT by comparing to bioelectrical impedance analysis (BIA). Thoracic CT scans from lung transplant assessments were obtained for analysis. Total thoracic muscle CSA, pectoral muscle CSA, subcutaneous adipose tissue (SAT), and mediastinal adipose tissue (MAT) were manually segmented by two independent raters. Reliability was analysed using intra-class correlation coefficient (ICC). Correlations were determined between CT measures with fat-free mass index (FFMI), body fat mass index (BFMI) and per cent body fat (%BF) from BIA; and anthropometrics [body mass index (BMI) and waist circumference (WC)]. High inter- and intra-rater reliability were found for SAT and MAT (ICCs = 0.99). Pectoral and total muscle CSA were correlated with FFMI (r = .41, p = .003 and r = .57, p < .001, respectively). SAT was associated with whole-body fat from BIA and with BMI and WC (r = .61 to .80, p < .001). MAT was associated with BMI (r = .58, p < .001) and WC (r = .61, p < .001). This study supports the reliability and validity of using thoracic CT to measure muscle and fat. Future studies are needed to investigate whether these CT-based measures are predictive of clinical and post-transplant outcomes in advanced lung disease.
|
38
|
Abstract
Importance The mortality rate for individuals on the wait list for lung transplant is 15% to 25%, and still only 20% of lungs from multiorgan donors are used for lung transplant. The lung donor pool may be increased by assessing and reconditioning high-risk extended criteria donor lungs with ex vivo lung perfusion (EVLP), with similar short-term outcomes. Objective To assess the long-term outcomes of transplant recipients of donor lungs treated with EVLP. Design, Setting, and Participants This retrospective cohort single-center study was conducted from August 1, 2008, to February 28, 2017, among 706 recipients of donor lungs not undergoing EVLP and 230 recipients of donor lungs undergoing EVLP. Exposure Donor lungs undergoing EVLP. Main Outcomes and Measures The incidence of chronic lung allograft dysfunction and allograft survival during the 10-year EVLP era were the primary outcome measures. Secondary outcomes included donor characteristics, maximum predicted percentage of forced expiratory volume in 1 second, acute cellular rejection, and de novo donor-specific antibody development. Results This study included 706 patients (311 women and 395 men; median age, 50 years [interquartile range, 34-61 years]) in the non-EVLP group and 230 patients (85 women and 145 men; median age, 46 years [interquartile range, 32-55 years]) in the EVLP group. The EVLP group donors had a significantly lower mean (SD) Pao2:fraction of inspired oxygen ratio than the non-EVLP group donors (348 [108] vs 422 [88] mm Hg; P < .001), higher prevalence of abnormal chest radiography results (135 of 230 [58.7%] vs 349 of 706 [49.4%]; P = .02), and higher proportion of smoking history (125 of 204 [61.3%] vs 322 of 650 [49.5%]; P = .007). More recipients in the EVLP group received single-lung transplants (62 of 230 [27.0%] vs 100 of 706 [14.2%]; P < .001). 
There was no significant difference between the EVLP and non-EVLP groups in time to chronic lung allograft dysfunction (70% vs 72% at 3 years; 56% vs 56% at 5 years; and 53% vs 36% at 9 years; log-rank P = .68) or allograft survival (73% vs 72% at 3 years; 62% vs 58% at 5 years; and 50% vs 44% at 9 years; log-rank P = .97). All secondary outcomes were similar between the 2 groups. Conclusions and Relevance Since 2008, 230 of 936 lung transplants (24.6%) in the Toronto Lung Transplant Program were performed after EVLP assessment and treatment. Use of EVLP-treated lungs led to an increase in the number of patients undergoing transplantation, with comparable long-term outcomes.
|
39
|
Evaluation of Malnutrition Risk in Lung Transplant Candidates Using the Nutritional Risk Index. Transplant Direct 2020; 6:e574. [PMID: 32766429 PMCID: PMC7339342 DOI: 10.1097/txd.0000000000001028] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/23/2020] [Revised: 05/13/2020] [Accepted: 05/14/2020] [Indexed: 02/07/2023] Open
Abstract
BACKGROUND Malnutrition in lung transplant (LTx) candidates is an important risk factor for adverse outcomes. We sought to evaluate the Nutritional Risk Index (NRI), a validated measure of malnutrition risk in chronic disease, in LTx candidates. We aimed to characterize malnutrition risk using the NRI, evaluate change in body weight between nutritional risk groups, and assess the association of malnutrition risk with pretransplant and posttransplant outcomes. METHODS Retrospective, single-center cohort study of LTx candidates (2014-2015) evaluated by a dietitian before listing. Nutritional parameters, weight change pretransplant and posttransplant, and clinical outcomes were abstracted up to 1-year posttransplant. NRI was calculated as follows: (1.519 × albumin) + (41.7 × current weight/ideal weight), with high malnutrition risk defined as the lowest quartile of NRI for cystic fibrosis (CF) and non-CF patients. RESULTS The cohort comprised 247 LTx candidates (57% male; median age 59 y; non-CF 88%). Non-CF candidates had a greater mean NRI compared with CF patients (109 ± 11 versus 95 ± 12; P < 0.0001). 86% with high malnutrition risk maintained/gained weight (≥5%) pretransplant. In 196 LTx recipients, malnutrition risk was not associated with hospital stay, discharge disposition, or 1-year mortality. The median percent weight gain for LTx recipients in the first year was 10.5% (4.0-20.1), with high malnutrition risk recipients having comparable or greater weight gain than the low-risk group (mean difference for non-CF: 6.8%; P = 0.02 and CF: -3.8%; P = 0.65). CONCLUSIONS Malnutrition risk assessed with the NRI was not prognostic of posttransplant outcomes in this retrospective cohort. LTx candidates with high malnutrition risk were able to maintain their weight pretransplant and demonstrated considerable weight gain in the first year posttransplant.
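The NRI formula given in the Methods can be sketched as a small function (a minimal illustration with our own variable names; the abstract does not state units, so albumin in g/L is an assumption based on the conventional NRI definition):

```python
def nutritional_risk_index(albumin_g_per_l: float,
                           current_weight_kg: float,
                           ideal_weight_kg: float) -> float:
    """NRI as stated in the abstract:
    (1.519 x albumin) + (41.7 x current weight / ideal weight)."""
    return 1.519 * albumin_g_per_l + 41.7 * (current_weight_kg / ideal_weight_kg)

# Example: albumin 40 g/L, current weight equal to ideal weight
# 1.519 * 40 + 41.7 * 1.0 = 102.46
print(round(nutritional_risk_index(40, 70, 70), 2))
```

In this sketch the risk grouping itself (lowest quartile within CF and non-CF strata) would still be computed per cohort, as described in the abstract.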
|
40
|
Ethical considerations regarding heart and lung transplantation and mechanical circulatory support during the COVID-19 pandemic: an ISHLT COVID-19 Task Force statement. J Heart Lung Transplant 2020; 39:619-626. [PMID: 32505492 PMCID: PMC7195343 DOI: 10.1016/j.healun.2020.04.019] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/22/2020] [Accepted: 04/22/2020] [Indexed: 12/24/2022] Open
Abstract
To understand the challenges for thoracic transplantation and mechanical circulatory support during the current coronavirus disease 2019 pandemic, we propose separating the effects of the pandemic into 5 distinct stages from a healthcare system perspective. We discuss how the classical ethical principles of utility, justice, and efficiency may need to be adapted, and we give specific recommendations for thoracic transplantation and mechanical circulatory support centers to balance their clinical decisions and strategies for advanced heart and lung disease during the current pandemic.
|
41
|
Short-course, direct-acting antivirals and ezetimibe to prevent HCV infection in recipients of organs from HCV-infected donors: a phase 3, single-centre, open-label study. Lancet Gastroenterol Hepatol 2020; 5:649-657. [PMID: 32389183 PMCID: PMC7391837 DOI: 10.1016/s2468-1253(20)30081-9] [Citation(s) in RCA: 67] [Impact Index Per Article: 16.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/13/2020] [Revised: 02/28/2020] [Accepted: 03/12/2020] [Indexed: 02/06/2023]
Abstract
BACKGROUND An increasing percentage of potential organ donors are infected with hepatitis C virus (HCV). After transplantation from an infected donor, establishment of HCV infection in uninfected recipients is near-universal, with the requirement for post-transplant antiviral treatment. The aim of this study was to determine if antiviral drugs combined with an HCV entry blocker given before and for 7 days after transplant would be safe and reduce the likelihood of HCV infection in recipients of organs from HCV-infected donors. METHODS HCV-uninfected organ recipients without pre-existing liver disease were treated with ezetimibe (10 mg; an HCV entry inhibitor) and glecaprevir-pibrentasvir (300 mg/120 mg) before and after transplantation from HCV-infected donors aged younger than 70 years without co-infection with HIV, hepatitis B virus, or human T-cell leukaemia virus 1 or 2. Recipients received a single dose 6-12 h before transplant and once a day for 7 days after surgery (eight doses in total). HCV RNA was assessed once a day for 14 days and then once a week until 12 weeks post-transplant. The primary endpoint was prevention of chronic HCV infection, as evidenced by undetectable serum HCV RNA at 12 weeks after transplant, and was assessed in the intention-to-treat population. Safety monitoring was according to routine post-transplant practice. 12-week data are reported for the first 30 patients. The trial is registered on ClinicalTrials.gov, NCT04017338. The trial is closed to recruitment but follow-up is ongoing. FINDINGS 30 patients (23 men and seven women; median age 61 years [IQR 48-66]) received transplants (13 lung, ten kidney, six heart, and one kidney-pancreas) from 18 HCV-infected donors. The median donor viral load was 5·11 log10 IU/mL (IQR 4·55-5·63), and at least three HCV genotypes were represented (nine [50%] donors with genotype 1, two [11%] with genotype 2, five [28%] with genotype 3, and two [11%] with unknown genotype).
All 30 (100%) transplant recipients met the primary endpoint of undetectable HCV RNA at 12 weeks post-transplant, and were HCV RNA-negative at last follow-up (median 36 weeks post-transplant [IQR 25-47]). Low-level viraemia was transiently detectable in 21 (67%) of 30 recipients in the early post-transplant period but not after day 14. Treatment was well tolerated with no dose reductions or treatment discontinuations; 32 serious adverse events occurred in 20 (67%) recipients, with one grade 3 elevation in alanine aminotransferase (ALT) possibly related to treatment. Non-serious transient elevations in ALT and creatine kinase during the study dosing period resolved with treatment completion. Among the serious adverse events were two recipient deaths due to causes unrelated to study drug treatment (sepsis at 49 days and subarachnoid haemorrhage at 109 days post-transplant), with neither patient ever being viraemic for HCV. INTERPRETATION Ezetimibe combined with glecaprevir-pibrentasvir given one dose before and for 7 days after transplant prevented the establishment of chronic HCV infection in recipients of different organs from HCV-infected donors. This study shows that an ultra-short course of direct-acting antivirals and ezetimibe can prevent the establishment of chronic HCV infection in the recipient, alleviating many of the concerns with transplanting organs from HCV-infected donors. FUNDING Canadian Institutes of Health Research; the Organ Transplant Program, University Health Network.
|
42
|
Highlights from the clinical trials in organ transplantation (CTOT)-20 and CTOT-22 Consortium studies in lung transplant. Am J Transplant 2020; 20:1489-1494. [PMID: 32342596 PMCID: PMC7323580 DOI: 10.1111/ajt.15957] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2020] [Revised: 04/10/2020] [Accepted: 04/17/2020] [Indexed: 01/25/2023]
Abstract
Long-term survival after lung transplant lags behind that of other commonly transplanted organs, reflecting the current incomplete understanding of the mechanisms involved in the development of posttransplant lung injury, rejection, infection, and chronic allograft dysfunction. To address this unmet need, 2 ongoing National Institute of Allergy and Infectious Diseases-funded studies through the Clinical Trials in Organ Transplant Consortium (CTOT), CTOT-20 and CTOT-22, were dedicated to understanding the clinical factors and biological mechanisms that drive chronic lung allograft dysfunction and those that maintain cytomegalovirus polyfunctional protective immunity. The CTOT-20 and CTOT-22 studies enrolled 800 lung transplant recipients at 5 North American centers over 3 years. Given the number and complexity of subjects included, CTOT-20 and CTOT-22 utilized innovative data transfers and capitalized on patient-entered data collection to minimize site manual data entry. The data were coupled with an extensive biosample collection strategy that included DNA, RNA, plasma, serum, bronchoalveolar lavage fluid, and bronchoalveolar lavage cell pellets. This Special Article describes the CTOT-20 and CTOT-22 protocols, data and biosample strategy, initial results, and lessons learned through study execution.
|
43
|
Bronchoalveolar bile acid and inflammatory markers to identify high-risk lung transplant recipients with reflux and microaspiration. J Heart Lung Transplant 2020; 39:934-944. [PMID: 32487471 DOI: 10.1016/j.healun.2020.05.006] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/23/2019] [Revised: 04/20/2020] [Accepted: 05/11/2020] [Indexed: 12/12/2022] Open
Abstract
BACKGROUND Gastroesophageal reflux disease (GERD) is a risk factor for chronic lung allograft dysfunction. Bile acids-putative markers of gastric microaspiration-and inflammatory proteins in the bronchoalveolar lavage (BAL) have been associated with chronic lung allograft dysfunction, but their relationship with GERD remains unclear. Although GERD is thought to drive chronic microaspiration, the selection of patients for anti-reflux surgery lacks precision. This multicenter study aimed to test the association of BAL bile acids with GERD, lung inflammation, allograft function, and anti-reflux surgery. METHODS We analyzed BAL obtained during the first post-transplant year from a retrospective cohort of patients with and without GERD, as well as BAL obtained before and after Nissen fundoplication anti-reflux surgery from a separate cohort. Levels of taurocholic acid (TCA), glycocholic acid, and cholic acid were measured using mass spectrometry. Protein markers of inflammation and injury were measured using multiplex assay and enzyme-linked immunosorbent assay. RESULTS At 3 months after transplantation, TCA, IL-1β, IL-12p70, and CCL5 were higher in the BAL of patients with GERD than in that of no-GERD controls. Elevated TCA and glycocholic acid were associated with concurrent acute lung allograft dysfunction and inflammatory proteins. The BAL obtained after anti-reflux surgery contained reduced TCA and inflammatory proteins compared with that obtained before anti-reflux surgery. CONCLUSIONS Targeted monitoring of TCA and selected inflammatory proteins may be useful in lung transplant recipients with suspected reflux and microaspiration to support diagnosis and guide therapy. Patients with elevated biomarker levels may benefit most from anti-reflux surgery to reduce microaspiration and allograft inflammation.
|
44
|
Cystic Fibrosis Foundation consensus guidelines for the care of individuals with advanced cystic fibrosis lung disease. J Cyst Fibros 2020; 19:344-354. [DOI: 10.1016/j.jcf.2020.02.015] [Citation(s) in RCA: 55] [Impact Index Per Article: 13.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/16/2019] [Revised: 02/14/2020] [Accepted: 02/19/2020] [Indexed: 12/25/2022]
|
45
|
Clinical judgment versus lung allocation score in predicting lung transplant waitlist mortality. Clin Transplant 2020; 34:e13870. [PMID: 32271967 DOI: 10.1111/ctr.13870] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/20/2020] [Revised: 03/09/2020] [Accepted: 03/30/2020] [Indexed: 01/23/2023]
Abstract
Canadian lung transplant centers currently use a subjective and dichotomous "Status" ranking to prioritize waitlisted patients for lung transplantation. The lung allocation score (LAS) is an objective composite score derived from clinical parameters associated with both waitlist and post-transplant survival. We performed a retrospective cohort study to determine whether clinical judgment (Status) or LAS better predicted waitlist mortality. All adult patients listed for lung transplantation between 2007 and 2012 at three Canadian lung transplant programs were included. Status and LAS were compared in their ability to predict waitlist mortality using Cox proportional hazards models and C-statistics. Status and LAS were available for 1122 patients. Status 2 patients had a higher LAS compared to Status 1 patients (mean 40.8 (4.4) vs 34.6 (12.5), P = .0001). Higher LAS was associated with higher risk of waitlist mortality (HR 1.06 per unit LAS, 95% CI 1.05, 1.07, P < .001). LAS predicted waitlist mortality better than Status (C-statistic 0.689 vs 0.674). Patients classified as Status 2 and LAS ≥ 37 had the worst survival awaiting transplant, HR of 8.94 (95% CI 5.97, 13.37). LAS predicted waitlist mortality better than Status; however, the best predictor of waitlist mortality may be a combination of both LAS and clinical judgment.
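The C-statistic used to compare Status and LAS can be illustrated with a minimal Harrell's concordance index sketch (our own illustration, not the authors' code; it handles censoring only in the simplest way, whereas full survival libraries account for further subtleties):

```python
from itertools import combinations

def c_statistic(times, events, risk_scores):
    """Harrell's concordance index (simplified sketch).
    A pair is comparable when the subject with the shorter follow-up time
    had an observed event (event = 1); it is concordant when that subject
    also has the higher predicted risk. Ties in risk count as 0.5.
    Assumes at least one comparable pair exists."""
    concordant = comparable = 0.0
    for i, j in combinations(range(len(times)), 2):
        # order the pair so that subject a has the shorter follow-up time
        a, b = (i, j) if times[i] < times[j] else (j, i)
        if times[a] == times[b] or not events[a]:
            continue  # tied times or earlier subject censored: not comparable
        comparable += 1
        if risk_scores[a] > risk_scores[b]:
            concordant += 1
        elif risk_scores[a] == risk_scores[b]:
            concordant += 0.5
    return concordant / comparable

# Perfectly ranked toy data (higher risk -> earlier event) gives C = 1.0
print(c_statistic([2, 4, 6], [1, 1, 0], [0.9, 0.5, 0.1]))
```

A C-statistic of 0.5 indicates no discrimination; values such as the 0.689 vs 0.674 reported above indicate modestly better ranking of waitlist deaths by LAS than by Status.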
|
46
|
Outcomes of a Peri- and Postoperative Management Protocol for Non-TB Mycobacteria in Lung Transplant Recipients. Chest 2020; 158:523-528. [PMID: 32247715 DOI: 10.1016/j.chest.2020.01.056] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2019] [Revised: 01/16/2020] [Accepted: 01/31/2020] [Indexed: 10/24/2022] Open
|
47
|
Extending cytomegalovirus prophylaxis in high-risk (D+/R-) lung transplant recipients from 6 to 9 months reduces cytomegalovirus disease: A retrospective study. Transpl Infect Dis 2020; 22:e13277. [PMID: 32170813 DOI: 10.1111/tid.13277] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/24/2019] [Revised: 02/25/2020] [Accepted: 03/08/2020] [Indexed: 12/31/2022]
Abstract
RATIONALE Cytomegalovirus (CMV)-seronegative recipients receiving a seropositive allograft (D+/R-) are at a high risk of developing CMV disease. Our program increased the duration of CMV prophylaxis from 6 to 9 months in May 2013. Here, we present the impact on the incidence of CMV infection, disease, side effects, rejection, and other factors. METHODS Retrospective cohort of 241 CMV (D+/R-) patients transplanted between January 1, 2008, and December 31, 2017. Blood CMV testing was done according to protocol. All patients received ganciclovir/valganciclovir as prophylaxis. We compared the incidence and timing of CMV infection and disease up to 6 months after cessation of prophylaxis between patients who received 9 months (May 2013 onwards) and a historical control group who received 6 months of prophylaxis (prior to May 2013). CMV infection was defined as detectable CMV viremia in the absence of symptoms. CMV disease was defined as CMV syndrome or tissue-invasive disease. Side effects of prophylaxis and CMV resistance were recorded. RESULTS A total of 116 patients were included in the 6-month group and 125 in the 9-month group. The extended 9-month CMV prophylaxis delayed the onset of CMV infection (median time to CMV infection after lung transplantation 295 vs 353 days, P < .01) but did not significantly reduce the incidence of CMV infection (65% vs 64%, P = .06, log-rank). The 9-month prophylaxis delayed the onset and decreased the incidence of CMV disease from 50% in the 6-month group to 42% (P = .02 log-rank). There was no difference in the rate of adverse effects (leukopenia in 32% in both groups, P = .53) or development of CMV resistance between the two groups (4 cases in both groups, P = .92). There were no significant differences in overall survival or the rate of chronic lung allograft dysfunction between the groups. 
CONCLUSIONS Extending the duration of CMV prophylaxis from 6 to 9 months resulted in a delayed onset and decreased incidence of CMV disease in our lung transplant population. The absolute risk reduction achieved by extended CMV prophylaxis was 8%. The incidence of CMV infection, ganciclovir resistance, and side effects was similar between the two groups. Our results suggest that extending CMV prophylaxis in the highest-risk CMV D+/R- group is effective in reducing CMV disease.
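The 8% absolute risk reduction quoted above follows directly from the two disease incidences (50% with 6 months vs 42% with 9 months of prophylaxis); a minimal arithmetic check (function and variable names are ours, and the number needed to treat is our derived illustration, not a figure from the abstract):

```python
def absolute_risk_reduction(control_rate: float, treatment_rate: float) -> float:
    """ARR = event rate in the control group minus event rate in the
    treatment group (both as proportions)."""
    return control_rate - treatment_rate

# CMV disease: 50% with 6-month vs 42% with 9-month prophylaxis
arr = absolute_risk_reduction(0.50, 0.42)
print(f"ARR = {arr:.0%}")      # ARR = 8%
print(f"NNT = {1 / arr:.1f}")  # number needed to treat, 1/ARR = 12.5
```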
|
48
|
Skeletal muscle oxygenation and regional blood volume during incremental limb loading in interstitial lung disease. ERJ Open Res 2020; 6:00083-2019. [PMID: 32010722 PMCID: PMC6983499 DOI: 10.1183/23120541.00083-2019] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2019] [Accepted: 11/20/2019] [Indexed: 11/29/2022] Open
Abstract
Introduction Individuals with interstitial lung disease (ILD) exhibit reduced exercise capacity and exertional hypoxaemia. The role of peripheral (muscle) limitation in exercise tolerance in ILD has not been well studied to date. Methods A prospective cross-sectional study examined skeletal muscle oxygen saturation (SmO2) and regional blood volume of the knee extensors and elbow flexors during incremental limb loading in healthy people and people with varying severity of ILD. Isotonic concentric exercise was performed on an isokinetic dynamometer. SmO2 and regional blood volume were measured by near-infrared spectroscopy over the vastus lateralis and biceps. Results Thirteen oxygen-dependent lung transplant candidates with severe ILD (forced vital capacity (FVC) 59±20% predicted), 10 people with mild ILD who were not oxygen dependent (FVC 81±17% predicted), and 13 healthy people (FVC 101±14% predicted) were included. Total haemoglobin, a marker of regional blood volume, was lower at task failure in the knee extensors in participants with severe ILD compared to healthy participants (p=0.05). At task failure for both knee-extensor and elbow-flexor loading, SmO2 decreased to similar levels across all groups, but at lower total workloads in the ILD groups (all p<0.01). Conclusions Overall, people with severe ILD had lower levels of total work and experienced less increase in blood volume in the knee extensors after knee-extensor loading compared to healthy people. Peripheral muscle dysfunction in severe ILD may have contributed to muscle deoxygenation at lower workloads. Muscle deoxygenation occurs at a lower level of lower-limb incremental loading in lung transplant candidates with interstitial lung disease, and regional blood volume is attenuated compared to healthy persons. http://bit.ly/34z0Qtw
|
49
|
Prevention of viral transmission during lung transplantation with hepatitis C-viraemic donors: an open-label, single-centre, pilot trial. Lancet Respir Med 2020; 8:192-201. [DOI: 10.1016/s2213-2600(19)30268-1] [Citation(s) in RCA: 47] [Impact Index Per Article: 11.8] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/03/2019] [Revised: 07/17/2019] [Accepted: 07/17/2019] [Indexed: 12/12/2022]
|
50
|
The impact of first untreated subclinical minimal acute rejection on risk for chronic lung allograft dysfunction or death after lung transplantation. Am J Transplant 2020; 20:241-249. [PMID: 31397939 DOI: 10.1111/ajt.15561] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2019] [Revised: 06/14/2019] [Accepted: 07/26/2019] [Indexed: 01/25/2023]
Abstract
Acute cellular rejection (ACR) is a significant risk factor for chronic lung allograft dysfunction (CLAD). Although clinically manifest and higher-grade (≥A2) ACR is generally treated with augmented immunosuppression, management of minimal (grade A1) ACR remains controversial. In our program, patients with subclinical and spirometrically stable A1 rejection (StA1R) are routinely not treated with augmented immunosuppression. We hypothesized that an untreated first StA1R does not increase the risk of CLAD or death compared to episodes of spirometrically stable no ACR (StNAR). The cohort was drawn from all consecutive adult, first, bilateral lung transplantations performed between 1999 and 2017. Biopsies obtained in the first year posttransplant were paired with forced expiratory volume in 1 second (FEV1). The first occurrence of StA1R was compared to a time-matched StNAR. The risk of CLAD or death was assessed using univariable and multivariable Cox proportional hazards models. The analyses demonstrated no significant difference in the risk of CLAD or death in patients with a first StA1R compared to StNAR. This study, the largest to date, shows that in clinically stable patients, an untreated first A1 ACR in the first year posttransplant is not significantly associated with an increased risk of CLAD or death. A watchful-waiting approach may be an acceptable tactic for stable A1 episodes in lung transplant recipients.
|