1
Characterizing the risk of human leukocyte antigen-incompatible living donor kidney transplantation in older recipients. Am J Transplant 2023; 23:1980-1989. [PMID: 37748554] [PMCID: PMC10767749] [DOI: 10.1016/j.ajt.2023.09.010]
Abstract
Older compatible living donor kidney transplant (CLDKT) recipients have higher mortality and death-censored graft failure (DCGF) compared to younger recipients. These risks may be amplified in older incompatible living donor kidney transplant (ILDKT) recipients who undergo desensitization and intense immunosuppression. In a 25-center cohort of ILDKT recipients transplanted between September 24, 1997, and December 15, 2016, we compared mortality, DCGF, delayed graft function (DGF), acute rejection (AR), and length of stay (LOS) between 234 older (age ≥60 years) and 1172 younger (age 18-59 years) recipients. To investigate whether the impact of age differed for ILDKT recipients compared to 17 542 CLDKT recipients, we used an interaction term to determine whether the relationship between posttransplant outcomes and transplant type (ILDKT vs CLDKT) was modified by age. Overall, older recipients had higher mortality (hazard ratio [HR]: 2.07, 95% confidence interval [CI]: 1.63-2.65, P < .001), lower DCGF (HR: 0.53, 95% CI: 0.36-0.77, P = .001) and AR (odds ratio [OR]: 0.54, 95% CI: 0.39-0.74, P < .001), and similar DGF (OR: 1.03, 95% CI: 0.46-2.33, P = .9) and LOS (incidence rate ratio: 0.98, 95% CI: 0.88-1.10, P = .8) compared to younger recipients. The impact of age on mortality (interaction P = .052), DCGF (interaction P = .7), AR (interaction P = .2), DGF (interaction P = .9), and LOS (interaction P = .5) was similar in ILDKT and CLDKT recipients. Age alone should not preclude eligibility for ILDKT.
2
Delayed graft function and acute rejection following HLA-incompatible living donor kidney transplantation. Am J Transplant 2021; 21:1612-1621. [PMID: 33370502] [PMCID: PMC8016719] [DOI: 10.1111/ajt.16471]
Abstract
Incompatible living donor kidney transplant recipients (ILDKTr) have pre-existing donor-specific antibody (DSA) that, despite desensitization, may persist or reappear, with resulting consequences including delayed graft function (DGF) and acute rejection (AR). To quantify the risk of DGF and AR in ILDKT and their downstream effects, we compared 1406 ILDKTr to 17 542 compatible LDKT recipients (CLDKTr) using a 25-center cohort with novel SRTR linkage. We characterized DSA strength as positive Luminex, negative flow crossmatch (PLNF); positive flow, negative cytotoxic crossmatch (PFNC); or positive cytotoxic crossmatch (PCC). DGF occurred in 3.1% of CLDKT, 3.5% of PLNF, 5.7% of PFNC, and 7.6% of PCC recipients, which translated to higher DGF for PCC recipients (adjusted odds ratio [aOR] = 1.68, 95% confidence interval [CI]: 1.03-2.72). However, the impact of DGF on mortality and death-censored graft failure (DCGF) risk was no higher for ILDKT than CLDKT (p interaction > .1). AR developed in 8.4% of CLDKT, 18.2% of PLNF, 21.3% of PFNC, and 21.7% of PCC recipients, which translated to higher AR (aOR PLNF = 2.09, 95% CI: 1.45-3.02; PFNC = 2.40, 95% CI: 1.67-3.46; PCC = 2.24, 95% CI: 1.48-3.37). Although the impact of AR on mortality was no higher for ILDKT than CLDKT (p interaction = .1), its impact on DCGF risk was less consequential for ILDKT (adjusted hazard ratio [aHR] = 1.62, 95% CI: 1.34-1.95) than CLDKT (aHR = 2.29, 95% CI: 1.96-2.67) (p interaction = .004). Providers should discuss these risks during preoperative counseling, and strategies to mitigate them warrant consideration.
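The adjusted odds ratios above are reported with 95% confidence intervals (e.g., aOR 1.68, 95% CI 1.03-2.72 for DGF in PCC recipients). As an illustration only (an unadjusted toy calculation, not the paper's multivariable model), a Wald 95% CI around a 2×2-table odds ratio can be computed like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table.

    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event.
    Returns (odds_ratio, ci_lower, ci_upper).
    """
    or_ = (a * d) / (b * c)
    # standard error of log(OR): square root of summed reciprocal cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts, not from the study
print(odds_ratio_ci(20, 80, 10, 90))  # OR = 2.25 with its Wald CI
```

The published aORs come from models adjusting for recipient and transplant covariates, so they cannot be reproduced from marginal counts like these.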
3
Abstract
BACKGROUND Desensitization protocols for HLA-incompatible living donor kidney transplantation (ILDKT) vary across centers. The impact of these, as well as of other practice variations, on ILDKT outcomes remains unknown. METHODS We sought to quantify center-level variation in mortality and graft loss following ILDKT using a 25-center cohort of 1358 ILDKT recipients with linkage to the Scientific Registry of Transplant Recipients for accurate outcome ascertainment. We used multilevel Cox regression with shared frailty to determine the variation in post-ILDKT outcomes attributable to between-center differences and to identify any center-level characteristics associated with improved post-ILDKT outcomes. RESULTS After adjusting for patient-level characteristics, only 6 centers (24%) had lower mortality and 1 (4%) had higher mortality than average. Similarly, only 5 centers (20%) had higher graft loss and 2 (8%) had lower graft loss than average. Only 4.7% of the differences in mortality (P < 0.01) and 4.4% of the differences in graft loss (P < 0.01) were attributable to between-center variation. These translated to a median hazard ratio of 1.36 for mortality and 1.34 for graft loss for similar candidates at different centers. Post-ILDKT outcomes were not associated with the following center-level characteristics: ILDKT volume and transplanting a higher proportion of highly sensitized, prior transplant, preemptive, or minority candidates. CONCLUSIONS Unlike most aspects of transplantation, in which center-level variation and volume affect outcomes, we did not find substantial evidence for this in ILDKT. Our findings support the continued practice of ILDKT across these diverse centers.
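The median hazard ratio (MHR) quoted above is a standard way to express between-center variance from a shared-frailty model: for two identical candidates at two randomly chosen centers, it is the median ratio of the higher hazard to the lower. A minimal sketch of the Merlo-style formula, using an illustrative variance (the abstract does not report the frailty variance itself):

```python
import math
from statistics import NormalDist

def median_hazard_ratio(var_between):
    """MHR = exp(sqrt(2 * variance) * z_0.75) for a log-normal
    center-level frailty; var_between is the between-center
    variance on the log-hazard scale."""
    z75 = NormalDist().inv_cdf(0.75)  # third-quartile z-score, ~0.6745
    return math.exp(math.sqrt(2.0 * var_between) * z75)

# zero between-center variance means MHR = 1 (no center effect)
print(median_hazard_ratio(0.0))
# an illustrative variance of ~0.10 yields an MHR near 1.36
print(round(median_hazard_ratio(0.104), 2))
```

The 0.104 here is back-calculated for illustration, not a figure from the study.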
4
Hospital readmissions following HLA-incompatible live donor kidney transplantation: A multi-center study. Am J Transplant 2018; 18:650-658. [PMID: 28834181] [PMCID: PMC5820188] [DOI: 10.1111/ajt.14472]
Abstract
Thirty percent of kidney transplant recipients are readmitted in the first month posttransplantation. Those with donor-specific antibody requiring desensitization and incompatible live donor kidney transplantation (ILDKT) constitute a unique subpopulation that might be at higher readmission risk. Drawing on a 22-center cohort, 379 ILDKT recipients with Medicare primary insurance were matched to compatible-transplant controls and to waitlist-only controls on panel reactive antibody, age, blood group, renal replacement time, prior kidney transplantation, race, gender, diabetes, and transplant date/waitlisting date. Readmission risk was determined using multilevel, mixed-effects Poisson regression. In the first month, ILDKT recipients had a 1.28-fold higher readmission risk than compatible controls (95% confidence interval [CI] 1.13-1.46; P < .001). Risk peaked at 6-12 months (relative risk [RR] 1.67, 95% CI 1.49-1.87; P < .001), attenuating by 24-36 months (RR 1.24, 95% CI 1.10-1.40; P < .001). ILDKT recipients had a 5.86-fold higher readmission risk (95% CI 4.96-6.92; P < .001) in the first month compared to waitlist-only controls. At 12-24 months (RR 0.85, 95% CI 0.77-0.95; P = .002) and 24-36 months (RR 0.74, 95% CI 0.66-0.84; P < .001), ILDKT recipients had a lower risk than waitlist-only controls. These findings, a higher readmission risk for ILDKT recipients than compatible controls but a lower risk after the first year than waitlist-only controls, should be considered in regulatory/payment schemas and in planning clinical care.
5
The Incremental Cost of Incompatible Living Donor Kidney Transplantation: A National Cohort Analysis. Am J Transplant 2017; 17:3123-3130. [PMID: 28613436] [DOI: 10.1111/ajt.14392]
Abstract
Incompatible living donor kidney transplantation (ILDKT) has been established as an effective option for end-stage renal disease patients with willing but HLA-incompatible living donors, reducing mortality and improving quality of life. Depending on antibody titer, ILDKT can require highly resource-intensive procedures, including intravenous immunoglobulin, plasma exchange, and/or cell-depleting antibody treatment, as well as protocol biopsies and donor-specific antibody testing. This study sought to compare the cost and Medicare reimbursement, exclusive of organ acquisition payment, for ILDKT (n = 926) with varying antibody titers to matched compatible transplants (n = 2762) performed between 2002 and 2011. Data were assembled from a national cohort study of ILDKT and a unique data set linking hospital cost accounting data and Medicare claims. ILDKT was more expensive than matched compatible transplantation: adjusted costs were 20% higher for patients positive on Luminex assay but with a negative flow cytometric crossmatch, 26% higher for those with a positive flow cytometric but negative cytotoxic crossmatch, and 39% higher for those with a positive cytotoxic crossmatch (p < 0.0001 for all). ILDKT was associated with longer median length of stay (12.9 vs. 7.8 days), higher Medicare payments ($91 330 vs. $63 782; p < 0.0001), and greater outlier payments. In conclusion, ILDKT increases the cost of and payments for kidney transplantation.
6
Minimally Invasive Vein Harvesting: A Comparison of Endoscopic Versus Traditional Open Saphenectomy. ACTA ACUST UNITED AC 2016. [DOI: 10.1177/153857449703100502]
Abstract
The greater saphenous vein (SV) is the conduit of choice for coronary and infrapopliteal revascularization procedures. Unfortunately, harvesting an SV often requires an incision the length of the leg, which is associated with a significant incidence of wound complications. Minimally invasive procedures have several advantages, including a reduced incidence of wound complications, decreased hospital length of stay, and, therefore, health-care savings. Currently, little information is available comparing traditional open saphenectomy (OS) with a minimally invasive procedure, endoscopic saphenectomy (ES). The purpose of this study was to compare SV harvest time, incision length, and harvested vein quality between the OS and ES techniques in six nonpreserved cadavers. Each limb was randomly selected for either OS or ES. The length of incision, number of SV leaks after harvest, length of SV, and time required for harvest were recorded for each technique. The table summarizes the findings of the cadaver dissections. Per limb, no difference was noted in vein harvest length or number of leaks between OS and ES. A significant reduction was found in incision length for ES (14.4 ± 1.4 cm per limb), but the time required for OS was significantly shorter (P = 0.01). This study suggests an equivalent length of SV can be harvested with either the OS or ES technique; however, the ES technique reduces incision length and is therefore a less morbid operative approach.
7
The Effects of Fluid Loading on Hemodynamic Changes and Right Ventricular Function with Aortic Unclamping During Abdominal Aortic Surgery. ACTA ACUST UNITED AC 2016. [DOI: 10.1177/153857449302700406]
Abstract
The authors investigated the effects of fluid loading on hemodynamic changes and right ventricular function (RVF) with aortic unclamping during abdominal aortic aneurysmal surgery in 12 patients. An ejection fraction volumetric pulmonary artery catheter was inserted through the right internal jugular vein to assess cardiac output and RVF with the use of an REF-1 computer. Anesthesia was maintained with a continuous infusion of sufentanil (0.5-1 μg/kg/hr), isoflurane (0.5-1%), and air/O2 (FIO2 = 0.5). Fluid loading with Ringer's lactate and 5% albumin was initiated ten to twenty minutes before aortic unclamping. Hemodynamic measurements and assessment of RVF were performed ten to twenty minutes before and after aortic unclamping. Aortic unclamping after fluid loading decreased right ventricular end-diastolic and end-systolic volumes (RVEDV, RVESV) but increased right ventricular ejection fraction (RVEF) (p < 0.01). There were no significant changes in cardiac index (CI), stroke volume (SV), systemic vascular resistance (SVR), and cardiac filling pressures: central venous pressure (CVP) and pulmonary capillary wedge pressure (PCWP). There was a poor correlation between RVEDV and PCWP. The authors conclude that adequate fluid loading before aortic unclamping, estimated by RVEDV, provided stable hemodynamic states (CI, SV, RVEF) following aortic unclamping. Volume expansion following fluid loading can be better assessed by RVEDV than by cardiac filling pressures.
8
Outcomes of kidney transplants and risk of infection transmission from increased infectious risk donors. Clin Transplant 2016; 30:886-93. [DOI: 10.1111/ctr.12761]
9
Abstract
BACKGROUND A report from a high-volume single center indicated a survival benefit of receiving a kidney transplant from an HLA-incompatible live donor as compared with remaining on the waiting list, whether or not a kidney from a deceased donor was received. The generalizability of that finding is unclear. METHODS In a 22-center study, we estimated the survival benefit for 1025 recipients of kidney transplants from HLA-incompatible live donors who were matched with controls who remained on the waiting list or received a transplant from a deceased donor (waiting-list-or-transplant control group) and controls who remained on the waiting list but did not receive a transplant (waiting-list-only control group). We analyzed the data with and without patients from the highest-volume center in the study. RESULTS Recipients of kidney transplants from incompatible live donors had a higher survival rate than either control group at 1 year (95.0%, vs. 94.0% for the waiting-list-or-transplant control group and 89.6% for the waiting-list-only control group), 3 years (91.7% vs. 83.6% and 72.7%, respectively), 5 years (86.0% vs. 74.4% and 59.2%), and 8 years (76.5% vs. 62.9% and 43.9%) (P<0.001 for all comparisons with the two control groups). The survival benefit was significant at 8 years across all levels of donor-specific antibody: 89.2% for recipients of kidney transplants from incompatible live donors who had a positive Luminex assay for anti-HLA antibody but a negative flow-cytometric cross-match versus 65.0% for the waiting-list-or-transplant control group and 47.1% for the waiting-list-only control group; 76.3% for recipients with a positive flow-cytometric cross-match but a negative cytotoxic cross-match versus 63.3% and 43.0% in the two control groups, respectively; and 71.0% for recipients with a positive cytotoxic cross-match versus 61.5% and 43.7%, respectively. The findings did not change when patients from the highest-volume center were excluded. 
CONCLUSIONS This multicenter study validated single-center evidence that patients who received kidney transplants from HLA-incompatible live donors had a substantial survival benefit as compared with patients who did not undergo transplantation and those who waited for transplants from deceased donors. (Funded by the National Institute of Diabetes and Digestive and Kidney Diseases.).
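The survival percentages at fixed horizons reported above are the kind of quantities a Kaplan-Meier (product-limit) estimator produces. A dependency-free sketch of that estimator follows (illustrative only; the study's actual analysis also involved matched controls and covariate adjustment):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times: follow-up time for each patient.
    events: 1 if the patient died at that time, 0 if censored.
    Returns (time, estimated survival probability) pairs at each death time.
    """
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n_at_risk = sum(1 for ti in times if ti >= t)
        if deaths:
            # product-limit update: multiply by the conditional survival at t
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
    return curve

# toy data: deaths at years 1, 2, 3; censoring at years 2 and 4
print(kaplan_meier([1, 2, 2, 3, 4], [1, 0, 1, 1, 0]))
```

Censored patients contribute to the risk set up to their last follow-up, which is why survival at 8 years can be estimated even though not all patients were followed that long.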
10
Safe Conversion From Tacrolimus to Belatacept in High Immunologic Risk Kidney Transplant Recipients With Allograft Dysfunction. Am J Transplant 2015; 15:2726-31. [PMID: 25988397] [DOI: 10.1111/ajt.13322]
Abstract
There is no literature on the use of belatacept for sensitized patients or regrafts in kidney transplantation. We present our initial experience in high immunologic risk kidney transplant recipients who were converted from tacrolimus to belatacept for presumed acute calcineurin inhibitor (CNI) toxicity and/or interstitial fibrosis/tubular atrophy. Six patients (mean age = 40 years) were switched from tacrolimus to belatacept at a median of 4 months posttransplant. Renal function improved significantly from a peak mean estimated glomerular filtration rate (eGFR) of 23.8 ± 12.9 mL/min/1.73 m(2) prior to the switch to an eGFR of 42 ± 12.5 mL/min/1.73 m(2) (p = 0.03) at a mean follow-up of 16.5 months postconversion. No new rejection episodes were diagnosed despite a prior history of rejection in 2/6 (33%) patients. Surveillance biopsies performed in 5/6 patients did not show subclinical rejection. No development of donor-specific antibodies (DSA) was noted. In this preliminary investigation, we report improved kidney function without a concurrent increase in risk of rejection and DSA in six sensitized patients converted from tacrolimus to belatacept. Improvement in renal function was noted even in patients with chronic allograft fibrosis without evidence of acute CNI toxicity. Further studies with protocol biopsies are needed to ensure safety and wider applicability of this approach.
11
Multimodality therapy and liver transplantation for hepatocellular carcinoma: a 14-year prospective analysis of outcomes. Transplantation 2014; 98:100-6. [PMID: 24503764] [PMCID: PMC4088318] [DOI: 10.1097/01.tp.0000441090.39840.b0]
Abstract
BACKGROUND Hepatocellular carcinoma is a major cause of death among patients with cirrhosis. A standardized approach of multimodality therapy with intent-to-treat by transplantation for all patients with hepatocellular carcinoma was instituted at our transplant center in 1997. Data were prospectively collected to evaluate the impact of multimodality therapy on posttransplant patient survival, tumor recurrence, and patient survival without transplantation. METHODS All patients with hepatocellular carcinoma were eligible for multimodality therapy. Multimodality therapy consisted of hepatic resection, radiofrequency ablation, transarterial chemoembolization, transarterial chemoinfusion, yttrium-90 microsphere radioembolization, and sorafenib. RESULTS A total of 715 patients underwent multimodality therapy; 231 patients were included in the intent-to-treat with transplantation arm, and 484 patients were treated with multimodality therapy or palliative therapy because of contraindications for transplantation. A 60.2% transplantation rate was achieved in the intent-to-treat with transplantation arm. Posttransplant survivals at 1 and 5 years were 97.1% and 72.5%, respectively. Tumor recurrence rates at 1, 3, and 5 years were 2.4%, 6.2%, and 11.6%, respectively. Patients with contraindications to transplant had increased 1- and 5-year survival from diagnosis with multimodality therapy compared with those not treated (73.1% and 46.5% versus 15.5% and 4.4%, P<0.0001). CONCLUSIONS Using multimodality therapy before liver transplantation for hepatocellular carcinoma achieved low recurrence rates and posttransplant survival equivalent to patients with primary liver disease without hepatocellular carcinoma. Multimodality therapy may help identify patients with less active tumor biology and result in improved disease-free survival and organ utilization.
12
Quantifying the risk of incompatible kidney transplantation: a multicenter study. Am J Transplant 2014; 14:1573-80. [PMID: 24913913] [DOI: 10.1111/ajt.12786]
Abstract
Incompatible live donor kidney transplantation (ILDKT) offers a survival advantage over dialysis to patients with anti-HLA donor-specific antibody (DSA). Program-specific reports (PSRs) fail to account for ILDKT, placing this practice at regulatory risk. We collected DSA data, categorized as positive Luminex, negative flow crossmatch (PLNF) (n = 185), positive flow, negative cytotoxic crossmatch (PFNC) (n = 536) or positive cytotoxic crossmatch (PCC) (n = 304), from 22 centers. We tested associations between DSA, graft loss and mortality after adjusting for PSR model factors, using 9669 compatible patients as a comparison. PLNF patients had similar graft loss; however, PFNC (adjusted hazard ratio [aHR] = 1.64, 95% confidence interval [CI]: 1.15-2.23, p = 0.007) and PCC (aHR = 5.01, 95% CI: 3.71-6.77, p < 0.001) were associated with increased graft loss in the first year. PLNF patients had similar mortality; however, PFNC (aHR = 2.04; 95% CI: 1.28-3.26; p = 0.003) and PCC (aHR = 4.59; 95% CI: 2.98-7.07; p < 0.001) were associated with increased mortality. We simulated Centers for Medicare & Medicaid Services flagging to examine ILDKT's effect on the risk of being flagged. Compared to equal-quality centers performing no ILDKT, centers performing 5%, 10% or 20% PFNC had a 1.19-, 1.33- and 1.73-fold higher odds of being flagged. Centers performing 5%, 10% or 20% PCC had a 2.22-, 4.09- and 10.72-fold higher odds. Failure to account for ILDKT's increased risk places centers providing this life-saving treatment in jeopardy of regulatory intervention.
13
Abstract
UNLABELLED The increased disparity between organ supply and need has led to the use of extended criteria donors and donation after cardiac death donors with other comorbidities. METHODS We examined the preimplantation transcriptome of 112 kidney transplant recipient samples from 100 deceased-donor kidneys by microarray profiling. Subject groups were segregated based on estimated glomerular filtration rate (eGFR) at 1 month after transplantation: the GFR-high group (n=74) included patients with eGFR >45 mL/min per 1.73 m(2), whereas the GFR-low group (n=35) included patients with eGFR of 45 mL/min or less per 1.73 m(2). RESULTS Gene expression profiling identified higher expression of 160 probe sets (140 genes) in the GFR-low group, whereas expression of 37 probe sets (33 genes) was higher in the GFR-high group (P<0.01, false discovery rate <0.2). Four genes (CCL5, CXCR4, ITGB2, and EGF) were selected based on fold change and P value and further validated using an independent set of samples. A random forest analysis identified three of these genes (CCL5, CXCR4, and ITGB2) as important predictors of graft function after transplantation. CONCLUSIONS Inclusion of pretransplantation molecular gene expression profiles in donor quality assessment systems may provide the necessary information for better donor organ selection and function prediction. These biomarkers would further allow a more objective and complete assessment of procured renal allografts at pretransplantation time.
14
Substitution of tenofovir/emtricitabine for Hepatitis B immune globulin prevents recurrence of Hepatitis B after liver transplantation. Liver Int 2012; 32:1138-45. [PMID: 22348467] [DOI: 10.1111/j.1478-3231.2012.02770.x]
Abstract
BACKGROUND Hepatitis B immune globulin (HBIg) with or without nucleos(t)ide analogue (NA) inhibitors has been shown to prevent recurrence of hepatitis B virus (HBV) following orthotopic liver transplantation (OLT). However, the use of HBIg has many disadvantages. AIMS The present study was performed to determine if converting patients from HBIg ± NA to combination NA therapy could prevent recurrence of HBV. METHODS Twenty-one recipients without evidence of HBV recurrence on HBIg ± NA for ≥ 6 months were enrolled. Patients received their last injection of HBIg at the time they initiated tenofovir disoproxil fumarate/emtricitabine (TDF/FTC; Truvada(®) ) and were followed up for 31.1 ± 9.0 [range 15-47] months. RESULTS After 1 year, 3 patients (14%) had detectable HBsAg, one of whom was non-compliant. Two of 3 with recurrence cleared HBsAg by last follow-up on TDF/FTC; the non-compliant patient became HBV DNA-undetectable with re-institution of TDF/FTC. TDF/FTC saved $12,469/year over our standard-of-care, monthly intramuscular HBIg/lamivudine. There was no evidence of a general adverse effect of TDF/FTC on renal function. However, 3 patients developed reversible acute renal failure; on renal biopsy, 1 had possible TDF/FTC-induced acute tubular necrosis. CONCLUSIONS Substitution of TDF/FTC for HBIg prevented recurrence of HBV DNA in 100% (20/20) of patients who were compliant with the medication and led to substantial cost savings over HBIg-containing regimens.
15
Reduced expression of inflammatory genes in deceased donor kidneys undergoing pulsatile pump preservation. PLoS One 2012; 7:e35526. [PMID: 22545113] [PMCID: PMC3335841] [DOI: 10.1371/journal.pone.0035526]
Abstract
BACKGROUND The use of expanded criteria donor (ECD) kidneys has been associated with worse outcomes. Whole gene expression of pre-implantation allograft biopsies from deceased donor kidneys (DDKs) was evaluated to compare the effect of pulsatile pump preservation (PPP) vs. cold storage preservation (CSP) on standard and ECD kidneys. METHODOLOGY/PRINCIPAL FINDINGS 99 pre-implantation DDK biopsies were studied using gene expression with GeneChips. Kidney transplant recipients were followed post transplantation for 35.8 months (range = 24-62). The PPP group included 60 biopsies (cold ischemia time (CIT) = 1,367 ± 509 minutes) and the CSP group included 39 biopsies (CIT = 1,022 ± 485 minutes) (P<0.001). Donor age (42.0 ± 14.6 vs. 34.1 ± 14.2 years, P = 0.009) and the percentage of ECD kidneys (PPP = 35% vs. CSP = 12.8%, P = 0.012) were significantly different between groups. A two-sample t-test was performed, and probe sets having a P<0.001 were considered significant. Probe set level linear models were fit using cold ischemia time and CSP/PPP as independent variables to determine significant probe sets (P<0.001) between groups after adjusting for cold ischemia time. Thus, 43 significant genes were identified (P<0.001). Over-expression of genes associated with inflammation (CD86, CD209, CLEC4, EGFR2, TFF3, among others) was observed in the CSP group. Cell-to-cell signaling and interaction, and antigen presentation were the most important pathways with genes significantly over-expressed in CSP kidneys. When the analysis was restricted to ECD kidneys, genes involved in inflammation were also differentially up-regulated in ECD kidneys undergoing CSP. However, graft survival at the end of the study was similar between groups (P = 0.2). Moreover, the incidence of delayed graft function was not significantly different between groups.
CONCLUSIONS/SIGNIFICANCE Inflammation was the most prominent up-regulated pattern in pre-implantation biopsies undergoing CSP, even though the PPP group had a larger number of ECD kidneys. No significant difference was observed in delayed graft function incidence or graft function post-transplantation. These findings support the use of PPP in ECD donor kidneys.
16
Pretransplant transcriptome profiles identify among kidneys with delayed graft function those with poorer quality and outcome. Mol Med 2011; 17:1311-22. [PMID: 21912807] [DOI: 10.2119/molmed.2011.00159]
Abstract
Robust biomarkers are needed to identify donor kidneys with poor quality associated with inferior early and longer-term outcome. The occurrence of delayed graft function (DGF) is most often used as a clinical outcome marker to capture poor kidney quality. Gene expression profiles of 92 preimplantation biopsies were evaluated in relation to DGF and estimated glomerular filtration rate (eGFR) to identify preoperative gene transcript changes associated with short-term function. Patients were stratified into those who required dialysis during the first week (DGF group) versus those who did not (noDGF group) and subclassified according to 1-month eGFR of >45 mL/min (eGFR(hi)) versus eGFR of ≤45 mL/min (eGFR(lo)). The groups and subgroups were compared in relation to clinical donor and recipient variables and transcriptome-associated biological pathways. A validation set was used to confirm target genes. Donor and recipient characteristics were similar between the DGF and noDGF groups. A total of 206 probe sets were significant between groups (P < 0.01), but the gene functional analyses failed to identify any significantly affected pathways. However, the subclassification of the DGF and noDGF groups identified 283 probe sets that were significant among groups and associated with biological pathways. Kidneys that developed postoperative DGF and sustained impaired 1-month function (DGF(lo) group) showed a transcriptome profile of significant immune activation already preimplantation. In addition, these kidneys maintained poorer transplant function throughout the first year posttransplant. In conclusion, DGF is a poor marker for organ quality and transplant outcome. In contrast, preimplant gene expression profiles identify "poor quality" grafts and may eventually improve organ allocation.
17
Living donor liver transplantation for neonatal hemochromatosis using non-anatomically resected segments II and III: a case report. J Med Case Rep 2010; 4:372. [PMID: 21092086] [PMCID: PMC2994882] [DOI: 10.1186/1752-1947-4-372]
Abstract
INTRODUCTION Neonatal hemochromatosis is the most common cause of liver failure and liver transplantation in the newborn. The size of the infant determines the liver volume that can be transplanted safely without incurring complications arising from a large graft. Transplantation of monosegments II or III is a standard method for newborns with liver failure. CASE PRESENTATION A three-week old African-American male neonate was diagnosed with acute liver failure secondary to neonatal hemochromatosis. Living-related liver transplantation was considered after the failure of intensive medical therapy. Intra-operatively, a non-anatomical resection and transplantation of segments II and III was performed successfully. The boy is growing normally two years after transplantation. CONCLUSION Non-anatomical resection and transplantation of liver segments II and III is preferred to the transplantation of anatomically resected monosegments, especially when the left lobe is thin and flat. It allows the use of a reduced-size donor liver with intact hilar structures and outflow veins. In an emergency, living-related liver transplantation should be offered to infants with liver failure secondary to neonatal hemochromatosis who fail to respond to medical treatment.
18
Adult living donor versus deceased donor liver transplantation: a 10-year prospective single center experience. Ann Hepatol 2010; 8:298-307. [PMID: 20009128]
Abstract
It has been 4 years since the first long-term (>3 years) prospective comparison of adult-to-adult living donor liver transplantation (A2ALLTx) to adult deceased donor liver transplantation (ADDLTx) was reported. In this follow-up, prospective, IRB-approved, 10-year comparison of A2ALLTx to ADDLTx, we expand on our initial observations. These data include: age, gender, ethnicity, primary liver disease, waiting time, pretransplant CTP/MELD score, cold ischemia time (CIT), perioperative mortality, acute and chronic rejection, graft and patient survival, charges, and post-transplant complications. In 10 years, 465 ADDLTx (81.3%) and 107 A2ALLTx (18.7%) were performed at VCUHS. Hepatitis C virus (HCV) was the most common reason for transplantation in both groups (54.5% vs. 48.2%). Data regarding overall patient and graft survival and retransplantation rates were similar. Comparisons of patient/graft survival and retransplantation rates in patients with and without HCV were not statistically different. A2ALLTx patients had less acute rejection (9.6% vs. 21.7%) and more biliary complications (27.1% vs. 17.6%). In conclusion, A2ALLTx is as durable a liver replacement technique as ADDLTx. Patients with A2ALLTx were younger, had lower MELD scores, less acute rejection, and similar histological HCV recurrence. Biliary complications were more common in A2ALLTx but were not associated with increased graft loss compared to ADDLTx.
19
Abstract
BACKGROUND Hyponatraemia increases risk of adverse outcomes following orthotopic liver transplantation (OLT), but it is unclear whether improvement of pretransplant hyponatraemia ameliorates post-transplant complications. AIMS To assess impact of pretransplant hyponatraemia on post-transplant outcomes. METHODS We performed a retrospective analysis of 213 patients with cirrhosis who underwent liver transplantation. Patients with serum sodium <or=130 mEq/L immediately before transplantation ('hyponatraemia at OLT'; n=34) were compared with those who had experienced hyponatraemia but subsequently improved to a serum sodium >130 mEq/L at transplantation ('resolved hyponatraemia'; n=56) and to those without history of hyponatraemia before transplantation ('never hyponatraemic'; n=123). Primary endpoint was survival at 180 days post-OLT. Secondary outcomes included time until discharge alive, complications during hospitalization, length of time ventilated and length of post-transplant intensive care unit stay. RESULTS There was no survival difference at 180 days post-OLT between groups. After transplantation, patients with either hyponatraemia at OLT or resolved hyponatraemia had longer time until discharge alive and had higher rates of delirium, acute renal failure, acute cellular rejection and infection than those who were never hyponatraemic. As compared with patients with hyponatraemia at OLT, those with resolved hyponatraemia were more likely to be discharged alive within 3 weeks, but other outcomes, including survival, did not differ significantly. CONCLUSIONS We conclude that hyponatraemia at any time before liver transplantation is associated with adverse post-transplant outcome, even when hyponatraemia has resolved.
20
Incidence of prolonged length of stay after orthotopic liver transplantation and its influence on outcomes. Liver Transpl 2009; 15:273-9. [PMID: 19243008 DOI: 10.1002/lt.21731]
Abstract
Orthotopic liver transplantation (OLT) is the only effective treatment for end-stage liver disease. Although most patients do well and are discharged promptly, some require prolonged length of stay (PLOS). The prevalence of PLOS, associated factors, and their impact on survival are not well defined. We reviewed our adult OLT database for patients who survived > 30 days. PLOS was defined as hospitalization > 30 days following OLT. Of 521 OLT recipients, 68 (13%) had PLOS with a median duration of 50 days versus only 10 days for patients discharged within 30 days. Significant differences in pre-OLT variables between patients with and without PLOS included the mean wait list time (P = 0.001), hospitalization at the time of OLT (P = 0.001), and prior OLT (P = 0.041). Factors independently associated with PLOS included intensive care unit status at the time of OLT [odds ratio (OR), 4; 95% confidence interval (CI), 1.6-10.4], OLT prior to Model for End-Stage Liver Disease implementation (OR, 2.27; 95% CI, 1.04-5.26), in-hospital post-OLT bacterial infection (OR, 9.34; 95% CI, 4.65-18.86), gastrointestinal bleeding (OR, 4.34; 95% CI, 1.4-14.08), renal failure (OR, 10.86; 95% CI, 5.07-23.25), and allograft rejection (OR, 3.7; 95% CI, 1.23-11.11). One-year graft survival and patient survival were significantly less in those with PLOS (for both, P < 0.0001). Among PLOS patients, factors independently associated with increased 1-year mortality were donor age (OR, 1.07; 95% CI, 1.009-1.13), primary diagnosis of hepatitis C virus (OR, 6.89; 95% CI, 1.40-34.48), in-hospital post-OLT bacterial infection (OR, 13.3; 95% CI, 2.11-83.33), and cardiac complications (OR, 20.4; 95% CI, 1.51-250; c-statistic for the model, 0.85). In conclusion, PLOS following OLT is associated with a significant decrease in survival despite a marked increase in cost and resource utilization. 
Efforts to modify those factors that contribute to PLOS may reduce this event, improve survival, and reduce OLT-associated costs.
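An aside on the reported statistics: none of the following code appears in the study, but for Wald-type confidence intervals such as the odds-ratio CIs above, the point estimate should sit at the geometric mean of the interval limits on the log-odds scale. A minimal consistency-check sketch (the function name and tolerance are my own):

```python
import math

def or_ci_consistent(or_hat, lo, hi, tol=0.15):
    """Check that an odds ratio is plausibly the center of its Wald CI.

    For a Wald interval computed on the log-odds scale, the reported point
    estimate should equal exp((ln lo + ln hi) / 2), i.e. the geometric mean
    of the CI limits, up to rounding.
    """
    center = math.sqrt(lo * hi)  # geometric mean of the limits
    return abs(math.log(center) - math.log(or_hat)) < tol

# e.g. the ICU-status OR above: 4 with 95% CI 1.6-10.4
# sqrt(1.6 * 10.4) ~= 4.08, so the reported values are internally consistent
```

Applied to the figures quoted above (OR 4, CI 1.6-10.4; OR 9.34, CI 4.65-18.86), the reported estimates pass this check.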
21
Molecular pathways involved in loss of kidney graft function with tubular atrophy and interstitial fibrosis. Mol Med 2008; 14:276-85. [PMID: 18286166 DOI: 10.2119/2007-00111.maluf]
Abstract
Loss of kidney graft function with tubular atrophy (TA) and interstitial fibrosis (IF) causes most kidney allograft losses. We aimed to identify the molecular pathways involved in IF/TA progression. Kidney biopsies from normal kidneys (n = 24), normal allografts (n = 6), and allografts with IF/TA (n = 17) were analyzed using high-density oligonucleotide microarrays. Probe set-level hypothesis tests were conducted to identify genes with a significant trend in expression across the three groups, using the Jonckheere-Terpstra test for trend. Interaction networks and functional analyses were used. Unsupervised hierarchical clustering showed that all the IF/TA samples clustered together with high correlation. Gene ontology classified the differentially expressed genes as related to immune response, inflammation, and matrix deposition. Chemokine (CX) and CX receptor (for example, CCL5 and CXCR4) genes and interleukin and interleukin receptor (for example, IL-8 and IL10RA) genes were overexpressed in IF/TA samples compared with normal allografts and normal kidneys. Genes involved in apoptosis (for example, CASP4 and CASP5) were markedly overexpressed in IF/TA. Genes related to angiogenesis (for example, ANGPTL3, ANGPT2, and VEGF) were downregulated in IF/TA. Genes related to matrix production and deposition were upregulated in IF/TA. A distinctive gene expression pattern was observed in IF/TA samples compared with normal allografts and normal kidneys. We were able to establish a trend in gene expression for genes involved in different pathways among the studied groups. The top-scored networks were related to immune response, inflammation, and cell-to-cell interaction, underscoring the importance of chronic inflammation in progressive graft deterioration.
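The Jonckheere-Terpstra trend test used in the study above is not the authors' code, and it lacks a single canonical implementation in common Python libraries; a minimal sketch (my own, using a normal approximation that ignores tie corrections in the variance) is:

```python
import math
from itertools import combinations

def jonckheere_terpstra(groups):
    """Jonckheere-Terpstra test for an ordered trend across k groups.

    `groups` is a list of samples in the hypothesized order (e.g. normal
    kidney -> normal allograft -> IF/TA). Returns the JT statistic and a
    two-sided p-value from the normal approximation. Ties get half weight.
    """
    jt = 0.0
    for gi, gj in combinations(groups, 2):  # every ordered pair of groups
        for x in gi:
            for y in gj:
                if x < y:
                    jt += 1.0
                elif x == y:
                    jt += 0.5
    n = [len(g) for g in groups]
    total = sum(n)
    mean = (total * total - sum(k * k for k in n)) / 4.0
    var = (total * total * (2 * total + 3)
           - sum(k * k * (2 * k + 3) for k in n)) / 72.0
    z = (jt - mean) / math.sqrt(var)
    # two-sided p-value via the standard normal CDF
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return jt, p
```

With a strictly increasing toy trend such as `[[1, 2], [3, 4], [5, 6]]`, the statistic takes its maximum value (12 of 12 concordant pairs) and the approximate p-value falls below 0.05.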
22
Surveillance for hepatocellular carcinoma in patients with cirrhosis improves outcome. Am J Med 2008; 121:119-26. [PMID: 18261500 DOI: 10.1016/j.amjmed.2007.09.020]
Abstract
OBJECTIVE Liver transplantation has become an effective treatment for cirrhotic patients with early-stage hepatocellular carcinoma. We hypothesized that the quality of surveillance for hepatocellular carcinoma influences prognosis by affecting access to liver transplantation. METHODS A total of 269 patients with cirrhosis and hepatocellular carcinoma were retrospectively categorized into 3 groups according to quality of surveillance: standard-of-care (n=172) (group 1); substandard surveillance (n=48) (group 2); and absence of surveillance in patients not recognized to be cirrhotic (n=59) (group 3). RESULTS Three-year survival in the 60 patients who underwent liver transplantation was 81% versus 12% for patients who did not undergo transplantation (P<.001). The percentages of patients who underwent transplantation according to tumor stage at diagnosis (T1, T2, T3, and T4) were 58%, 35%, 10%, and 1%, respectively. Hepatocellular carcinoma was diagnosed at stages 1 and 2 in 70% of patients in group 1, 37% of patients in group 2, and only 18% of patients in group 3 (P <.001). Liver transplantation was performed in 32% of patients in group 1, 13% of patients in group 2, and 7% of patients in group 3 (P<.001). Three-year survival from cancer diagnosis in patients in group 3 (12%) was significantly worse than in patients in group 1 (39%) or group 2 (27%) (each P<.05). Eighty percent of patients in group 3 had subtle abnormalities of cirrhosis on routine laboratory tests. CONCLUSION The quality of surveillance has a direct impact on hepatocellular carcinoma stage at diagnosis, access to liver transplantation, and survival.
23
Abstract
Liver transplantation (LT) in patients with hepatitis B virus (HBV) infection is associated with a high rate of graft loss and poor survival unless re-infection can be prevented. Human hepatitis B immune globulin (HBIG) has long been utilized to prevent re-infection. More recently, an anti-viral agent has been utilized along with HBIG. However, the regimens utilized have varied considerably among LT programmes, and the optimal regimen has never been defined. We conducted a retrospective analysis of 41 patients who underwent LT for HBV at our centre since 1985 and received either HBIG with or without an anti-viral agent. The mean age of these patients was 46 years; 81% were male and 88% white. The mean and maximal follow-up were 5.9 and 15 years respectively. Eight out of 15 E-antigen-positive patients who received HBIG alone developed recurrence after a mean of 17 months. In contrast, none of 10 E-antigen-negative patients who received HBIG alone and none of the 10 E-antigen-positive patients who received both HBIG and either lamivudine or adefovir developed recurrence. As long as anti-HBs (antibody to hepatitis B surface antigen) remained detectable, no absolute minimum serum level appeared necessary to prevent recurrent HBV. We concluded that recurrence of HBV following LT can be prevented in E-antigen-positive patients with a combination of HBIG and an anti-viral agent. In contrast, recurrence can be prevented in E-antigen-negative patients with HBIG alone. Maintaining a serum anti-HBs level above an arbitrary minimum titre of 200 pg/mL did not appear to be necessary for effective HBIG prophylaxis.
24
Abstract
BACKGROUND The effect of hepatitis C virus (HCV) infection on patients undergoing kidney transplantation (KTx) is uncertain. This study aimed to evaluate the outcomes of our HCV+/end-stage renal disease (ESRD) patient population based on the therapeutic option including KTx or continuation in dialysis. METHODS KTx performed at Virginia Commonwealth University Hospital between January 2000 and December 2004 were tracked prospectively. Forty-three out of a total of 394 KTx patients included in the analysis were HCV+. A group of 52 contemporaneous HCV+/ESRD patients listed, but never transplanted, was also analyzed. HCV-negative transplanted patients were used as the control group. RESULTS Patient survival posttransplantation was 81.4% and 68.5% at 1 and 3 years in the HCV+ group, and 97.1% and 92.9% at 1 and 3 years in the HCV- group, respectively (P=0.001). Graft survival was 81.2% and 64.1% at 1 and 3 years in the HCV+ group, and 93.2% and 84.1% at 1 and 3 years posttransplantation in the HCV- group (P=0.01). Univariate analysis identified Knodell score as a predictor of mortality in HCV+ patients (P=0.04). Cox proportional hazards multivariate analysis identified deceased donor (P=0.02), previous kidney transplant (P=0.007), pretransplant diabetes (P=0.05), and Knodell Score (P=0.012) as predictors of patient mortality. Patient survival was superior in HCV+ patients undergoing KTx versus remaining on dialysis. CONCLUSIONS Patients with ESRD/HCV+ benefit from KTx without achieving the excellent survival of HCV-/ESRD patients. Liver biopsy is a useful tool to identify advanced liver disease at pretransplantation time.
25
A prospective evaluation of fibrosis progression in patients with recurrent hepatitis C virus following liver transplantation. Liver Transpl 2007; 13:975-83. [PMID: 17600360 DOI: 10.1002/lt.21117]
Abstract
Recurrence of hepatitis C virus (HCV) following liver transplantation (LT) is universal. A subset of these patients develop advanced fibrosis and cirrhosis, and it is believed that this leads to increased posttransplantation mortality. The specific aims of this study were to determine the incidence of advanced fibrosis and the factors associated with this process, and to evaluate causes of mortality in patients with recurrent HCV. A total of 227 patients who underwent LT with chronic HCV were monitored prospectively. The mean age of this group at LT was 49.5 yr; 76% were male and 85% were Caucasian. Fibrosis progression was monitored by protocol liver biopsy, initially performed 6 months after LT and then at 6- to 24-month intervals. Advanced fibrosis, defined as bridging fibrosis or cirrhosis, developed in 1%, 11%, 25%, and 41% of patients after 1, 3, 5, and 6-10 yr, respectively. Acute cellular rejection, hepatic steatosis, a persistent elevation in serum alanine aminotransferase, and donor race were associated with the development of advanced fibrosis. In contrast, the development of advanced fibrosis was not affected by the use of interferon prior to undergoing LT, cytomegalovirus disease, or donor age. A total of 60 patients (26%) died over 15 yr of follow-up. Although graft failure accounted for 45% of deaths in patients with advanced fibrosis, this represented only 8% of all deaths in patients with recurrent HCV. Sepsis was the most common cause of death, and it was observed with similar frequency in patients who developed advanced fibrosis (45%) and in those with less advanced fibrosis (47%). In conclusion, approximately 41% of patients with recurrent HCV developed advanced fibrosis 6-10 yr after LT. However, complications associated with sepsis, not recurrent cirrhosis, were the most common cause of death in patients with recurrent HCV, and this was similar in patients with or without advanced fibrosis.
26
Multimodality therapy and liver transplantation in patients with cirrhosis and hepatocellular carcinoma: 6 years, single-center experience. Transplant Proc 2007; 39:153-9. [PMID: 17275495 DOI: 10.1016/j.transproceed.2006.10.029]
Abstract
The treatment of patients with cirrhosis and hepatocellular carcinoma (HCC) has improved dramatically over the past 10 years. We conducted a 6-year prospective study using multimodality ablation therapy (MMT) combined with liver transplantation (LTx) for patients with cirrhosis and unresectable HCC. Subjects were classified as: group 1 (n = 35), intention to treat with MMT + LTx; group 2 (n = 16), contemporaneous LTx with "incidental" HCC on explants; group 3 (n = 94), MMT alone; and group 4 (n = 19), palliative care alone. MMT included trans-arterial chemo-embolization (54.4%), trans-arterial chemo-infusion (28.6%), and radiofrequency ablation (17%). Group 1, with a mean wait time of 11.6 months pre-MELD era and 5.4 months post-MELD era, had a mean of 2.4 +/- 1.2 MMTs and achieved 1-, 3-, and 5-year patient survivals of 100, 100, and 76%, respectively, which did not differ from group 2 (incidental HCC; 93, 93, and 93%) or from a contemporaneous non-HCC LTx group (84.3, 78.7, and 73.9%). Despite careful pretransplant HCC staging, 22.8% (8 of 35) of group 1 subjects were understaged. Subjects in group 1 with true T1-2 stage HCC achieved 100% cancer-free survival at 5 years. Only three cases of HCC recurrence occurred in our series, all of whom were understaged. Our data suggest that pretransplant MMT followed by timely LTx provides excellent disease-free survival at 5 years for patients with true T1-2 stage HCC and cirrhosis. Pretransplant HCC understaging contributes to posttransplant HCC recurrence after LTx.
27
Molecular markers in stored kidneys using perfluorocarbon-based preservation solution: preliminary results. Transplant Proc 2006; 38:1243-6. [PMID: 16797273 DOI: 10.1016/j.transproceed.2006.02.109]
Abstract
BACKGROUND Delayed graft function (DGF) is a problem in kidney transplantation, and cold ischemia has been identified as a risk factor. Perfluorocarbons (PFC) have an enhanced ability to dissolve and release oxygen. We evaluated histologic and molecular changes induced by ischemia in kidneys stored with University of Wisconsin (UW) and PFC-based preservation solutions (PFC-UW). MATERIALS AND METHODS ACI rats were used as kidney donors. UW (control group) or PFC-UW (study group) preservation solutions were used for kidney perfusion. All kidneys were stored at 4 degrees C for 12, 24, or 36 hours. After storage, intragraft histology and HO-1 and iNOS mRNA levels were analyzed. RESULTS In the kidneys stored for 24 hours, HO-1 mRNA levels were elevated and iNOS mRNA levels decreased in the study group compared with the control. CONCLUSION We observed overexpression of HO-1 and underexpression of iNOS in kidney tissue stored with PFC-UW solution for 24 hours. These preliminary data suggest that increasing oxygen delivery by adding PFC to the perfusion solution triggers cytoprotective mechanisms in kidney transplantation.
28
Abstract
No long-term (>3 years) prospective comparison of adult-to-adult living donor liver transplantation (A2ALLTx) to adult deceased donor liver transplantation (ADDLTx) has been reported. This is a prospective, IRB approved, 6-year comparison of A2ALLTx to ADDLTx. Data include: age, gender, ethnicity, primary liver disease, waiting time, pretransplant CTP/MELD score, cold ischemia time (CIT), perioperative mortality, acute and chronic rejection, graft and patient survival, charges and post-transplant complications. In 6 years, 202 ADDLTx (74.5%) and 69 A2ALLTx (25.5%) were performed at VCUHS. Hepatitis C virus (HCV) was the most common reason for transplantation in both groups (48.1% vs. 42%). Data regarding overall patient and graft survival, monetary charges and retransplantation rates were similar. Comparison of patient/graft survivals, retransplantation rates in patients with and without HCV were not statistically different. A2ALLTx patients had less acute rejection (11.5% vs. 23.9%) and more biliary complications (26.1% vs. 11.4%). Overall, A2ALLTx is as durable a liver replacement technique as the ADDLTx. Patients with A2ALLTx were younger, had lower MELD scores, less acute rejection and similar histological HCV recurrence. Biliary complications were more common in A2ALLTx but were not associated with increased graft loss compared to ADDLTx.
29
Four-year follow-up of a prospective randomized trial of mycophenolate mofetil with cyclosporine microemulsion or tacrolimus following liver transplantation. Clin Transplant 2004; 18:463-72. [PMID: 15233827 DOI: 10.1111/j.1399-0012.2004.00192.x]
Abstract
BACKGROUND This is a 4-yr follow-up of a trial using mycophenolate mofetil (MMF) induction in orthotopic liver transplantation (OLT). The goal of this study was to evaluate a multidrug approach that would reduce both early and long-term morbidity related to immunosuppression while maintaining an acceptable freedom from rejection. METHODS This was a prospective, randomized, intent to treat study designed to compare the primary endpoints of rejection and infection, and secondary endpoints of liver function, renal function, bone marrow function, cardiovascular risk factors, and the recurrence of hepatitis C. Ninety-nine consecutive patients with end stage liver disease who underwent OLT were randomized to receive either cyclosporine microemulsion (N) (50 patients) or tacrolimus (FK) (49 patients) starting on postoperative day 2, with MMF and an identical steroid taper begun preoperatively. RESULTS Ninety of 99 patients (N 46, FK 44) completed the 4-yr follow-up. The overall 4-yr patient and graft survivals were 93 and 89%, respectively. There was no significant difference in 4-yr patient (N 96% vs. FK 90%, p = ns) or graft (N, 90% vs. FK, 88%, p = ns) survival between groups. The 4-yr rejection rate was not significantly different in either arm (N = 34%, FK = 24%; p = 0.28). There were no differences in infection rates in either arm. The patients with hepatitis C had no differences in the viral titers or Knodell biopsy scores between groups. However, in the hepatitis C subgroup (37 patients), the FK patients had a significantly lower rejection rate (p = 0.0097) and a significantly lower clinically recurrent hepatitis C rate (p = 0.05) than the N patients. No difference was seen in the percent of patients weaned off of steroids after 4 yr (N 51%, FK 49%). There were no differences in the incidences of diabetes mellitus and hypertension. 
When renal dysfunction was analyzed, a significant difference in the number of patients whose creatinine had increased twofold since transplant was seen (N 63%, FK 38%, p = 0.04). CONCLUSIONS Use of MMF induction and maintenance following OLT in conjunction with either N or FK and an identical steroid taper, resulted in an acceptable long-term incidence of rejection and infection, without an increase in long-term graft or patient morbidity.
30
Histologic recurrence of chronic hepatitis C virus in patients after living donor and deceased donor liver transplantation. Liver Transpl 2004; 10:1248-55. [PMID: 15376308 DOI: 10.1002/lt.20232]
Abstract
Hepatitis C virus (HCV) recurs in nearly all patients after liver transplantation. This recurrence is associated with progressive fibrosis and graft loss. It remains unclear whether the natural course of HCV recurrence is altered in patients who undergo living donor liver transplantation (LDLT). We conducted a prospective, controlled trial using protocol liver biopsies to evaluate the histologic outcome of recurrent HCV in 23 patients who underwent LDLT and 53 patients who underwent transplantation with a deceased donor liver (DDLT) during the same period of time. Patients who did not survive at least 6 months after transplantation or who had hepatocellular carcinoma or any other coexistent liver disease were excluded from analysis. All patients underwent protocol liver biopsy at 6 months and at 12 months and at yearly intervals thereafter. The mean age, sex, racial distribution, and serum HCV RNA and the percentage of patients with genotype 1 were similar in the 2 groups of patients. The model for end-stage liver disease score at the time of transplantation was slightly lower in patients who underwent LDLT, but this difference was not significant. The distribution of immunosuppression agents used, the mean doses of calcineurin agents, the use of mycophenolate mofetil, and the dose and tapering schedule for prednisone were similar in both groups of patients. The mean duration of follow-up was 40 months. No significant difference in either graft or patient survival or the percentage of patients who developed acute rejection was noted in the 2 groups of patients. At 48 months, graft and patient survival were 82% and 82% and 75% and 79% for patients who underwent DDLT and LDLT, respectively. The degree of hepatic inflammation increased stepwise over 3 years but was not significantly different in the 2 patient groups. 
In contrast, the mean fibrosis score and the percentage of patients with fibrosis increased stepwise after DDLT but appeared to plateau 12 months after LDLT. At 36 months, fibrosis was present in 78% of DDLT patients, and the mean fibrosis score was 1.9, compared with 59% with fibrosis and a mean score of 0.9 after LDLT. In conclusion, these data strongly suggest that fibrosis progression from recurrent HCV is not more severe in patients after LDLT.
31
Abstract
Most of the few reports about hepatic artery disease found in the literature describe hepatic artery aneurysms or hepatic artery calcifications. Atherosclerosis of the hepatic artery is not commonly evaluated during deceased donor liver procurement. Herein we present the case of a stable 47-year-old Caucasian female donor whose liver function tests were within normal limits and whose liver biopsy showed less than 5% steatosis. When received at our center, the liver appeared grossly unremarkable. Back-table evaluation showed complete occlusion of the trunk of the proper hepatic artery. The pathology report revealed hepatic artery occlusion due to atherosclerosis. Transplantation was canceled, and the liver was used for isolated hepatocyte perfusion, revealing < 25% hepatocyte viability. Hepatic artery atherosclerosis and patency need to be evaluated at the time of procurement, so that this rare finding does not lead to recipient morbidity from anesthetic induction or an abdominal incision begun before the liver graft is declined.
32
Effects of interferon treatment on liver histology and allograft rejection in patients with recurrent hepatitis C following liver transplantation. Liver Transpl 2004; 10:850-8. [PMID: 15237368 DOI: 10.1002/lt.20189]
Abstract
Recurrent hepatitis C after liver transplantation remains a significant cause of graft loss and retransplantation. Although treatment of recurrent hepatitis C with interferon-based regimens has become widely accepted as safe and can lead to sustained virologic clearance of hepatitis C virus (HCV) RNA, long-term histologic improvement and the risk of precipitating graft rejection remain controversial. The present study is a retrospective evaluation of the clinical and histological consequences of treating recurrent hepatitis C with interferon-based therapy in a selected group of liver transplant recipients. Twenty-three liver transplant recipients with recurrent hepatitis C and histologic evidence of progressive fibrosis completed at least 6 months of interferon, 83% of whom received pegylated-interferon alpha-2b; only 4 tolerated ribavirin. Overall, 11 patients (48%) had undetectable HCV RNA at the end of 6 months of treatment. Of these patients, 3 remained HCV RNA-negative on maintenance interferon monotherapy for 33 months, and the other 8 (35%) completed treatment and remained HCV RNA-undetectable 24 weeks after discontinuation of interferon. Overall necroinflammatory activity in liver biopsies obtained 2 years after HCV RNA became undetectable decreased significantly (7.73 +/- 2.37 vs. 5.64 +/- 2.94 units before and after treatment, respectively; P =.016). However, 5 of these 11 patients had no histologic improvement in follow-up liver histology. Liver biopsies in the 12 nonresponders demonstrated disease progression. Of the 23 patients treated with interferon, 8 (35%) had evidence of acute or chronic rejection on posttreatment liver biopsy, most of whom had no previous history of rejection (P <.01 for comparison of pretreatment and posttreatment prevalence of histologic rejection), and 2 experienced graft loss from chronic rejection, requiring retransplantation. 
In conclusion, interferon treatment of recurrent hepatitis C does not consistently improve histologic disease after virologic response, and it may increase the risk of allograft rejection.
33
Reasons for non-use of recovered kidneys: the effect of donor glomerulosclerosis and creatinine clearance on graft survival. Transplantation 2004; 77:1411-5. [PMID: 15167600 DOI: 10.1097/01.tp.0000123080.19145.59]
Abstract
BACKGROUND In 2000, the United Network for Organ Sharing/Organ Procurement and Transplantation Network Registry reported that 540 recovered kidneys were discarded because of biopsy results and 210 because of poor organ function. We compared the percentage of glomerulosclerosis (GS) and creatinine clearance (CrCl) of both discarded and transplanted cadaveric kidneys and examined their effect on graft survival and function. METHODS The cohort consisted of all cadaveric kidneys (n= 3,444) with reported biopsy results between October 25, 1999 and December 31, 2001. Graft survival was calculated by univariate and multivariate models. RESULTS Fifty-one percent of discarded kidneys had GS of less than 20%, 27% had a CrCl greater than 80 mL/min, and 15% (129 kidneys) had both GS less than 20% and a CrCl greater than 80 mL/min. Univariate analyses of kidneys with less than or equal to 20% GS revealed no difference in 1-year graft survival whether the CrCl was greater than, or less than or equal to, 80 mL/min. When GS was greater than 20%, 1-year graft survival of kidneys with a CrCl greater than 80 mL/min was significantly greater than that of kidneys with a CrCl of less than or equal to 80 mL/min. Multivariate results showed no significant difference in relative risk of graft loss with GS greater than 20% versus less than or equal to 20% when the CrCl was either 50 or 80 mL/min. With GS both less than or equal to 20% and greater than 20%, serum creatinine at 1 year was significantly lower in kidneys with CrCl greater than 80 mL/min. CONCLUSIONS Calculated donor CrCl does, and percentage GS on donor kidney biopsies does not, correlate well with 1-year graft survival and function, and percentage GS should not be used as the sole criterion for discarding recovered cadaveric kidneys.
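The abstract does not state how donor CrCl was calculated; the Cockcroft-Gault equation is the estimator most commonly used for this purpose, so a sketch of it is given here for illustration only (the function name and example values are my own):

```python
def cockcroft_gault_crcl(age_years, weight_kg, serum_cr_mg_dl, female=False):
    """Estimate creatinine clearance (mL/min) with the Cockcroft-Gault formula.

    CrCl = (140 - age) * weight / (72 * serum creatinine),
    multiplied by 0.85 for female donors.
    """
    crcl = (140.0 - age_years) * weight_kg / (72.0 * serum_cr_mg_dl)
    return crcl * 0.85 if female else crcl

# Illustrative donor: 50 years, 72 kg, serum creatinine 1.0 mg/dL
# -> (140 - 50) * 72 / (72 * 1.0) = 90 mL/min, above the 80 mL/min cutoff
```

Against the study's 80 mL/min threshold, such a donor would fall in the higher-clearance stratum; the same donor, if female, would be estimated at 76.5 mL/min and fall below it.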
Collapse
|
34
|
Calcineurin inhibitor-induced chronic nephrotoxicity in liver transplant patients is reversible using rapamycin as the primary immunosuppressive agent. Clin Transplant 2003; 16 Suppl 7:49-51. [PMID: 12372044 DOI: 10.1034/j.1399-0012.16.s7.7.x] [Citation(s) in RCA: 50] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/21/2022]
Abstract
The purpose of this study was to determine whether calcineurin inhibitor (CNI)-induced chronic nephrotoxicity in liver transplant patients is reversible by replacement of the CNI with rapamycin as the primary immunosuppressive agent. CNIs, while providing potent immunosuppression for liver transplant patients, exhibit nephrotoxicity as a major side-effect. Whereas acute CNI-induced nephrotoxicity is reversible by withdrawal of the CNI, chronic nephrotoxicity due to CNIs is a progressive process thought to be irreversible. Eight liver transplant patients with CNI-induced chronic nephrotoxicity were converted to rapamycin as the primary immunosuppressive agent. The CNI was either discontinued (four patients) or the dosage lowered to maintain a subtherapeutic level (four patients). Renal function as assessed by serum creatinine was measured before and after conversion to rapamycin. Two patients progressed to dialysis dependence following conversion to rapamycin. These two patients had been on CNIs for a mean of 112 months (range 93-131 months) prior to conversion to rapamycin. Five patients experienced improvement in renal function. These patients had been on calcineurin inhibitors for a mean of 60 months (range 42-75 months) prior to conversion. One patient with chronic nephrolithiasis as a contributing factor to his renal dysfunction has progressed to dialysis dependence despite conversion to rapamycin following exposure to a CNI for 24 months. In the five patients with improved renal function, serum creatinine levels decreased significantly (2.4 +/- 0.3 mg/dL to 1.5 +/- 0.1 mg/dL, p < 0.05) by a mean of 7.2 months (range 5-10 months) after conversion from CNI to rapamycin-based immunosuppression. Liver function remained stable after conversion to rapamycin. CNI-induced chronic nephrotoxicity can be reversed upon withdrawal of the CNI. 
Rapamycin is an effective replacement agent as primary immunosuppressive therapy following withdrawal of CNIs in liver transplant patients with CNI-induced chronic nephrotoxicity.
Collapse
|
35
|
Hepatocellular carcinoma: strategy for optimizing surgical resection, transplantation and palliation. Clin Transplant 2003; 16 Suppl 7:52-8. [PMID: 12372045 DOI: 10.1034/j.1399-0012.16.s7.8.x] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/09/2023]
Abstract
In December 1997, a prospective study with informed consent was initiated to test a neoadjuvant treatment of transcatheter hepatic arterial chemo-embolization (TACE) and thermal or chemical ablation followed by transcatheter hepatic arterial chemo-infusion (TACI) in patients with hepatocellular carcinoma (HCC) referred for transplantation (OLT) and for resection. Patients were staged with American Liver Tumor Study Group-modified tumour-node-metastasis (TNM) staging classification using serial 3-6 month physical exam, alphafetoprotein (AFP), abdominal enhanced MRI, chest CT and bone scan. Sixty-five patients with HCC, out of 508 patients referred for OLT, were divided into five clinical groups and an incidental HCC patient group (n = 8), diagnosed on post-transplant explant pathology. The key focus of study was safety, site of HCC recurrence and tumour free survival. One hundred and thirty three ablation, infusion procedures were performed with an overall 24.8% morbidity, including two septic deaths. There were 13 (21.6%) HCC recurrences in 60 patients having one or more ablative treatments with only 23% hepatic HCC recurrences at 43 months of study. Eighteen HCC patients were listed for OLT (Group 3), with 12 patients transplanted after 29-424 d waiting. Two patients were removed from the OLT list due to HCC metastases, waiting a mean of 145 d. Two patients, post-OLT, had their TNM score upgraded from T2, T3 to T4. No Group 3 post-OLT patient has died or had HCC recurrence at mean follow-up of 27 +/- 15 months. No incidental HCC group post-OLT patient has died or had HCC recurrence at mean follow-up of 24 +/- 14 months. This neoadjuvant protocol is safe and effective in reducing HCC recurrence prior to and after OLT and resection.
Collapse
|
36
|
Abstract
We report the case of a girl with Hardikar syndrome who underwent living-donor liver transplantation at 2 years of age. This disease, described in 1992, includes a constellation of abnormalities, such as cleft lip and palate, pigmentary retinopathy, and multiple tubular stenoses (e.g., bile ducts, ureters). Other system involvement is variable. Rotation anomalies of the gut and cardiac abnormalities are frequently present. Pathogenesis remains obscure. Our patient was delivered at 33 weeks of gestation by cesarean section, and was jaundiced, with low birth weight and height. On day 5 after birth, the patient underwent Ladd's surgery for intestinal malrotation. One month later, she developed pyelonephritis and urosepsis. She remained jaundiced and a liver biopsy revealed cirrhosis with regenerating nodules, portal chronic inflammation with bile duct proliferation, and lobular cholestasis. The patient underwent several corrective operations, and at 12 months of age she was diagnosed with Hardikar syndrome. She failed to thrive and had progressive cholestasis and jaundice, coagulation disorders, bilateral ureterostomies, repetitive urinary tract infections, bilateral cleft lip and palate, retinopathy, and gut malrotation. She received a liver transplant at 24 months of age from a living donor. She has had an excellent clinical outcome in liver function without further decline of growth and development.
Collapse
|
37
|
Prospective validation of quantitative polymerase chain reaction for management of cytomegalovirus disease in solid-organ transplant patients. Transplantation 2002; 74:573-6. [PMID: 12352922 DOI: 10.1097/00007890-200208270-00025] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
This study evaluates the utility of quantitative polymerase chain reaction (QPCR) to determine duration of treatment of transplant patients with human cytomegalovirus (HCMV) disease. Eighteen patients with HCMV disease were prospectively evaluated and followed for recurrence using a QPCR assay. We used plasma samples from which nucleic acid was extracted. Quantification was determined by using an internal standard that contained the same primer sequences as for HCMV. During treatment, weekly QPCR assays were performed. Patients were treated with HCMV immunoglobulin-G for a finite period, but intravenous ganciclovir was continued until less than 100 viral copies (vc) per mL was detectable. After cessation of therapy, patients were followed for 6 months with monthly clinical assessment and QPCR. No patient developed recurrence of HCMV at a mean follow-up of 16 months. This preliminary study suggests that the use of QPCR to assess viral load is useful in deciding the length of HCMV treatment with ganciclovir but requires further randomized validation.
Collapse
|
38
|
Adenosine rinse in human orthotopic liver transplantation: results of a randomized, double-blind trial. INTERNATIONAL JOURNAL OF SURGICAL INVESTIGATION 2002; 1:55-66. [PMID: 11817338] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 02/23/2023]
Abstract
OBJECTIVE The study was aimed at determining whether the vasodilator, adenosine, shown to produce dramatic improvement in liver graft and animal acute and long-term survival, would be beneficial in human liver transplantation. METHODS A prospective, randomized, double-blind trial of an adenosine rinse preservation solution in human orthotopic liver transplantation (OLTX) was conducted in 43 consecutive transplants. Intraoperative and postoperative care was performed by a single transplant team utilizing a quadruple drug immunosuppressive protocol, with complete 5-year patient follow-up. At implantation all allografts were flushed with a 4 degrees C (pH 7.4) Normosol solution, with 0.12 mM adenosine or without adenosine. RESULTS Recipient characteristics were similar in the treated and control groups including age, pre-OLTX diagnosis, and United Network for Organ Sharing (UNOS) status. Donor variables were equivalent in the two groups including age, weight, prothrombin time, and serum chemistries. Operative variables showed no differences except a significant (p = 0.006) reduction of veno-venous bypass time in the adenosine treated group. Liver allograft function improved in the adenosine rinse groups as measured by both postoperative bile production (218 +/- 156cc/24h adenosine vs. 116 +/- 78 cc/24 h without adenosine, p = 0.03) and Factor 7 production at day 3 (64 +/- 26% adenosine vs. 51 +/- 20% without adenosine, p = 0.08). The adenosine treated group had an insignificant 10% patient and graft improvement in survival at 6 months to 60 months compared to the control group. CONCLUSIONS These results suggest that adenosine added to the intraoperative flush solution during human liver transplantation is safe, does not reduce cardiac stability at reperfusion, improves early liver allograft function, but has an insignificant short- and long-term affect on allograft survival.
Collapse
|
39
|
The interrelationship between portal and arterial blood flow after adult to adult living donor liver transplantation. Transplantation 2000; 70:1697-703. [PMID: 11152099 DOI: 10.1097/00007890-200012270-00006] [Citation(s) in RCA: 114] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/20/2022]
Abstract
BACKGROUND When adults are transplanted with segmental grafts, disparity between the size of the graft and the native organ is almost universal. These grafts presumably still receive all of the native portal inflow despite a reduced vascular bed and dramatically elevated blood flow may result. The hemodynamic changes after segmental transplantation in adults have not yet been studied and their clinical significance is unknown. METHODS Portal venous and hepatic arterial blood flow were measured intraoperatively in right lobe liver donors and recipients with electromagnetic flow probes. Postoperative evolution was monitored in recipients with ultrasonography. RESULTS Portal flow to the right lobe ranged from 601 to 1,102 ml/min before resection and from 1,257 to 2,362 ml/min after transplantation. There was a statistically significant linear correlation between the change in portal flow and graft to recipient body weight ratio. Arterial blood flow ranged from 213 to 460 ml/min before resection and from 60 to 300 ml/min after transplantation. Preoperative portal peak systolic velocity was uniformly around 10 cm/sec. Values on postoperative day 1 were increased to 30 cm/sec in recipients of cadaveric organs, to 50 cm/sec in recipients of organs with graft to recipient body weight ratios of more than 1.2%, and to 115 cm/sec in recipients of organs with ratios less than 0.9%. A decreasing tendency was universally observed. Arterial systolic velocity was inversely related to portal systolic velocity. Neither graft dysfunction nor vascular complications occurred. CONCLUSIONS The hemodynamic pattern after right lobe transplantation is predictable and intraoperative measurements and ultrasonography are useful for monitoring. The size of the graft influences the magnitude of the hemodynamic changes.
Collapse
|
40
|
Abstract
BACKGROUND The shortage of cadaveric livers has sparked an interest in adult-to-adult living donor transplantation. Right lobe donor hepatectomy is frequently required to obtain a graft of adequate size for adult recipients. Careful donor selection is necessary to minimize complications and assure a functional graft. METHODS A four-step evaluation protocol was used for donor selection and satisfactory results of all tests in each step were required before proceeding to the next. Donors were selected based on a battery of laboratory studies chosen to exclude unrecognized infection, liver disease, metabolic disorders, and conditions representing undue surgical risk. Imaging studies included ultrasonography, angiography, magnetic resonance imaging, and intraoperative cholangiography and ultrasonography. The information obtained from liver biopsy was used to correct the estimated graft mass for the degree of steatosis. RESULTS From March 1998 to August 1999, 126 candidates were evaluated for living donation. A total of 35 underwent donor right lobectomy with no significant complications. Forty percent of all donors that came to surgery were genetically unrelated to the recipient. A total of 69% of those evaluated were excluded. ABO incompatibility was the primary reason for exclusion after the first step (71%) and the presence of steatosis yielding an inadequate estimated graft mass after the second step (20%). CONCLUSIONS Donor selection limits the application of living donor liver transplantation in the adult population. Unrelated individuals increase the size of the donor pool. Right lobe hepatectomy can be performed safely in healthy adult liver donors. Preoperative liver biopsy is an essential part of the evaluation protocol, particularly when the estimated graft mass is marginal.
Collapse
|
41
|
Abstract
OBJECTIVE To review the anatomical variations of the right lobe encountered in 40 living liver donors, describe the surgical management of these variations, and summarize the results of these procedures. SUMMARY BACKGROUND DATA Anatomical variability is the rule rather than the exception in liver and biliary surgery. To make effective use of liver segments from living donors for transplantation, surgical techniques must be adapted to the anomalies. METHODS Donor evaluation included celiac and mesenteric angiography with portal phase, magnetic resonance angiography, and intraoperative ultrasonography and cholangiography. Arterial anastomoses were generally between the donor right hepatic artery and the recipient main hepatic artery. Jump-grafts were constructed for recipients with hepatic artery thrombosis, and double donor arteries were joined to the bifurcation of the recipient hepatic artery. The branches of a trifurcated donor portal vein were isolated during the parenchymal transection, joined in a common cuff, and anastomosed to the recipient main portal vein. Significant accessory hepatic veins were preserved, brought together in a common cuff if multiple, and anastomosed to the recipient cava. The bile ducts were individually drained through a Roux-en-Y limb, and stents were placed in most patients. RESULTS Forty right lobe liver transplants were performed between adults. No donor was excluded because of prohibitive anatomy. Seven recipients had a prior transplant and five had a transjugular intrahepatic portosystemic shunt (TIPS). Arterial anomalies were noted in six donors and portal anomalies in four. Arterial jump-grafts were required in three. Sixteen had at least one significant accessory hepatic vein, and one had a double right hepatic vein. There were no vascular complications. Multiple bile ducts were found in 27 donors. Biliary complications occurred in 33% of patients without stents and 4% with stents. 
CONCLUSIONS Anatomical variations of the right lobe can be accommodated without donor complications or complex reconstruction. Previous transplantation and TIPS do not significantly complicate right lobe transplantation. Microvascular arterial anastomosis is not necessary, and vascular complications should be infrequent. Biliary complications can be minimized with stenting.
Collapse
|
42
|
Emergency portacaval shunt for control of hemorrhage from a parenchymal fracture after adult-to-adult living donor liver transplantation. Transplantation 2000; 69:2218-21. [PMID: 10852631 DOI: 10.1097/00007890-200005270-00049] [Citation(s) in RCA: 19] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
As more adults undergo transplantation with partial liver grafts, the unique features of these segments and their clinical significance will become apparent. A patient presented with life-threatening hemorrhage from an iatrogenic laceration to a right lobe graft 11 days after transplantation. The creation of a portacaval shunt effectively controlled the bleeding, allowing more elective replacement of the organ with another right lobe graft. The regeneration process combined with increased portal blood flow and relative outflow limitation may have set the stage for this complication. Any disruption of the liver parenchyma during transplantation should be securely repaired and followed cautiously. Portacaval shunting is an option for controlling hemorrhage from the liver in transplant recipients. The timely availability of a second organ was likely the ultimate determinant of survival for this patient.
Collapse
|
43
|
Abstract
BACKGROUND The high mortality rate associated with fulminant hepatic failure combined with the limited availability of cadaveric organs requires consideration of alternatives to conventional cadaveric transplantation. Use of the donor right lobe in adult-to-adult living donor transplantation holds promise in a variety of circumstances, including high-acuity situations. METHODS A 28-year-old male with fulminant hepatic failure secondary to hepatitis B was referred to our institution. He rapidly progressed to grade IV encephalopathy, and laboratory values were indicative of a poor prognosis without transplantation. He was listed for transplantation as UNOS status I. Three siblings were simultaneously evaluated for living liver donation. Following established protocols, we completed donor evaluation in less than 24 hr, and donor right lobectomy and living donor transplantation were performed within 36 hr of the recipient's admission to our center. RESULTS The donor surgery was uncomplicated, and the patient was discharged on postoperative day 4. The recipient experienced full recovery and was discharged home on postoperative day 14. Of note, the first offer for a cadaveric liver came more than 60 hr after living donor transplantation. CONCLUSIONS Thorough donor workup can be completed in less than 24 hr without inappropriate abbreviation of the evaluation. Simultaneous workup of willing individuals prevents unnecessary delay. Living donor transplantation should be considered for patients with fulminant hepatic failure who are appropriate transplant candidates.
Collapse
|
44
|
Abstract
The first adult-to-adult living donor liver transplant using the right hepatic lobe in the United States was performed only 2 years ago. Although initial reports were encouraging, continuous review of the results and appropriate modifications in patient management will be necessary to minimize donor risk and optimize recipient outcome. The results of 40 such transplantations were analyzed and are summarized. Recipients were listed for transplantation according to the usual criteria. Living donors were not considered for United Network for Organ Sharing status IIA patients after the initial 22 patients. Donor evaluation followed a rigid protocol. A graft-to-recipient body weight ratio of at least 0.8% was the minimum required throughout most of the study. The surgical procedures were similar, except the plane of transection was modified to better accommodate donor biliary anatomy, and uniform stenting of bile ducts was practiced after the first 10 transplants. Immunosuppression consisted of tacrolimus, mycophenolate mofetil, and a prednisone taper. The target tacrolimus level was decreased and mycophenolate was withdrawn more rapidly in the second half of the study because of the absence of acute cellular rejection. Donor morbidity has been limited to minor complications, and transplant recipient biliary complications decreased from 35% to 0%. Acute cellular rejection has not been observed despite less aggressive immunosuppression, and septic complications decreased dramatically. There have been no recipient deaths since these changes were instituted. Right lobectomy can be performed safely in the donor population. Recipient biliary complications can be minimized with stenting. Less aggressive immunosuppression is well tolerated and minimizes septic complications and attributable mortality.
Collapse
|
45
|
Liver regeneration and function in donor and recipient after right lobe adult to adult living donor liver transplantation. Transplantation 2000; 69:1375-9. [PMID: 10798757 DOI: 10.1097/00007890-200004150-00028] [Citation(s) in RCA: 236] [Impact Index Per Article: 9.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/22/2022]
Abstract
BACKGROUND Regeneration of the liver to a predetermined size after resection or transplantation is a well described phenomenon, but the time course over which these events occur has not been well defined. It is not clear how initial liver mass, reperfusion, immunosuppression, or steatosis influence this process. METHODS Liver regeneration was assessed prospectively by volumetric magnetic resonance imaging (MRI) in living right lobe liver donors and the recipients of these grafts. Imaging was performed at regular intervals through 60 days after resection/transplantation, and liver mass was determined. Liver function tests and synthetic function were monitored throughout the study period in donors and recipients of these grafts as well as recipients of cadaveric grafts. RESULTS MRI consistently overestimated liver mass by a mean of 45 g (+/-65) (range 10-123). Donor liver mass increased by 101%, 110%, 115%, and 144% at 7, 14, 30, and 60 days after resection, respectively. Recipient liver mass increased by 87,101, 119, and 99% at 7, 14, 30, and 60 days after transplantation, respectively. Steatosis did not influence the degree of regeneration or graft function, nor was there a functional difference between grafts of >1% graft to recipient body weight ratio or <1%. CONCLUSIONS MRI accurately determines right lobe mass. Most liver regeneration occurs in the 1st week after resection or transplantation, and the time course does not differ significantly in donors or recipients. The mass of the graft or remnant segment affects the duration of the regeneration process, with a smaller initial liver mass prolonging the course. Steatosis of <30% had no bearing on liver function or regeneration and, therefore, should not be an absolute criterion for exclusion of donors. A calculated graft to recipient body weight ratio of 0.8% is adequate for right lobe living donor liver transplantation.
Collapse
|
46
|
Abstract
BACKGROUND The shortage of livers for transplantation has prompted transplant centers to seek alternatives to conventional cadaveric liver transplantation. Left lateral segmentectomy from living donors has proven to be a safe operation for the donor with excellent results in the pediatric population. Left lobectomy, conceived to supply more tissue, still provides insufficient liver mass for an average size adult patient. Right lobectomy could supply a graft of adequate size. METHODS Donors were considered only after recipients were listed according to United Network for Organ Sharing (UNOS) criteria. Donor evaluation included liver biopsy, magnetic resonance imaging, and celiac and mesenteric angiography. The donor operation consisted of a right lobectomy uniformly performed throughout the series as described herein. RESULTS Twenty-five right lobe living donor liver transplants were performed between adults, with no significant complications in donors. Recipient and graft survival was 88%, with three recipient deaths secondary to uncontrolled sepsis in patients at high risk for liver transplant; all three had functioning grafts. CONCLUSIONS Right lobe living donor liver transplantation poses challenges that require a meticulous surgical technique to minimize morbidity in the recipient. Right lobectomies for living donation can be performed safely with minimal risk to both donor and recipient although providing adequate liver mass for an average size adult patient.
Collapse
|
47
|
Intragraft cytokine expression in tolerant rat renal allografts with rapamycin and cyclosporin immunosuppression. Clin Transplant 1999; 13:90-7. [PMID: 10081643 DOI: 10.1034/j.1399-0012.1999.130105.x] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022]
Abstract
The Th-1/Th-2 paradigm proposes clonal expansion of Th-2 lymphocytes as the basis of tolerance towards allografts. Intragraft cytokine expression was evaluated in a highly stringent model of renal transplantation. ACI and Lewis rats were used as donors and recipients, respectively, for heterotopic renal transplantation. Group A (n = 8) received a single dose of rapamycin and cyclosporin 12 h prior to engraftment, followed by 7 d of cyclosporin post-operatively. Isografts (Group B, n = 5) and control allografts (Group C, n = 4) received no immunosuppression. Sacrifice was performed after 120 d. Intragraft expression of IL-10, IL-4, and IFN-gamma was determined using qualitative reverse transcriptase-polymerase chain reaction (RT-PCR). All groups had functionally normal grafts at sacrifice, with 50% histological tolerance among Group A animals. No isografts showed evidence of cellular infiltrate, and all control allografts showed severe rejection. IL-10 was only detected in the tolerant animals (p < 0.001). Similarly, IL-4 was detected predominantly in the tolerant allografts (p < 0.05). IFN-gamma was only isolated in rejected allografts, whether treated or untreated (p < 0.001). We conclude that the expansion of Th-2 cells is associated with tolerance, while the expansion of Th-1 cell is associated with acute cellular rejection.
Collapse
|
48
|
Abstract
BACKGROUND Intragraft cytokine expression was evaluated in a model of renal transplantation. ACI and Lewis rats were used as donors and recipients, respectively, for heterotopic renal transplantation. METHODS Treated allograft rats (n=10) received a preoperative dose of rapamycin and cyclosporine, followed by 7 days of cyclosporine postoperatively. Isograft rats (n=5) and control allograft rats (n=4) received no immunosuppression. Sacrifice was performed after 120 days. Expression of interleukin (IL)-4, IL-10, and interferon-gamma (IFN-gamma) transcripts was determined with semiquantitative reverse transcriptase-polymerase chain reaction. RESULTS All treated allograft rats had normal function with 50% histologic rejection. All isografts had normal function. IL-4 and IL-10 were in greater density in allografts with normal histology, whereas IFN-gamma was only seen in allografts with cellular rejection. No IL-10 was seen in isografts, but IL-4 was detected in 3/5 isografts. CONCLUSIONS We conclude that the lymphocyte population's elaboration of IL-4 and IL-10 is associated with tolerance, whereas the production of IFN-gamma and absence of IL-4 is associated with histology suggestive of acute cellular rejection.
Collapse
|
49
|
A prospective randomized trial of mycophenolate mofetil with neoral or tacrolimus after orthotopic liver transplantation. Transplantation 1998; 66:1616-21. [PMID: 9884248 DOI: 10.1097/00007890-199812270-00008] [Citation(s) in RCA: 69] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/01/2023]
Abstract
BACKGROUND The success of liver transplantation in this decade has become the stimulus to extend the donor and recipient pool. Reducing early posttransplant morbidity to maintain our success, as we expand our frontiers, has led us to focus on balanced testing of multidrug immunosuppression regimens. METHODS A prospective trial in orthotopic liver transplantation using Mycophenolate Mofetil and an identical steroid taper with randomization of patients to Neoral (N) or Tacrolimus (FK) is the basis of this report. This was an intent-to-treat study designed to compare the 6-month primary endpoints of rejection and infection and to compare the 6-month secondary endpoints of liver function, renal function, bone marrow function, hypertension, and serum cholesterol levels. RESULTS Ninety-seven patients completed the 6-month follow-up period (N=49, FK=48). The actual 6-month patient and graft survival rates were 98% and 94%, respectively. There was no difference in the number of patients with rejection episodes (N=11, FK=8) (P=0.61). There were 24 infections (3 cytomegalovirus) in the FK group and 30 infections (9 cytomegalovirus) in the N group. The cholesterol levels at 6 months were not significantly different (P=0.07) between the groups. The other secondary 6-month endpoints were not significantly different, except total bilirubin, which was lower in the FK arm (P=0.02). CONCLUSIONS The use of Mycophenolate Mofetil with N or FK and an identical steroid taper after orthotopic liver transplantation is associated with excellent graft and patient survival, and at 6 months, only 191% of the patients experienced rejection, with a 48% overall infection rate.
Collapse
|
50
|
Abstract
BACKGROUND One of the proposed mechanisms of tolerance induction is the Th-1/Th-2 paradigm. The Th-1 cell is proinflammatory, secreting IFN-gamma and IL-2. Conversely, the Th-2 cell is anti-inflammatory, secreting IL-4 and IL-10. In our earlier studies a shift toward Th-2 dominance was required for tolerance induction in this model. MATERIALS AND METHODS ACI and Lewis rats were used as donors and recipients, respectively. Twelve hours prior to engraftment, rapamycin 1.5 mg/kg po and cyclosporin 10 mg/kg sc were given, followed by 5 mg/kg sc postop (days 1-7). Lewis rats were used as isografts. Functional allograft tolerance was induced consistently in 100% of the recipients with 50% of the allografts exhibiting normal histology beyond 120 days. Qualitative RT-PCR was performed on the grafts to determine IFN-gamma expression with beta-actin housekeeping gene as control. RESULTS IFN-gamma was expressed in all untreated allografts (5/5) and all treated, yet rejecting, allografts (4/4). None of the isografts (0/5) or histologically tolerant allografts (0/4) expressed IFN-gamma. This distribution was statistically significant (P < 0.001, Fischer's exact test). CONCLUSION Our findings support a shift from Th-2 to Th-1 predominance as the corollary mechanism responsible for preventing histologic tolerance.
Collapse
|