1
Geographic inequity results in disparate mortality: a multivariate intent-to-treat analysis of liver transplant data. Clin Transplant 2015; 29:484-91. PMID: 25530463. DOI: 10.1111/ctr.12499.
Abstract
CONTEXT The distribution of livers to listed transplant candidates shows substantial geographic inequity. OBJECTIVE To compare mortality between the 11 UNOS (United Network for Organ Sharing) regions from the time of listing and to show that geographic region impacts survival. DESIGN, SETTING, AND PATIENTS We studied the data of 1930 adults listed with a Model for End-Stage Liver Disease (MELD) score of 18 for a liver transplant from March 1, 2002 through December 31, 2007. We calculated one- and three-yr survival rates and performed multivariate Cox regression analysis to determine significant risk factors for mortality. MAIN OUTCOME MEASURES Patient survival from the time of listing for transplantation. RESULTS Actual one-yr mortality rate from the time of listing ranged from 30.5% (Region 2) to 12.9% (Region 4). The three-yr mortality rate ranged from 42.0% (Region 2) to 21.6% (Region 4). Multivariate analysis showed a significant increase in mortality in Region 2 (odds ratio [OR], 1.49; 95% confidence interval [CI], 1.21 to 1.83) and a significant decrease in mortality in Region 3 (OR, 0.74; 95% CI, 0.59 to 0.93). CONCLUSIONS We found significant differences in one- and three-yr mortality rates among UNOS regions. Regional disparities significantly affect patient survival and result in national inequality.
2
Abstract
BACKGROUND Human islet allotransplantation for the treatment of type 1 diabetes is in phase III clinical trials in the U.S. and is the standard of care in several other countries. Current islet product release criteria include viability based on cell membrane integrity stains, glucose-stimulated insulin release, and islet equivalent (IE) dose based on counts. However, only a fraction of patients transplanted with islets that meet or exceed these release criteria become insulin independent following 1 transplant. Measurements of islet oxygen consumption rate (OCR) have been reported as highly predictive of transplant outcome in many models. METHOD In this article we report on the assessment of clinical islet allograft preparations using OCR dose (or viable IE dose) and current product release assays in a series of 13 first transplant recipients. The predictive capability of each assay was examined and successful graft function was defined as 100% insulin independence within 45 days post-transplant. RESULTS OCR dose was most predictive of CTO. IE dose was also highly predictive, while glucose-stimulated insulin release and membrane integrity stains were not. CONCLUSION OCR dose can predict CTO with high specificity and sensitivity and is a useful tool for evaluating islet preparations prior to clinical human islet allotransplantation.
3
Abstract
Outcomes of intestinal transplants (ITx; n = 977) for pediatric patients are examined using the United Network for Organ Sharing data from 1987 to 2009. Recipients were divided into four age groups: (1) <2 years of age (n = 569), (2) 2-6 years (n = 219), (3) 6-12 years (n = 121) and (4) 12-18 years (n = 68). Of 977 ITx, 287 (29.4%) were isolated ITx and 690 (70.6%) were liver and ITx (L-ITx). Patient survival for isolated ITx at 1, 3 and 5 years, 85.3%, 71.3% and 65.0%, respectively, was significantly better than for L-ITx, 68.4%, 57.0% and 51.4%, respectively (p = 0.0001); this was true for all age groups, except for patients <2 years of age. The difference in graft survival between isolated ITx and L-ITx was significant at 1 and 3 years (Wilcoxon test, p = 0.0012). In an attrition analysis of patients who survived past the first, third and fifth years, graft survival for L-ITx patients was significantly better than that for isolated ITx. Isolated ITx should be considered early, before the onset of liver disease, in children >2 years of age with intestinal failure, but is not advantageous in patients <2 years of age.
4
Effect of epidural analgesia on postoperative complications following pancreaticoduodenectomy. Am J Surg 2012; 204:1000-4; discussion 1004-6. DOI: 10.1016/j.amjsurg.2012.05.022.
5
Pancreas after living donor kidney transplants in diabetic patients: impact on long-term kidney graft function. Clin Transplant 2009; 23:437-46. PMID: 19496790. DOI: 10.1111/j.1399-0012.2009.00998.x.
Abstract
In this single-institution study, we compared outcomes in diabetic recipients of living donor (LD) kidney transplants that did vs. did not undergo a subsequent pancreas transplant. Of 307 diabetic recipients who underwent LD kidney transplants from January 1, 1995, through December 31, 2003, a total of 175 underwent a subsequent pancreas after kidney (PAK) transplant; 75 were deemed eligible (E) for, but did not receive (for personal or financial reasons), a PAK, and thus had a kidney transplant alone (KTA); and 57 deemed ineligible (I) for a PAK because of comorbidity also had just a KTA. We analyzed the three groups (PAK, KTA-E, KTA-I) for differences in patient characteristics, glycemic control, renal function, patient and kidney graft survival rates, and causes of death. Kidney graft survival rates (actuarial) were similar in the PAK vs. KTA-E groups at one, five, and 10 yr post-transplant: 98%, 82%, and 67% (PAK) vs. 100%, 84%, and 62% (KTA-E) (p = 0.9). The long-term (greater than four yr post-transplant) estimated glomerular filtration rate (GFR) was higher in the PAK than in the KTA-E group: 53 +/- 20 mL/min (PAK) vs. 43 +/- 16 mL/min (KTA-E) (p = 0.016). The patient survival rates were also similar for the PAK and KTA-E groups. We conclude that the subsequent transplant of a pancreas after an LD kidney transplant does not adversely affect patient or kidney graft survival rates; in fact, it is associated with better long-term kidney graft function.
6
Red cell aplasia and autoimmune hemolytic anemia following immunosuppression with alemtuzumab, mycophenolate, and daclizumab in pancreas transplant recipients. Haematologica 2007; 92:1029-36. PMID: 17640860. DOI: 10.3324/haematol.10733.
Abstract
BACKGROUND AND OBJECTIVES Acquired red cell aplasia (RCA) is a rare disorder and can be either idiopathic or associated with certain diseases, pregnancy, or drugs. In exceptionally rare cases, it has been reported to co-exist with other autoimmune cytopenias. We report a high incidence of RCA and autoimmune hemolytic anemia (AIHA) in pancreas transplant recipients on alemtuzumab-based maintenance therapy. DESIGN AND METHODS Between February 2003 and July 2005, 357 pancreas transplant recipients were treated with immunosuppressive regimens containing the lymphocyte-depleting antibody alemtuzumab, the T-cell activation inhibitor daclizumab, and the anti-metabolite mycophenolate mofetil (MMF). We retrospectively reviewed medical records, blood bank data and bone marrow biopsy specimens of patients with a Transplant Information Services database diagnosis of RCA and AIHA from February 2003 to November 2005. RESULTS Severe RCA, AIHA, and idiopathic thrombocytopenic purpura (ITP) occurred independently or in combination, in 20 out of 357 (5.6%) pancreas transplant recipients, 12 to 24 months following the initiation of the aforementioned immunosuppressive regimens. Severe opportunistic infections developed late in 14/20 (70%) of these patients. Atypical morphologic features, including variable dysgranulopoiesis, variable megakaryocytic hyperplasia with normal or low peripheral platelet counts, and atypical lymphoid aggregates were found in bone marrow trephine sections of 11 patients in whom the diagnosis of RCA was made. INTERPRETATION AND CONCLUSIONS We hypothesize that the combination of alemtuzumab, daclizumab and MMF can result in immune dysregulation thereby permitting autoantibody formation. Because the use of these three immune suppressants is becoming increasingly common, it is important to recognize the severe hematologic complications that can arise.
MESH Headings
- Adult
- Alemtuzumab
- Anemia, Hemolytic, Autoimmune/chemically induced
- Anemia, Hemolytic, Autoimmune/epidemiology
- Antibodies, Monoclonal/adverse effects
- Antibodies, Monoclonal/therapeutic use
- Antibodies, Monoclonal, Humanized
- Antibodies, Neoplasm/adverse effects
- Antibodies, Neoplasm/therapeutic use
- Autoimmune Diseases/chemically induced
- Autoimmune Diseases/epidemiology
- Bone Marrow/pathology
- Daclizumab
- Female
- Humans
- Immunoglobulin G/adverse effects
- Immunoglobulin G/therapeutic use
- Immunosuppressive Agents/adverse effects
- Immunosuppressive Agents/therapeutic use
- Incidence
- Kidney Transplantation/statistics & numerical data
- Lymphocyte Activation/immunology
- Male
- Middle Aged
- Mycophenolic Acid/adverse effects
- Mycophenolic Acid/analogs & derivatives
- Mycophenolic Acid/therapeutic use
- Opportunistic Infections/epidemiology
- Opportunistic Infections/etiology
- Pancreas Transplantation/statistics & numerical data
- Pilot Projects
- Postoperative Complications/chemically induced
- Postoperative Complications/epidemiology
- Purpura, Thrombocytopenic, Idiopathic/chemically induced
- Purpura, Thrombocytopenic, Idiopathic/epidemiology
- Red-Cell Aplasia, Pure/chemically induced
- Red-Cell Aplasia, Pure/epidemiology
- Retrospective Studies
- T-Lymphocytes/immunology
7
Do inherited hypercoagulable states play a role in thrombotic events affecting kidney/pancreas transplant recipients? Clin Transplant 2007; 21:32-7. PMID: 17302589. DOI: 10.1111/j.1399-0012.2006.00574.x.
Abstract
BACKGROUND Pancreas graft thrombosis remains the leading non-immunologic cause of graft loss after pancreas transplantation. We studied the role of hypercoagulable states (HCS) in pancreas graft thrombosis (pthx). METHODS Between January 1, 1994, and January 1, 2003, 131 pancreas transplant recipients experienced a pthx (n = 67) or other thrombotic events. Fifty-six recipients consented to have their blood drawn and tested for the HCS. These results were compared with a control group of pancreas transplant recipients who did not experience a thrombotic event. Fisher's exact test was used to compare the groups. RESULTS We found 18% of the recipients with pancreas thrombosis to have a HCS. Factor V Leiden (FVL) was found in 15% vs. 4% in the control group (p = ns) vs. 3-5% in the general white population. We found 3% of the pancreas thrombosis patients to have a prothrombin gene mutation (PGM) vs. 0% in the control group (p = ns) vs. 1-2% in the general white population. CONCLUSIONS Of pancreas transplant recipients with thrombosis, 18% had one or more of the most common factors associated with a HCS (FVL or PGM). This can be compared with 4% in a control group and 4-7% in the general white population, respectively. Although the differences are not statistically significant due to small numbers, we feel that the findings may be clinically relevant. While this is only a pilot study, it may be reasonable to screen select pancreas transplant candidates for HCS, especially FVL and PGM, until more data become available.
8
Abstract
BACKGROUND Determining factors associated with a negative slope of inverse creatinine vs. time (1/Cr vs. t) may help prevent a decline in renal allograft function. METHODS A total of 1389 adult recipients of primary renal transplants were divided into quartiles based on the slope of 1/Cr vs. t calculated from 6 and 12 months post transplant. A multivariate analysis of risk factors for being in the worst vs. best quartile employed these variables: donor source, HLA mismatch, recipient age, donor age, panel-reactive antibody (PRA), acute rejection (AR), 3-month cyclosporin A (CsA) level, 1-yr CsA level and acute tubular necrosis. Two separate analyses compared risk factors in patients with 1 and 3 yr survival, respectively. RESULTS In recipients with ≥1 yr graft survival, high PRA and AR were associated with negative slopes of 1/Cr vs. t. For those with ≥3 yr graft survival, both AR and a 3-month CsA level >150 ng/mL were significant risk factors, using both 6- and 12-month slopes. Stratification of AR showed that 1 AR episode at ≥6 months and multiple AR episodes carried significant risk for negative slopes. CONCLUSION Optimization of allograft function poses a conundrum between the need to avoid AR and the need to avoid high early CsA levels. We support a policy of carefully balancing these two risks.
9
A report of the Vancouver Forum on the care of the live organ donor: lung, liver, pancreas, and intestine data and medical guidelines. Transplantation 2006; 81:1373-85. PMID: 16732172. DOI: 10.1097/01.tp.0000216825.56841.cd.
10
Abstract
BACKGROUND The objective of this study was to examine how effectively pancreas transplants provide long-term glucose control in patients with type 2 diabetes mellitus (DM). We used guidelines from the American Diabetes Association (ADA) and the World Health Organization (WHO) to appropriately classify recipients with type 2 DM (vs. type 1 DM). RESULTS From 1994 through 2002, a total of 17 patients with type 2 DM underwent a pancreas transplant at our center. Mean recipient age was 52.5 yr. The mean age at diabetes onset was 35.7 yr; mean duration, 16.8 yr. Most recipients had one or more secondary complications related to their diabetes: retinopathy (94%), neuropathy (76%), or nephropathy (65%). At the time of their transplant, three (18%) were on oral hypoglycemic agents alone and 14 (82%) were on insulin therapy. Of the 17 transplants, seven (41%) were a simultaneous pancreas-kidney transplant (SPK); four (24%), pancreas after kidney transplant (PAK); and six (35%), pancreas transplant alone (PTA). One recipient died during the perioperative period because of aspiration. The other 16 recipients became euglycemic post-transplant and had a functional graft at 1 yr post-transplant (patient and graft survival rates, 94%). Now, with a mean follow-up of 4.3 yr post-transplant, the patient survival rate is 71%. The four additional deaths were because of sepsis (n = 2), suicide (n = 1), and unknown cause (n = 1). All four of these recipients were insulin-independent at the time of death, although one was on an oral hypoglycemic agent. Of the 12 recipients currently alive, 11 remain euglycemic without requiring insulin therapy or oral hypoglycemic agents; one began insulin therapy 1.2 yr post-transplant (current daily dose, 60 units). CONCLUSION These findings suggest that pancreas transplants can provide excellent glucose control in recipients with type 2 DM. All 16 (94%) of our recipients whose transplant was technically successful were rendered euglycemic. Long-term results were comparable with those seen in transplant recipients with type 1 DM.
11
Late anastomotic leaks in pancreas transplant recipients - clinical characteristics and predisposing factors. Clin Transplant 2005; 19:220-4. PMID: 15740558. DOI: 10.1111/j.1399-0012.2005.00322.x.
Abstract
BACKGROUND Anastomotic leaks after pancreas transplants usually occur early in the postoperative course, but may also be seen late post-transplant. We studied such leaks to determine predisposing factors, methods of management, and outcomes. RESULTS Between January 1, 1994 and December 31, 2002, a total of 25 pancreas transplant recipients at our institution experienced a late leak (defined as one occurring more than 3 months post-transplant). We excluded recipients with an early leak or with a leak seen immediately after an enteric conversion. The mean recipient age was 40.3 yr; mean donor age, 31.3 yr. The category of transplant was as follows: simultaneous pancreas-kidney (n = 5, 20%), pancreas after kidney (n = 10, 40%), and pancreas transplant alone (n = 10, 40%). At the time of their leak, most recipients (n = 23, 92%) had bladder-drained pancreas grafts; only two recipients (8%) had enteric-drained grafts. The mean time from transplant to the late leak was 20.5 months (range = 3.5-74 months). A direct predisposing event or risk factor occurring in the 6 wk preceding leak diagnosis was identified in 10 (40%) of the recipients. Such events or risk factors included a biopsy-proven episode of acute rejection (n = 4, 16%), a history of blunt abdominal trauma (n = 3, 12%), a recent episode of cytomegalovirus infection (n = 2, 8%), and obstructive uropathy from acute prostatitis (n = 1, 4%). Non-operative or conservative care (Foley catheter placement with or without percutaneous abdominal drains) was the initial treatment in 14 (56%) of the recipients. Such care was successful in nine (64%) of the 14 recipients; the other five (36%) required surgical repair after failure of conservative care at a mean of 10 d after Foley catheter placement. Of the 25 recipients, 11 underwent surgery as their initial leak treatment: repair in nine and pancreatectomy because of severe peritonitis in two. After appropriate management (conservative or operative) of the initial leak, five (20%) of the 25 recipients had a recurrent leak; the mean length of time from initial leak to recurrent leak was 5.6 months. All five recipients with a recurrent leak ultimately required surgery. CONCLUSIONS Late anastomotic leaks are not uncommon; they may be more common with bladder-drained grafts. One-third of the recipients with a late leak had experienced some obvious preceding event that predisposed to the leak. For two-thirds of our stable recipients with bladder-drained grafts, non-operative treatment of the leak was successful.
12
A comparison of surgical outcomes and quality of life surveys in right lobe vs. left lateral segment liver donors. Am J Transplant 2005; 5:805-9. PMID: 15760405. DOI: 10.1111/j.1600-6143.2005.00767.x.
Abstract
Concern remains regarding the possibly higher risk to living liver donors of the right lobe (RL), as compared with the left lateral segment (LLS). We studied outcomes and responses to quality of life (QOL) surveys in the two groups. From 1997 to 2004, we performed 49 living donor liver transplants (LDLTs): 33 RL and 16 LLS. Notable differences included a higher proportion of female and unrelated donors in the RL group. A significantly larger liver mass was resected in RL (vs. LLS) donors: 720 (vs. 310) g, p = 0.01; RL donors also had greater blood loss (398 vs. 240 mL, p = 0.04) and operative times (7.2 vs. 5.7 h, p = 0.05). However, those findings did not translate into significant differences in donor morbidity. The complication rate was 12.5% in LLS donors and 9.1% in RL donors (p = ns). Per a QOL survey at 6 months postdonation, no significant differences were noted in SF-12 scores for the two groups. Recovery times were somewhat longer for RL donors. Mean time off work was 61.0 days for RL donors and 32.4 days for LLS donors (p = 0.004). RL donation is associated with greater operative stress for donors, but not necessarily with a more complicated recovery or differences in QOL.
13
Abstract
Delayed graft function (DGF) occurs after many pancreas transplants (PTx), but is poorly characterized. We studied its incidence, course, and impact in a series of 531 pancreas transplants. Between January 1997 and September 2002, we performed 531 technically successful primary PTx. Of these 531 recipients, 176 (33%) had DGF, defined by their need for exogenous insulin at the time of hospital discharge. The incidence of DGF was roughly equivalent in the three transplant categories: SPK (36%), PAK (32%), and PTA (31%) (p = NS). By 3 months posttransplant, only 19 (3.5%) of all recipients remained on insulin. Only three recipients (0.56%) did not achieve insulin independence. The mean donor age of recipients with DGF was 35.1 years vs. 28.8 years without DGF (p = 0.003). By multivariate analysis, the most significant risk factor for DGF was donor age > 45 years (RR = 4.3, p = 0.0001). For SPK recipients with DGF, graft survival was 87% at 1 year and 82% at 3 years posttransplant; without DGF, 94% at 1 year and 87% at 3 years (p = 0.07). For PAK and PTA recipients, no difference was noted. Acute rejection rates were somewhat higher in recipients with DGF, but this did not reach statistical significance.
14
A prospective, randomized trial of steroid-free maintenance vs. delayed steroid withdrawal using a sirolimus/tacrolimus regimen in simultaneous pancreas-kidney transplants (SPK). Transplantation 2003. DOI: 10.1097/00007890-200308271-00051.
15
Pancreas transplantation in cross-match-positive recipients using cyclosporine- or tacrolimus-based immunosuppression. Transplant Proc 2002; 34:1901-2. PMID: 12176621. DOI: 10.1016/s0041-1345(02)03116-0.
16
Abstract
Biliary stricture is a graft complication that occurs late in the post-operative period and appears to occur with increased frequency in living-related donor liver transplantation (LRD LTx). We reviewed the experience at the University of Minnesota in managing biliary complications of LRD LTx. Since January 1997, 13 LRD transplants have been performed using the technique of transplantation of the left lateral segments with a small portion of segment IV. All patients had hepaticojejunostomies using a Roux-en-Y loop. Of the 11 surviving patients, eight had evidence of cholangitis (Gram-negative sepsis, two patients; ascending cholangitis, three patients; or unexplained fever with elevated liver enzymes, three patients) 4-8 months after otherwise successful transplantation. Six of the patients underwent percutaneous transhepatic cholangiography (PTC) with demonstration of a stenosis at the site of the biliary anastomosis. Repeated dilation of the anastomosis led to resolution of the stenoses, normalization of liver enzymes, and prevention of further episodes of infection. No patient required revision of the hepaticojejunostomy. Computed axial tomography evidence of ductal stenosis may be subtle in this group of patients, but PTC is diagnostic. We suggest a high index of suspicion of biliary stricture in the LRD LTx population. Biliary dilation reduces the risk of life-threatening sepsis.
17
Abstract
The shortage of cadaver donor livers has been most severe for adult patients. Split liver transplantation is one method to expand the donor pool, but to have a significant impact on the waiting list, it needs to be applied for 2 adult recipients. We split livers from 6 cadaver donors, and transplanted 12 adult recipients. All splits were performed in situ with transection through the midplane of the liver, resulting in a right lobe and a left lobe graft. Mean donor age was 19.7 years; mean donor weight was 79.1 kg. Mean recipient age was 41.5 years. Mean weight of right lobe recipients was 89 kg; left lobe recipients, 60 kg. All donors were hemodynamically stable and had normal liver function tests. Mean operative time for the procurement was 7.4 h. Average blood loss during the transection of the liver was 490 mL. The mean graft weight/recipient weight (GW/RW) ratio for all recipients was 0.87%; right lobe recipients, 0.86%; and left lobe recipients, 0.88%. With mean follow-up of 9.3 months, patient and graft survival rates were both 83.3%. There were 2 deaths: 1 after hepatic artery thrombosis (HAT) and subsequent multiorgan failure; the other after HAT, a liver retransplant, and subsequent gram-negative sepsis. The remaining 10 recipients are doing well. We observed no cases of primary nonfunction. Other complications included bile leak and/or stenosis (n = 3), bleeding from the Roux loop (n = 1), bleeding after percutaneous biopsy (n = 1), and incisional hernia (n = 1). In conclusion, split liver transplantation, using 1 cadaver liver for 2 adult recipients, can be performed successfully. Crucial to success is proper donor and recipient selection.
18
5,000 kidney transplants--a single-center experience. Clinical Transplants 2001:159-71. PMID: 11512309.
Abstract
Between 6/1963 and 12/1998, 5,069 kidney transplants were done at the University of Minnesota. Of these, about half have been living donor, half cadaver. The majority (83%) have been primary transplants. Recipients were grouped in 6 eras based on changes in our immunosuppressive protocols--6/63-12/67 (n = 98); 1/68-7/79 (n = 1,188); 8/79-6/84 (n = 789); 7/84-9/90 (n = 1,006); 10/90-12/95 (n = 1,050); 1/96-12/98 (n = 718)--and their outcomes were compared. Recent eras contained a higher proportion of recipients aged > 50. Since the inception of the program, there has been a steady improvement in actuarial patient survival, graft survival, and death-censored graft survival. Short-term outcome for primary and retransplant recipients has been similar; however, long-term outcome seems worse for retransplant recipients. Importantly, acute rejection and infectious death have become rare causes of graft loss. Chronic rejection and death with function (most often due to a cardiovascular event) have become the predominant causes of graft loss. Recent changes in immunosuppressive protocols (Era VI) have included more aggressive attempts to maintain CsA levels > 150 ng/ml (by HPLC) in the first 3 months and the substitution of mycophenolate mofetil for azathioprine. As a result, the incidence of acute and chronic rejection has decreased and graft survival has improved.
19
Abstract
UNLABELLED Steroids are associated with significant postoperative complications (hypertension, cosmetic changes, bone loss, hyperlipidemia, diabetes, and cataracts). Most develop early; in addition, late post-transplant steroid withdrawal in kidney transplant recipients has been associated with increased acute rejection (AR). To obviate these problems, we studied the outcome of a protocol of rapid discontinuation of prednisone (RDS) (steroids stopped on POD6). Between November 1, 1999 and October 31, 2000, 51 adult living donor (LD) first transplant recipients (2 HLA-id, 28 non-id relative, 21 LURD) were immunosuppressed with thymoglobulin (1.25 mg/kg intraoperatively and then qdx4); prednisone (P) (500 mg methylprednisolone intraoperatively, 1 mg/kg x 1 day, 0.5 mg/kg x 2 days, 0.25 mg/kg x 2 days, then d/c); MMF, 1 g b.i.d.; and CSA, 4 mg/kg b.i.d. adjusted to achieve levels of 150-200 ng/mL (by HPLC). Exclusion criteria were delayed graft function or primary disease requiring P. Minimum follow-up was 5.5 months (range 5.5 to 17.5 months). Outcome was compared vs. previous cohorts of LD recipients immunosuppressed with P/AZA/CSA (n = 171) or P/MMF/CSA (n = 43) (both without antibody induction). RESULTS For the RDS group, the average CSA level (+/- S.E.) at 3 and 6 months was 190 +/- 12 and 180 +/- 9; avg. MMF dose, 1.7 +/- 0.1 g and 1.7 +/- 0.1 g. There was no significant difference in 6- and 12-month actuarial patient survival, graft survival and rejection-free graft survival between recipients on the RDS protocol vs. historical controls. For RDS recipients, actuarial 6- and 12-month rejection-free graft survival was 87%. Of the 51 RDS recipients, five (10%) have had AR (at 20 days, 1 month, 3 months, 3 months, and 3.5 months post-transplant). After treatment, all five were maintained on 5 mg P; there have been no second AR episodes. Two additional recipients were started on 5 mg P due to low white blood count (WBC) and low/no MMF. Of the 51 grafts, one has failed (death with function). Average serum Cr level (+/- S.E.) at 3 and 6 months for RDS recipients was 1.7 +/- 0.5 (NS vs. historical controls). CONCLUSION For low-risk LD recipients, a kidney transplant with an RDS protocol does not increase the risk of AR or graft loss. Future studies will need to be done to assess AR rates with an RDS protocol in cadaver transplant recipients and in recipients with delayed graft function.
20

21
Abstract
BACKGROUND For certain uremic diabetic patients, a sequential transplant of a kidney (usually from a living donor) followed by a cadaver pancreas has become an attractive alternative to a simultaneous transplant of both organs. The purpose of this study was to compare outcomes with simultaneous pancreas-kidney (SPK) versus pancreas after kidney (PAK) transplants to determine advantages and disadvantages of the two procedures. METHODS Between January 1, 1994, and June 30, 2000, we performed 398 cadaver pancreas transplants at our center. Of these, 193 were SPK transplants and 205 were PAK transplants. We compared these two groups with regard to several endpoints, including patient and graft survival rates, surgical complications, acute rejection rates, waiting times, length of hospital stay, and quality of life. RESULTS Overall, surgical complications were more common for SPK recipients. The total relaparotomy rate was 25.9% for SPK recipients versus 15.1% for PAK recipients (P = 0.006). Leaks, intraabdominal infections, and wound infections were all significantly more common in SPK recipients (P = 0.009, P = 0.05, and P = 0.01, respectively, versus PAK recipients). Short-term pancreas graft survival rates were similar between the two groups: at 1 year posttransplant, 78.0% for SPK recipients and 77.9% for PAK recipients (P = not significant). By 3 years, however, pancreas graft survival differed between the two groups (74.1% for SPK and 61.7% for PAK recipients), although this did not quite reach statistical significance (P = 0.15). This difference in graft survival seemed to be due to increased immunologic losses for PAK recipients: at 3 years posttransplant, the incidence of immunologic graft loss was 16.2% for PAK versus 5.2% for SPK recipients (P = 0.01). Kidney graft survival rates were, however, better for PAK recipients. At 3 years after their kidney transplant, kidney graft survival rates were 83.6% for SPK and 94.6% for PAK recipients (P = 0.001). The mean waiting time to receive the pancreas transplant was 244 days for SPK and 167 days for PAK recipients (P = 0.001). CONCLUSIONS PAK transplants are a viable option for uremic diabetics. While long-term pancreas graft results are slightly inferior to SPK transplants, the advantages of PAK transplants include the possibility of a preemptive living donor kidney transplant, better long-term kidney graft survival, significantly decreased waiting times, and decreased surgical complication rates. Use of a living donor for the kidney transplant expands the donor pool. Improvements in immunosuppressive regimens will hopefully eliminate some of the difference in long-term pancreas graft survival between SPK and PAK transplants.
|
22
|
Abstract
OBJECTIVE To review a single center's experience and outcome with living donor transplants. SUMMARY BACKGROUND DATA Outcome after living donor transplants is better than after cadaver donor transplants. Since the inception of the authors' program, they have performed 2,540 living donor transplants. For the most recent cohort of recipients, improvements in patient care and immunosuppressive protocols have improved outcome. In this review, the authors analyzed outcome in relation to protocol. METHODS The authors studied patient and graft survival by decade. For those transplanted in the 1990s, the impact of immunosuppressive protocol, donor source, diabetes, and preemptive transplantation was analyzed. The incidence of rejection, posttransplant steroid-related complications, and return to work was determined. Finally, multivariate analysis was used to study risk factors for worse 1-year graft survival and, for those with graft function at 1 year, to study risk factors for worse long-term survival. RESULTS For each decade since 1960, outcome has improved after living donor transplants. Compared with patients transplanted in the 1960s, those transplanted in the 1990s have better 8-year actuarial patient and graft survival rates. Death with function and chronic rejection have continued to be a major cause of graft loss, whereas acute rejection has become a rare cause of graft loss. Cardiovascular deaths have become a more predominant cause of patient death; infection has decreased. Donor source (e.g., ideally HLA-identical sibling) continues to be important. For living donor transplants, rejection and graft survival rates are related to donor source. The authors show that patients who had preemptive transplants or less than 1 year of dialysis have better 5-year graft survival and more frequently return to full-time employment. Readmission and complications remain problems; of patients transplanted in the 1990s, only 36% never required readmission. 
Similarly, steroid-related complications remain common. The authors' multivariate analysis shows that the major risk factor for worse 1-year graft survival was delayed graft function. For recipients with 1-year graft survival, risk factors for worse long-term outcome were pretransplant smoking, pretransplant peripheral vascular disease, pretransplant dialysis for more than 1 year, one or more acute rejection episodes, and donor age older than 55. CONCLUSIONS These data show that the outcome of living donor transplants has continued to improve. However, for living donors, donor source affects outcome. The authors also identify other major risk factors affecting both short- and long-term outcome.
|
23
|
Abstract
Given the constant flux in caseload and the number of personnel available in the OR, waiting for a final XM often prolongs organ preservation time (a room available at the time an XM is started is not available when the XM is completed). Longer preservation is associated with increased DGF and decreased graft survival. We have shown in a retrospective analysis that final XMs on 0% PRA recipients were always negative (Transplantation, 1999). We now describe a policy of: a) not doing a screening XM and b) proceeding to the OR without an XM, in situations where the recipient's PRA has been documented to be 0% and when there have not been any interim transfusions (and the OR is ready before XM completion). The final XM is completed after the transplant. All patients send sera every 6 weeks for PRA (antiglobulin technique). If > or =3 consecutive PRAs are 0%, no donor-specific screening XM is done prior to calling the patient in for tx (UNOS allocation algorithm used). If there have not been any interim transfusions, we have proceeded to tx prior to completion of the final XM. Between 1/1/98-12/31/99, we did 109 CAD kidney (K) and 79 simultaneous kidney-pancreas (SPK) tx; 67 (61%) K and 56 (71%) SPK had 0% PRA. Of the 0% PRA candidates, 25/67 (37%) K and 28/56 (50%) SPK had no pre-tx XM. For K with no XM, cold ischemia was shorter (13.2+/-0.2 vs. 18+/-0.9 hrs, p=0.01) and DGF less frequent (12% vs. 24%, p=0.3); for SPK with no XM, cold ischemia was shorter (15.2+/-2 vs. 18+/-0.9 hrs, p=0.1); no diff in DGF. All post-tx XMs were negative and there were no hyperacute rejections; there was no diff in acute rejection episodes. Actuarial 1-yr graft survival: no XM: K=87.5%, SPK=82%; yes XM: K=88%, SPK=86% (NS). Our data suggest it is safe, in select circumstances, to proceed to the OR without an XM. Elimination of the screening XM for 0% PRA candidates saves money. Proceeding to the OR (if available) without a final XM shortens cold ischemia time.
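The screening policy described in this abstract amounts to a simple eligibility check before calling the patient in. A minimal sketch of that decision rule; the function and argument names are illustrative, not from the article:

```python
def may_skip_pretransplant_xm(pra_history, interim_transfusion):
    """Return True if the candidate may proceed to the OR before the final
    crossmatch (XM): at least three consecutive most-recent PRA values of 0%
    and no transfusion since the last screening sample."""
    if interim_transfusion:
        return False
    if len(pra_history) < 3:
        return False  # not enough consecutive screening results yet
    # The three most recent PRA values (percent) must all be 0%.
    return all(pra == 0 for pra in pra_history[-3:])
```

Anything beyond this check (an interim transfusion, a nonzero PRA) falls back to the standard pretransplant crossmatch; the final XM is still completed after transplant in either case.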
|
24
|
Experimental short-term immunosuppression after bowel transplantation and donor-specific bone marrow infusion. Arch Surg 2001; 136:817-21. [PMID: 11448397 DOI: 10.1001/archsurg.136.7.817]
Abstract
HYPOTHESIS We previously showed in a large animal pig model that unmodified donor-specific bone marrow infusion (DSBMI) did not facilitate total bowel engraftment; in contrast, it increased the risks of rejection, infection, and graft-vs-host disease (GVHD) posttransplant. We hypothesize that continuous immunosuppression, in combination with DSBMI, might contribute to, or even trigger, these unwarranted immune responses by both host and graft; therefore, discontinuing immunosuppression might decrease these risks and prolong survival. METHODS Six groups of outbred, mixed lymphocyte culture-reactive pigs underwent a total (small and large) bowel transplant: group 1, nonimmunosuppressed control pigs (n = 5); group 2, nonimmunosuppressed DSBMI pigs (n = 6); group 3, tacrolimus (indefinite) pigs (n = 7); group 4, tacrolimus (indefinite) plus DSBMI pigs (n = 7); group 5, tacrolimus (10 days only) pigs (n = 5); and group 6, tacrolimus (10 days only) plus DSBMI pigs (n = 6). RESULTS The combination of short-term immunosuppression and DSBMI (group 6) significantly prolonged survival, compared with short-term immunosuppression only (group 5) or DSBMI only (group 2). Short-term immunosuppression and DSBMI (group 6) did not prolong overall survival, compared with indefinite immunosuppression with (group 4) or without (group 3) DSBMI: survival rates at 7, 14, and 28 days posttransplant were 100%, 100%, and 67% in group 6; 100%, 100%, and 71% in group 3; and 100%, 67%, and 47% in group 4 (P =.14). Short-term immunosuppression and DSBMI (group 6) increased the incidence of rejection, infection, and GVHD, compared with indefinite immunosuppression without (but not with) DSBMI. CONCLUSIONS Short-term immunosuppression and DSBMI did not prolong survival and did not reduce the incidence of death from rejection, infection, or GVHD, compared with indefinite immunosuppression without DSBMI. 
But short-term immunosuppression and DSBMI resulted in a lower incidence of death from infection and GVHD, compared with indefinite immunosuppression and DSBMI. When immunosuppression was discontinued 10 days posttransplant, the effect of DSBMI was insufficient to avert death from rejection. CLINICAL RELEVANCE The clinical results of bowel transplantation trail those of other solid organ transplants. Although short-term immunosuppression with DSBMI reduced the rates of infection and GVHD, our study shows that systemically infused donor-specific bone marrow with short-term or indefinite immunosuppression does not improve outcome after bowel transplantation. It seems necessary to modify the timing, dosing, route, and/or composition of donor-specific bone marrow before it can be successfully used in clinical bowel transplantation.
|
25
|
Abstract
Specific immunomodulatory strategies are required to eliminate the need for lifelong dependence on debilitating immunosuppressants. One proposed strategy is to simultaneously transplant the kidney and infuse donor-specific bone marrow cells. We prospectively studied the effect of unmodified donor-specific bone marrow infusion (DSBMI) on rejection, infection, graft-versus-host disease (GvHD), and graft survival. We performed 57 kidney transplants in mixed lymphocyte culture (MLC)-reactive, outbred pigs. The groups of recipient pigs differed according to the use of (1) indefinite versus short-term tacrolimus-based immunosuppression, (2) DSBMI, and (3) recipient preconditioning (RPC: whole body irradiation with 400 rads on day 0 and horse anti-pig thymocyte globulin (ATG) on days -2, -1, and 0). In all, we studied eight groups: group 1, nonimmunosuppressed control pigs (n = 8); group 2, nonimmunosuppressed DSBMI pigs (n = 7); group 3, nonimmunosuppressed RPC + DSBMI pigs (n = 5); group 4, tacrolimus (indefinite) pigs (n = 11); group 5, tacrolimus (10 days only) pigs (n = 5); group 6, DSBMI + tacrolimus (indefinite) pigs (n = 8); group 7, DSBMI + tacrolimus (10 days only) pigs (n = 6); and group 8, RPC + DSBMI + tacrolimus (indefinite) pigs (n = 7). DSBMI alone (group 2) or in combination with RPC (group 3) did not prolong graft survival, as compared with nonimmunosuppressed controls (group 1). In groups 1, 2, and 3, all but one pig died from rejection; in group 3 only, 45% of the pigs died from concurrent infection or GvHD, indicating that RPC in combination with DSBMI aggravated the risk of generalized infection and GvHD. Post-transplant immunosuppression, irrespective of indefinite or short-term administration, was required for prolonged graft survival. 
With indefinite use of immunosuppression, graft survival rates and death rates from rejection were not different for pigs with (group 6) versus without (group 4) DSBMI; however, the death rate from infection was higher in group 6, suggesting that the bone marrow inoculum increased the risk of systemic infection. With short-term use of immunosuppression, graft survival rates were higher and death rates from rejection lower for pigs with (group 7) versus without (group 5) DSBMI. But DSBMI and short-term immunosuppression (group 7) failed to prolong survival beyond that achieved with indefinite immunosuppression (groups 4 and 6). Although the combination of DSBMI and short-term immunosuppression (group 7) reduced the risk of infection, it did not avert severe rejection. The addition of RPC to DSBMI and indefinite immunosuppression (group 8) significantly decreased graft survival, as compared with groups 4, 6, and 7. It also increased the incidence of death from rejection, GvHD, and infection, or a combination thereof. Unmodified DSBMI did not prolong graft survival after kidney transplantation, nor did it decrease the incidence of rejection. But it aggravated the risk of GvHD and infection. Short-term immunosuppression with DSBMI reduced the incidence of death from infection or GvHD, but it resulted in a higher incidence of death from rejection (as compared with indefinite use of immunosuppression). RPC, combined with DSBMI and indefinite immunosuppression, increased the death rate from rejection, GvHD, infection, or a combination thereof. In this large animal study, the effect of unmodified DSBMI has been disappointing. The search continues for the optimal way to successfully perform bone marrow augmentation in solid organ transplants.
|
26
|
Abstract
Pancreas transplantation is the only treatment for type I diabetes mellitus that can induce an insulin-independent normoglycemic state. Because of the need for immunosuppression, it has been most widely applied in uremic diabetic kidney transplant recipients, with a high success rate, particularly when done as a simultaneous (SPK) procedure (insulin independence > 80% at 1 year), with patient and kidney graft survival rates equivalent to or higher than in those who receive a kidney transplant alone. The results of solitary pancreas transplants (PAK in nephropathic diabetic recipients or PTA in nonuremic recipients) have also dramatically improved; 1-year graft survival rates are more than 80% and 70%, respectively, with the new immunosuppressants tacrolimus and mycophenolate mofetil. Multiple factors are important for successful application of pancreas transplantation, as summarized in this review.
|
27
|
Abstract
OBJECTIVE To determine outcome in diabetic pancreas transplant recipients according to risk factors and the surgical techniques and immunosuppressive protocols that evolved during a 33-year period at a single institution. SUMMARY BACKGROUND DATA Insulin-dependent diabetes mellitus is associated with a high incidence of management problems and secondary complications. Clinical pancreas transplantation began at the University of Minnesota in 1966, initially with a high failure rate, but outcome improved in parallel with other organ transplants. The authors retrospectively analyzed the factors associated with the increased success rate of pancreas transplants. METHODS From December 16, 1966, to March 31, 2000, the authors performed 1,194 pancreas transplants (111 from living donors; 191 retransplants): 498 simultaneous pancreas-kidney (SPK) and 1 simultaneous pancreas-liver transplant; 404 pancreas after kidney (PAK) transplants; and 291 pancreas transplants alone (PTA). The analyses were divided into five eras: era 0, 1966 to 1973 (n = 14), historical; era 1, 1978 to 1986 (n = 148), transition to cyclosporine for immunosuppression, multiple duct management techniques, and only solitary (PAK and PTA) transplants; era 2, 1986 to 1994 (n = 461), all categories (SPK, PAK, and PTA), predominantly bladder drainage for graft duct management, and primarily triple therapy (cyclosporine, azathioprine, and prednisone) for maintenance immunosuppression; era 3, 1994 to 1998 (n = 286), tacrolimus and mycophenolate mofetil used; and era 4, 1998 to 2000 (n = 275), use of daclizumab for induction immunosuppression, primarily enteric drainage for SPK transplants, pretransplant immunosuppression in candidates awaiting PTA. 
RESULTS Patient and primary cadaver pancreas graft functional (insulin-independence) survival rates at 1 year by category and era were as follows: SPK, era 2 (n = 214) versus eras 3 and 4 combined (n = 212), 85% and 64% versus 92% and 79%, respectively; PAK, era 1 (n = 36) versus 2 (n = 61) versus 3 (n = 84) versus 4 (n = 92), 86% and 17%, 98% and 59%, 98% and 76%, and 98% and 81%, respectively; in PTA, era 1 (n = 36) versus 2 (n = 72) versus 3 (n = 30) versus 4 (n = 40), 77% and 31%, 99% and 50%, 90% and 67%, and 100% and 88%, respectively. In eras 3 and 4 combined for primary cadaver SPK transplants, pancreas graft survival rates were significantly higher with bladder drainage (n = 136) than enteric drainage (n = 70), 82% versus 74% at 1 year (P =.03). Increasing recipient age had an adverse effect on outcome only in SPK recipients. Vascular disease was common (in eras 3 and 4, 27% of SPK recipients had a pretransplant myocardial infarction and 40% had a coronary artery bypass); those with no vascular disease had significantly higher patient and graft survival rates in the SPK and PAK categories. Living donor segmental pancreas transplants were associated with higher technically successful graft survival rates in each era, predominantly solitary (PAK and PTA) in eras 1 and 2 and SPK in eras 3 and 4. Diabetic secondary complications were ameliorated in some recipients, and quality of life studies showed significant gains after the transplant in all recipient categories. CONCLUSIONS Patient and graft survival rates have significantly improved over time as surgical techniques and immunosuppressive protocols have evolved. Eventually, islet transplants will replace pancreas transplants for suitable candidates, but currently pancreas transplants can be applied and should be an option at all stages of diabetes. Early transplants are preferable for labile diabetes, but even patients with advanced complications can benefit.
|
28
|
|
29
|
A prospective, randomized, open-label study of steroid withdrawal in pancreas transplantation-a preliminary report with 6-month follow-up. Transplant Proc 2001; 33:1663-4. [PMID: 11267460 DOI: 10.1016/s0041-1345(00)02632-4]
|
30
|
|
31
|
|
32
|
|
33
|
Abstract
BACKGROUND For certain uremic, diabetic patients, a sequential transplant of a kidney (usually from a living donor) followed by a cadaver pancreas has become an attractive option. But how long to wait after the kidney transplant before proceeding with a pancreas transplant is unclear. We studied outcomes in recipients of a pancreas at varying times after a kidney to determine the optimal timing for the second transplant. METHODS We compared pancreas after kidney (PAK) transplants performed early (< or =4 months) and late (>4 months) after the kidney transplant to determine any significant differences in surgical complications or outcomes between the two groups. RESULTS Between January 1, 1994, and September 30, 1998, we performed 123 cadaver PAK transplants. Of these, 25 (20%) were early and 98 (80%) were late. Characteristics of the two recipient groups were similar. We found no significant differences in outcome between the two groups. The incidence of surgical complications (bleeding, leaks, thrombosis, infections) and of opportunistic infections (such as cytomegalovirus) did not significantly differ between the two groups. Graft and patient survival rates were also equivalent (P=NS). The incidence of acute rejection by 3 months posttransplant was 20% in both groups. CONCLUSION The timing of the pancreas transplant for PAK recipients does not seem to influence outcome. As long as an acceptable organ is available and the recipient is clinically stable, a PAK transplant can be performed relatively soon after the kidney transplant.
|
34
|
Living related liver transplantation in infants and children: report of anesthetic care and early postoperative morbidity and mortality. J Clin Anesth 2000; 12:454-9. [PMID: 11090731 DOI: 10.1016/s0952-8180(00)00192-6]
Abstract
STUDY OBJECTIVE To determine those infants at high risk for perioperative complications and mortality following living, related liver transplantation. DESIGN Retrospective chart review. SETTING Large metropolitan teaching hospital. MEASUREMENTS AND MAIN RESULTS The charts and anesthetic records of the 12 infants and children who received the left lateral hepatic segment from a living relative over the past 2 years at our institution were reviewed. The records were examined to determine the causes of perioperative morbidity and to identify patients at high risk for serious complications and mortality. All infants and children (mean +/- SD age, 29+/-30 months; weight, 13.6+/-6.8 kg) survived the operation (8.3+/-1.7 hours) without intraoperative complications. The average blood loss, including 500 mL of recipient blood used to flush the liver before reperfusion, was 1483+/-873 mL (119+/-70 mL/kg). Three infants developed portal vein thrombosis, and one of these infants also had hepatic artery thrombosis. The risk of vessel thrombosis was significantly higher (3/3 vs. 0/9; p<0.0045) in infants less than 9 kg body weight, as was the risk of death (2/3 vs. 0/9; p<0.045). Both children who died had vascular thrombosis. Other serious complications were bleeding, 6; infection, 7; acute rejection, 3; and bile leak, 2. CONCLUSIONS Infants and children can successfully undergo living, related liver transplantation. However, the risks of vascular complications and death are greater in infants less than 9 kg body weight.
|
35
|
Abstract
OBJECTIVE To document the decreased incidence of surgical complications after pancreas transplantation in recent times. SUMMARY BACKGROUND DATA Compared with other abdominal transplants, pancreas transplants have historically had the highest incidence of surgical complications. However, over the past few years, the authors have noted a significant decrease in the incidence of surgical complications. METHODS The authors studied the incidence of early (<3 months after transplant) surgical complications (e.g., relaparotomy, thrombosis, infections, leaks) after 580 pancreas transplants performed during a 12-year period. Patients were analyzed and compared in two time groups: era 1 (June 1, 1985, to April 30, 1994, n = 367) and era 2 (May 1, 1994, to June 30, 1997, n = 213). RESULTS Overall, surgical complications were significantly reduced in era 2 compared with era 1. The relaparotomy rate decreased from 32.4% in era 1 to 18.8% in era 2. Significant risk factors for early relaparotomy were donor age older than 40 years and recipient obesity. Recipients with relaparotomy had significantly lower graft survival rates than those without relaparotomy, but patient survival rates were not significantly different. A major factor contributing to the lower relaparotomy rate in era 2 was a significant decrease in the incidence of graft thrombosis; the authors believe this lower incidence is due to the routine use of postoperative low-dose intravenous heparin and acetylsalicylic acid. The incidence of bleeding requiring relaparotomy did not differ between the two eras. Older donor age was the most significant risk factor for graft thrombosis. The incidence of intraabdominal infections significantly decreased between the two eras; this decrease may be due to improved prophylaxis regimens in the first postoperative week. 
CONCLUSIONS Although a retrospective study has its limits, the results of this study, the largest single-center experience to date, show a significant decrease in the surgical risk associated with pancreas transplants. Reasons for this decrease are identification of donor and recipient risk factors, better prophylaxis regimens, refinements in surgical technique, and improved immunosuppressive regimens. These improved results suggest that more widespread application of pancreas transplantation is warranted.
|
36
|
Abstract
Post-transplant hemolytic uremic syndrome (HUS) is an uncommon but well-described complication in solid organ transplant recipients. Believed to be secondary to immunosuppressive therapy, it has been reported after kidney, liver, pancreas, heart, and lung transplants. In all reported cases, the primary organ affected was the kidney (transplant or native). But until now, no cases after small-bowel transplants and no cases in which the kidney was not the primary organ affected have been reported. We report two cases of HUS in small-bowel transplant recipients. In our first case, clinical presentation was with renal failure; biopsy of the native kidney demonstrated the typical histological changes seen with HUS, namely occlusion of the microcirculation by thrombi and platelet aggregation. Immunosuppression was changed from tacrolimus to cyclosporin, but with no improvement in renal function. In our second case, the transplanted bowel was the primary organ affected. This recipient presented with ulcers in the bowel mucosa, which were believed to be ischemic in origin, secondary to occlusive vascular lesions affecting the small vessels in the transplanted bowel. Her tacrolimus dose was decreased with resolution of ulcers and no evidence of rejection. These two cases represent the first reports of HUS after small-bowel transplants; in addition, our second case represents the first report of an extrarenal graft as the primary organ affected. When caring for small-bowel transplant recipients, physicians must be alert to the possibility of HUS and its various presentations.
|
37
|
Abstract
Simultaneous pancreas-kidney transplant from living donors has recently been proposed as an effective therapeutic option in selected uremic patients with type I diabetes. We report the first simultaneous pancreas-kidney transplant performed between identical twins. Posttransplant, the recipient has been maintained on low-dose cyclosporine to avoid recurrent autoimmune insulitis. At the 1-year follow-up, both donor and recipient are well with normal renal function and excellent glucose control. Simultaneous pancreas-kidney transplant between identical twins can be performed successfully using cyclosporine to prevent recurrent autoimmune insulitis.
|
38
|
|
39
|
MRI is superior to angiography for evaluation of living-related simultaneous pancreas and kidney donors. Transplant Proc 1999; 31:604-5. [PMID: 10083255 DOI: 10.1016/s0041-1345(98)01575-9]
|
40
|
Vascular graft thrombosis after pancreas transplantation: comparison of the FK 506 and cyclosporine eras. Transplant Proc 1999; 31:602-3. [PMID: 10083254 DOI: 10.1016/s0041-1345(98)01574-7]
|
41
|
Cytomegalovirus disease recurrence after ganciclovir treatment in kidney and kidney-pancreas transplant recipients. Transplantation 1999; 67:94-7. [PMID: 9921803 DOI: 10.1097/00007890-199901150-00016]
Abstract
BACKGROUND With the introduction of ganciclovir, the clinical pattern of cytomegalovirus (CMV) disease has changed; CMV disease recurrence after successful treatment of the initial episode has emerged as a more common problem. We studied CMV disease recurrence in kidney transplant (KTx) and simultaneous kidney-pancreas transplant (SPK) recipients, and identified risk factors for recurrence. METHODS Between January 1987 and December 1995, of 1272 KTx and 287 SPK recipients, 332 developed CMV disease and were treated with a 14-day course of i.v. ganciclovir, followed by a 10-week course of oral acyclovir. Among these 332 recipients, 103 (31%) developed CMV disease recurrence more than 30 days after treatment for the initial episode; this group was compared with those recipients who did not develop recurrence (n=229). Risk factors examined were age, presence of diabetes, type of transplant (KTx vs. SPK), donor source (cadaver vs. living donor), treatment for acute rejection, pretransplant CMV serologic status, evidence of tissue-invasive CMV, and treatment of the initial episode with human immune globulin in addition to ganciclovir. RESULTS Univariate analysis found that patients with recurrence were more likely to be diabetic (70.9% vs. 53.7%; P=0.04), to have undergone an SPK (39.8% vs. 20.5%; P=0.004), to have received a cadaver organ (78.6% vs. 61.6%; P=0.002), and to have received treatment for acute rejection (78.6% vs. 59.8%; P=0.001). Using multivariate analysis, two statistically significant risk factors were found: receiving a cadaver organ (relative risk [RR]=1.90; P=0.03) and treatment for acute rejection (RR=2.02; P=0.008). Diabetes (RR=1.44; P=0.18) and a cadaver SPK transplant (RR=1.55; P=0.12) tended toward increased risk for recurrence, but the difference did not reach statistical significance. The remaining variables were not significant. Interestingly, CMV recurrence did not significantly diminish 5-year graft survival (52.0% vs. 
54.4%; P not significant) or patient survival (67.0% vs. 68.3%; P not significant) rates. CONCLUSIONS CMV disease recurs in roughly one-third of KTx and SPK recipients after treatment of the initial episode with ganciclovir. A cadaver organ source and treatment for acute rejection were the most significant clinical risk factors for recurrence. Clinical predictors of recurrence such as these may help to identify those recipients who need more intensive therapeutic and prophylactic regimens.
|
42
|
Thrombotic microangiopathy after liver-small bowel transplant. Clin Transplant 1998; 12:600-1. [PMID: 9850460]
Abstract
We herein report the first case of immunosuppression-associated thrombotic microangiopathy (TMA) in which an extrarenal graft was primarily affected by the characteristic microvascular lesions. Although TMA is a well-known complication of cyclosporine (CSA) or tacrolimus therapy in renal and extrarenal (liver, heart, lung) transplant recipients, the kidney (transplanted or native) is typically the site primarily affected. We describe a combined liver-small bowel transplant recipient who developed tacrolimus-associated TMA that affected both her transplanted small bowel and her native kidneys. Involvement of the bowel, with evidence of microvascular occlusion on biopsy, led to the development of ischemic mucosal ulcers and eventual bowel perforation. Involvement of the kidney manifested with a doubling of the recipient's baseline serum creatinine level. Significant lowering of the tacrolimus dose resulted in healing of the small bowel ulcers and return to her baseline level of renal function. Therefore, it is important to note that, in transplant recipients, TMA with microvascular occlusion may affect extrarenal sites. In small bowel transplant recipients, the result might be ischemic ulcers in the graft and eventual bowel perforation.
|
43
|
Conversion from bladder to enteric drainage after pancreaticoduodenal transplantations. Surgery 1998; 124:883-93. [PMID: 9823403]
Abstract
BACKGROUND Bladder drainage is the most common technique for managing the exocrine secretions of pancreaticoduodenal grafts. However, bladder drainage can cause urinary, pancreatic, and metabolic complications that may require conversion to enteric drainage. With enteric drainage, urinary amylase levels cannot be monitored as a marker for rejection. After enteric conversion, rejection is the major cause of graft loss. Timing the conversion to reduce immunologic graft loss would greatly improve patient and graft survival rates. Our study was designed to assess the incidence of, indications for, and complications of converting from bladder to enteric drainage after pancreaticoduodenal transplantations. METHODS We retrospectively reviewed our experience with 80 recipients who underwent enteric conversion. We studied the recipient category, the interval from transplantation to conversion, the interval from the last rejection episode to conversion, the indications for conversion, the type of enteric drainage at conversion (loop versus Roux-en-Y), the results of the conversion, and postconversion complications. RESULTS The major indications for conversion were metabolic acidosis (n = 26, 33%), recurrent urinary tract infections (UTIs) (n = 16, 20%), reflux pancreatitis (n = 15, 19%), and hematuria (n = 12, 15%). For most recipients, their symptoms resolved after conversion (n = 76, 95%). The cumulative probability of undergoing conversion was 13% at 12 months, 21% at 36 months, and 25% at 60 months. Of the recipients with surgical complications after conversion (n = 12, 15%), one lost his graft as a result of pancreatitis. Overall, of the 80 recipients who underwent conversion, 12 (15%) lost their graft, most due to rejection (n = 8, 75%). Immunologic graft loss was highest for recipients of pancreas transplants alone who underwent conversion < or = 6 months after transplantation or < or = 1 year after their last rejection episode. 
CONCLUSIONS Enteric conversion is safe and therapeutic in recipients with complications related to the exocrine secretions of bladder-drained pancreas grafts. After conversion, rejection accounted for 75% of the grafts lost. However, waiting at least 1 year after the last rejection episode significantly reduced immunologic graft loss.
|
44
|
|
45
|
Abstract
BACKGROUND Mycophenolate mofetil (MMF) has been shown to decrease the incidence of acute rejection episodes after kidney transplantation. The use of MMF along with tacrolimus for ≥1 year after pancreas transplantation has not been studied in a large single-center analysis. METHODS Between July 1, 1995 and June 30, 1997, both MMF and tacrolimus were given to 120 pancreas transplant recipients. By category, 61 underwent simultaneous pancreas-kidney transplantation (SPK); 44 underwent pancreas transplantation after previous kidney transplantation (PAK); and 15 underwent pancreas transplantation alone (PTA). By donor source, 86% of the grafts were from a cadaver, and 14% were from a living-related donor. Induction therapy was with MMF, tacrolimus, prednisone, and antithymocyte globulin (n=109) or OKT3 (n=2). Until oral intake was resumed, recipients initially received intravenous azathioprine. Side effects were as follows: gastrointestinal (GI) toxicity in 53% of recipients receiving combined MMF and tacrolimus therapy; bone marrow toxicity in 24% of recipients receiving MMF alone; nephrotoxicity in 18% and neurotoxicity in 11% of recipients receiving tacrolimus alone. We did a matched-pair analysis to compare outcome in MMF versus azathioprine recipients, using the database of the International Pancreas Transplant Registry. Matching criteria included transplantation category, transplantation number, recipient and donor age, duct management, HLA typing, and transplantation year. RESULTS One-year patient survival rates were 98% for SPK, 98% for PAK, and 100% for PTA (P=NS). For SPK recipients, 1-year pancreas graft survival rates were 86% with MMF versus 79% with azathioprine (P=NS); kidney graft survival rates were 96% with MMF versus 86% with azathioprine (P=NS). The incidence of first rejection episodes at 1 year was significantly lower for MMF recipients (15% with MMF versus 43% with azathioprine) (P = 0.0003).
For recipients of solitary pancreas transplants (PTA and PAK), we found no difference in graft survival rates between MMF and azathioprine. The conversion rate from MMF to azathioprine at 1 year was 14% for SPK recipients, 26% for PAK, and 39% for PTA (P < 0.007). The most common reason for conversion was GI toxicity, in particular for nonuremic (PTA) or posturemic (PAK) recipients. The rates of posttransplant infection and lymphoproliferative disease were low for recipients on MMF and tacrolimus. CONCLUSIONS The combination of MMF and tacrolimus after pancreas transplantation is highly effective and safe. For SPK recipients, the incidence of acute reversible rejection episodes was significantly lower with MMF than with azathioprine. The conversion rate from MMF to azathioprine because of GI toxicity was lowest for SPK and highest for PTA recipients.
|
46
|
Abstract
Pancreas transplantation is now performed as a routine treatment for uremic diabetic recipients of kidney transplants, either simultaneously with or after the kidney. Such patients are already committed to immunosuppression and, with a successful pancreas transplant, can achieve insulin independence as well as a dialysis-free state. Pancreas transplants alone are less commonly performed because of the need for immunosuppression, but the trade-off to achieve an insulin-independent state may be worthwhile for individual patients, particularly those with labile diabetes and hypoglycemic unawareness. This option should certainly be part of the treatment armamentarium of the modern diabetologist. A positive effect on secondary complications will certainly occur with an early transplant; even a late transplant can have an impact, as has been shown for neuropathy. Whether the simpler procedure of islet transplantation will replace pancreas transplants remains to be seen. Of more than 200 islet allografts performed in the 1990s, less than 10% of the recipients achieved insulin independence at 1 year. Clinical islet trials are ongoing but limited to patients who accept a low individual probability of success in order to assist in development, or to those in whom the surgical risks of a pancreas transplant are high. Islet transplantation has held promise for over 25 years, but candidates for endocrine replacement therapy must honestly be told the difference in success rates, which are currently much higher with the pancreas.
|
47
|
|
48
|
Donor-specific portal blood transfusion in intestinal transplantation: a prospective, preclinical large animal study. Transplantation 1998; 66:164-9. [PMID: 9701258 DOI: 10.1097/00007890-199807270-00004] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
BACKGROUND Unlike in kidney and heart transplantation, the role of pretransplant donor-specific blood transfusion (DST) has not been studied prospectively in a large animal model of bowel transplantation. We investigated the impact of portal versus systemic DST on overall survival, rejection, graft-versus-host disease (GVHD), and infection after total (small and large) bowel transplantation in pigs. METHODS Mixed lymphocyte culture-reactive, outbred pigs underwent total enterectomy and orthotopic total bowel transplantation with portal vein graft drainage. One unit of donor blood was transfused via the portal or systemic circulation (according to a randomization protocol) before graft implantation was begun. We studied six groups, all of which underwent at least a total bowel transplant: group 1 (n=5) comprised nonimmunosuppressed control pigs with portal DST; group 2 (n=6), nonimmunosuppressed control pigs with systemic DST; group 3 (n=5), cyclosporine (CsA)-treated pigs with portal DST; group 4 (n=5), CsA-treated pigs with systemic DST; group 5 (n=5), tacrolimus-treated pigs with portal DST; and group 6 (n=5), tacrolimus-treated pigs with systemic DST. All immunosuppressed pigs received prednisone (2 mg/kg/day) and either CsA (to maintain levels between 250 and 350 ng/ml) or tacrolimus (to maintain levels between 10 and 30 ng/ml). Stomal biopsies and autopsies were obtained to study the incidence of rejection, GVHD, and infection. RESULTS Portal DST and tacrolimus-based immunosuppression resulted in the highest survival rates. At 7, 14, and 28 days after transplantation, survival rates in group 5 were 100%, 100%, and 80%; in group 6, 100%, 60%, and 40%; and in group 3, 100%, 0%, and 0%, respectively. Only the combination of portal DST and tacrolimus prevented the occurrence of, and death from, rejection. Death from rejection at 7, 14, and 28 days in group 5 was 0%, 0%, and 0%; in group 6, 0%, 33%, and 67%; and in group 3, 0%, 100%, and 100%, respectively. 
Of note, if immunosuppression was used, the groups with portal (versus systemic) DST had a higher risk of death from infection but a lower risk of death from GVHD. Simultaneous immunologic events were noted more frequently in groups with systemic (versus portal) DST. Long-term survival was noted only in groups with tacrolimus-based immunosuppression and was more common for those with portal (versus systemic) DST. CONCLUSIONS Portal DST at the time of total bowel transplantation and posttransplant immunosuppression with tacrolimus prevent rejection and significantly increase graft survival. The combination of portal antigen presentation and tacrolimus needs to be studied in clinical bowel transplantation.
|
49
|
Abstract
BACKGROUND Simultaneous kidney-pancreas transplantation has become a recognized therapy for type I diabetes mellitus patients with diabetic nephropathy, neuropathy, and retinopathy. In the vast majority of these procedures, both grafts are placed intraperitoneally, which reduces posttransplant morbidity. Recently, in some of our recipients, we noted renal dysfunction related to complications of the renal pedicle. Our objectives in this study were to identify the cause of this renal dysfunction and to prevent its occurrence in future recipients. STUDY DESIGN We undertook a retrospective chart review of simultaneous kidney-pancreas recipients who experienced renal dysfunction related to renal pedicle complications. RESULTS We found four recipients with renal dysfunction related to renal pedicle torsion, diagnosed by serial ultrasound scans and kidney graft biopsies. Early diagnosis allowed salvage of three kidney grafts, but one was lost after late diagnosis. CONCLUSIONS A high level of suspicion is needed to diagnose renal pedicle torsion. If simultaneous kidney-pancreas recipients have recurrent renal dysfunction, and rejection has been excluded, serial ultrasound scans with color flow Doppler examinations are needed. Once the diagnosis is made, a nephropexy to the anterior abdominal wall is indicated to prevent further torsion and save the kidney graft. We recommend prophylactic nephropexy of left renal grafts if the renal pedicle is ≥5 cm long and if there is a discrepancy of 2 cm or more between the length of the artery and the vein.
|
50
|
|