76. Dossett LA, Swenson BR, Evans HL, Bonatti H, Sawyer RG, May AK. Serum estradiol concentration as a predictor of death in critically ill and injured adults. Surg Infect (Larchmt) 2008; 9:41-8. [PMID: 18363467] [DOI: 10.1089/sur.2007.037]
Abstract
BACKGROUND Whereas animal models of sepsis demonstrate survival benefits for the pro-estrus state, human observational studies have failed to demonstrate a consistent survival advantage among female patients. Estrogen biosynthesis differs substantially between primate and non-primate animals, and estrogens have diverse immunologic actions. Estrogen concentrations are elevated in response to critical illness and injury (regardless of sex), and elevated concentrations of serum estradiol are associated with a higher mortality rate. Our objective was to determine the predictive ability and test characteristics of the serum estradiol concentration at 48 h in critically ill patients. METHODS A prospective cohort study of surgical and trauma adult intensive care unit patients at two academic tertiary-care centers. Sex hormones (estradiol, progesterone, testosterone, prolactin, and dehydroepiandrosterone) and cytokines were assayed at 48 h, and the 28-day all-cause mortality rate was assessed. RESULTS There was no difference in mortality rates between the sexes (75.2% of survivors were male vs. 76.0% of non-survivors; p = 0.43). The serum estradiol concentration was significantly elevated in non-survivors regardless of sex (median 18.7 pg/mL [interquartile range (IQR) 9.99-43.6] in survivors vs. 40.7 pg/mL [IQR 9.99-94.8] in non-survivors; p < 0.001). The area under the receiver-operating characteristic (ROC) curve for serum estradiol was 0.64 (95% confidence interval [CI] 0.55, 0.72). The parameter with the largest area under the ROC curve was the Acute Physiology and Chronic Health Evaluation (APACHE) II score (0.75; 95% CI 0.68, 0.82). A serum estradiol cut-point of 50 pg/mL was 48% sensitive and 80% specific in predicting death and classified the outcome of 76% of patients correctly. CONCLUSIONS Serum estradiol concentration is a valuable prognostic tool and a potential contributor to adverse outcomes in critically ill or injured surgical patients.
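The cut-point characteristics quoted above (48% sensitivity, 80% specificity, 76% correctly classified at 50 pg/mL) come from a standard confusion-matrix calculation. A minimal sketch of that arithmetic, assuming a simple threshold classifier and using invented estradiol values rather than the study's patient-level data:

```python
def cutpoint_stats(values, died, cutoff):
    """Sensitivity, specificity, and accuracy of predicting death
    when a marker value at or above `cutoff` is called 'positive'."""
    tp = sum(1 for v, d in zip(values, died) if v >= cutoff and d)
    fn = sum(1 for v, d in zip(values, died) if v < cutoff and d)
    tn = sum(1 for v, d in zip(values, died) if v < cutoff and not d)
    fp = sum(1 for v, d in zip(values, died) if v >= cutoff and not d)
    sens = tp / (tp + fn)          # true positives among actual deaths
    spec = tn / (tn + fp)          # true negatives among survivors
    acc = (tp + tn) / len(values)  # fraction classified correctly
    return sens, spec, acc

# Illustrative (fabricated) estradiol values in pg/mL and 28-day outcomes
estradiol = [18.7, 9.9, 43.6, 40.7, 94.8, 55.0, 12.0, 60.0]
died = [False, False, False, True, True, True, False, False]
print(cutpoint_stats(estradiol, died, cutoff=50.0))
```

Sweeping `cutoff` over all observed values and plotting sensitivity against 1 - specificity traces the ROC curve whose area the abstract reports as 0.64.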
77. Hedrick TL, Schulman AS, McElearney ST, Smith RL, Swenson BR, Evans HL, Truwit JD, Scheld WM, Sawyer RG. Outbreak of resistant Pseudomonas aeruginosa infections during a quarterly cycling antibiotic regimen. Surg Infect (Larchmt) 2008; 9:139-52. [DOI: 10.1089/sur.2006.102]
78. Smith RL, Chong TW, Hedrick TL, Hughes MG, Evans HL, McElearney ST, Pruett TL, Sawyer RG. Does body mass index affect infection-related outcomes in the intensive care unit? Surg Infect (Larchmt) 2008; 8:581-8. [PMID: 18171117] [DOI: 10.1089/sur.2006.079]
Abstract
BACKGROUND Obesity is a worldwide healthcare concern, but its impact on critical care (intensive care unit; ICU) outcomes is not well understood. The general hypothesis is that obesity worsens ICU outcomes, but published reports fail to demonstrate this effect consistently. We hypothesized that increasing body mass index (BMI) would be an independent predictor of a higher mortality rate in the surgical/trauma ICU. METHODS Data on patients with infections, defined by U.S. Centers for Disease Control and Prevention criteria, were collected prospectively from a single university surgical/trauma ICU. From 1996 to 2003, 807 such patients had measurable BMIs on admission to the ICU and were divided into underweight (<18.5 kg/m²), normal weight (18.5-24.9 kg/m²), overweight (25.0-29.9 kg/m²), obese (30.0-39.9 kg/m²), and morbidly obese (≥40.0 kg/m²) groups. The primary outcome was in-hospital death. Bivariate and multivariate analyses were performed. RESULTS In-hospital death was associated with increasing age, increasing average Acute Physiology and Chronic Health Evaluation (APACHE) II score, history of diabetes (p = 0.001), cardiac disease (p = 0.001), hypertension (p = 0.044), history of cerebrovascular disease (p = 0.021), renal insufficiency (p = 0.007), need for hemodialysis (p < 0.001), history of pulmonary disease (p = 0.012), requirement for mechanical ventilation while in the ICU (p = 0.107), history of malignant disease (p < 0.001), and history of liver disease (p < 0.001). The multivariate analysis selected age (odds ratio [OR] 1.03 per integer; confidence interval [CI] 1.0, 1.05), APACHE II score (OR 1.17 per integer; CI 1.12, 1.74), diabetes (OR 2.20; CI 1.32, 3.65), mechanical ventilation (OR 1.88; CI 1.21, 2.94), malignancy (OR 2.54; CI 1.43, 4.47), and liver disease (OR 5.01; CI 2.69, 9.32) as significant risk factors. When controlling for these variables, none of the BMI groups had an independent association with death compared with the normal-weight group. CONCLUSION Contrary to the hypothesis, the data suggest no discernible independent association of increasing BMI with a heightened mortality rate in surgical/trauma ICU patients with infection.
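The per-unit odds ratios in the multivariate model above compound multiplicatively over larger differences, since a logistic model gives the OR for a delta-unit change as OR^delta. A small illustrative sketch of that arithmetic (the 1.17-per-point OR is from the abstract; the 10-point comparison is an invented example):

```python
def odds_ratio_over_delta(or_per_unit: float, delta: float) -> float:
    """Adjusted odds ratio for a `delta`-unit increase in a predictor,
    given the per-unit OR from a logistic regression: exp(beta * delta)
    equals (per-unit OR) ** delta."""
    return or_per_unit ** delta

# OR 1.17 per APACHE II point: a patient scoring 10 points higher
# has roughly 4.8-fold higher adjusted odds of in-hospital death.
print(round(odds_ratio_over_delta(1.17, 10), 2))
```

The same rule reads the age OR: 1.03 per year compounds to about a 1.34-fold odds increase over a decade.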
79. Hedrick TL, McElearney ST, Smith RL, Evans HL, Pruett TL, Sawyer RG. Duration of antibiotic therapy for ventilator-associated pneumonia caused by non-fermentative gram-negative bacilli. Surg Infect (Larchmt) 2008; 8:589-97. [PMID: 18171118] [DOI: 10.1089/sur.2006.021]
Abstract
BACKGROUND AND PURPOSE Chastre et al. compared 8 and 15 days of antibiotic therapy for ventilator-associated pneumonia (VAP), finding no difference in outcome with the exception of VAP caused by non-fermentative gram-negative bacilli (NFGNB), for which a higher recurrence rate was seen in the shorter-duration group (JAMA 2003;290:2588-2598). We recently examined our institutional experience with VAP caused by NFGNB to determine whether shorter courses of antibiotic therapy were associated with higher rates of recurrence. METHODS Data collected on all patients completing treatment for VAP in a surgical/trauma intensive care unit from December 1996 to October 2004 were analyzed retrospectively for the relations between the duration of antibiotic therapy and recurrence and in-hospital mortality rates. RESULTS Of the 452 episodes of VAP, 154 were associated with NFGNB. Twenty-seven patients were treated with 3-8 days (mean 6.4 ± 0.3 days) of antibiotics, whereas 127 received nine or more days (mean 17.1 ± 0.7 days) of therapy. The recurrence rate for infections treated with the shorter course was 22% vs. 34% for patients receiving nine or more days of antibiotics (p = 0.27). The mortality rates were 22% and 14%, respectively (p = 0.38). Similar trends were demonstrated for infections caused by other organisms. CONCLUSIONS We did not find a higher recurrence rate in patients with VAP caused by NFGNB who received shorter courses of antibiotic therapy. On the contrary, patients receiving shorter courses trended toward lower rates of recurrence. Pending further prospective trials addressing the duration of antibiotic treatment for patients with VAP caused by NFGNB, shorter courses of treatment, perhaps based on improvement in clinical parameters, may be warranted.
80. Hedrick TL, Turrentine FE, Smith RL, McElearney ST, Evans HL, Pruett TL, Sawyer RG. Single-institutional experience with the surgical infection prevention project in intra-abdominal surgery. Surg Infect (Larchmt) 2007; 8:425-35. [PMID: 17883359] [DOI: 10.1089/sur.2006.043]
Abstract
BACKGROUND The incidence of surgical site infection (SSI) is becoming a key component of standard measures of quality of performance. We hypothesized that institutional implementation of a protocol targeting known risk factors would reduce the incidence of SSI associated with intra-abdominal surgery. METHODS Beginning in June 2004, a quality control initiative was implemented to prevent SSI in patients undergoing intra-abdominal surgical procedures at an academic medical center. This protocol included administration of the proper prophylactic antibiotic 0-60 minutes before incision, continued antibiotic administration for ≤24 hours, and maintenance of intraoperative normothermia (≥36°C), along with good glycemic control (goal <200 mg/dL 48 h postoperatively) in diabetic patients. Baseline data collected during the initial four months of protocol development (379 patients) were compared with data collected during the last four months of the 11-month study period (390 patients). RESULTS Compliance with antibiotic selection increased from 89% to 97% (p ≤ 0.05). Compliance with timeliness of administration improved from 89% to 97% (p ≤ 0.05), whereas cessation of perioperative antibiotics within 24 hours remained constant at 93% and 92%, respectively. The incidence of hypothermia fell from 15% to 10% (p = 0.27). The 30-day incidence of SSI improved from 9.2% to 5.6% (p = 0.07). CONCLUSION The implementation of a prevention protocol resulted in a substantial trend toward a reduction in the incidence of SSI. These data support protocol implementation as a cost-effective method of reducing perioperative infectious morbidity associated with intra-abdominal surgery.
81. Evans HL, Lefrak SN, Lyman J, Smith RL, Chong TW, McElearney ST, Schulman AR, Hughes MG, Raymond DP, Pruett TL, Sawyer RG. Cost of Gram-negative resistance. Crit Care Med 2007; 35:89-95. [PMID: 17110877] [DOI: 10.1097/01.ccm.0000251496.61520.75]
Abstract
OBJECTIVE It is unclear whether infections with Gram-negative rods resistant to at least one major class of antibiotics (rGNR) have a greater effect on patient morbidity than infections caused by sensitive strains (sGNR). We wished to test the hypothesis that rGNR infections are associated with higher resource utilization. DESIGN Retrospective observational cohort study of prospectively collected data. SETTING University hospital surgical intensive care unit and ward. PATIENTS Surgical patients with at least one GNR infection. MEASUREMENTS We compared admissions treated for rGNR infection with those with sGNR infections. Primary outcomes were total hospital costs and hospital length of stay. Other outcomes included antibiotic treatment cost, in-hospital death, and intensive care unit length of stay. After univariate analysis comparing outcomes after rGNR infection with those after sGNR infection, multivariate linear regression models for hospital cost and length of stay were created to account for potential confounders. MAIN RESULTS Cost data were available for 604 surgical admissions treated for at least one GNR infection (Centers for Disease Control and Prevention criteria), 137 (23%) of which were rGNR infections. Admissions with rGNR infections were associated with a higher severity of illness at the time of infection (Acute Physiology and Chronic Health Evaluation II score, 17.6 ± 0.6 vs. 13.9 ± 0.3), higher median hospital costs ($80,500 vs. $29,604, p < .0001) and median antibiotic costs ($2,607 vs. $758, p < .0001), and longer median hospital length of stay (29 vs. 13 days, p < .0001) and median intensive care unit length of stay (13 days vs. 1 day, p < .0001). Infection with rGNR within the first 7 days of admission was independently predictive of increased hospital cost (incremental increase in median hospital cost estimated at $11,075; 95% confidence interval, $3,282-$20,099). CONCLUSIONS Early infection with rGNR is associated with a high economic burden, which is in part related to increased antibiotic utilization compared with infection with sensitive organisms. Efforts to control the overuse of antibiotics should be pursued.
82. Hedrick TL, Evans HL, Smith RL, McElearney ST, Schulman AS, Chong TW, Pruett TL, Sawyer RG. Can we define the ideal duration of antibiotic therapy? Surg Infect (Larchmt) 2006; 7:419-32. [PMID: 17083308] [DOI: 10.1089/sur.2006.7.419]
Abstract
BACKGROUND Because of the increasing development of antimicrobial resistance, there is a greater responsibility within the medical community to limit the exposure of patients to antibiotics. We tested the hypothesis that shorter courses of antibiotics are associated with similar or better results than longer durations. We also sought to investigate the difference between a fixed duration of therapy and one based on physiologic measures such as fever and leukocytosis. METHODS All infectious episodes on the general surgery units of the University of Virginia Health System from December 15, 1996, to July 31, 2003, were analyzed retrospectively for the relation between the duration of antibiotic therapy and infectious complications (recurrent infection with the same organism or at the same site). All infections associated with either fever or leukocytosis were categorized into quartiles on the basis of the absolute length of antibiotic administration or the duration of treatment following resolution of fever or leukocytosis. Multivariate logistic regression models were developed to estimate the independent risk of recurrence associated with a longer duration of antibiotic use. RESULTS Of the 5,561 treated infections, 4,470 were associated with fever (temperature ≥38°C) or leukocytosis (white blood cell count ≥11,000/mm³). For all infections, whether analyzed by absolute duration or time from resolution of leukocytosis or fever, the first or second quartiles (0-12 days, 0-9 days, 0-9 days, respectively) were associated with the lowest recurrence rates (14-18%, 17-23%, 18-19%, respectively). Individual analysis of intra-abdominal infections and pneumonia yielded similar results. The fixed-duration groups received fewer days of antibiotics on average, with outcomes similar to those in the physiologic-parameters groups. CONCLUSIONS Shorter courses of antibiotics were associated with similar or fewer complications than prolonged therapy. In general, adopting a strategy of a fixed duration of therapy, rather than basing duration on resolution of fever or leukocytosis, appeared to yield similar outcomes with less antibiotic use.
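The quartile construction described above, ranking treated infections by antibiotic duration and splitting at the 25th/50th/75th percentiles, can be sketched with the standard library (the durations below are invented, not the study's data):

```python
from statistics import quantiles

def assign_quartiles(durations):
    """Assign each antibiotic duration (days) to quartile 1-4 using
    the 25th/50th/75th percentile cut points of the sample."""
    q1, q2, q3 = quantiles(durations, n=4)  # three interior cut points
    def quartile(d):
        if d <= q1:
            return 1
        if d <= q2:
            return 2
        if d <= q3:
            return 3
        return 4
    return [quartile(d) for d in durations]

# Illustrative durations of therapy, in days
durations = [3, 5, 7, 9, 10, 12, 14, 21]
print(assign_quartiles(durations))
```

Recurrence rates are then compared across the four groups, with logistic regression estimating the independent risk attached to the longer-duration quartiles.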
83. Schulman AS, Willcutts KF, Claridge JA, O'Donnell KB, Radigan AE, Evans HL, McElearney ST, Hedrick TL, Lowson SM, Schirmer BD, Young JS, Sawyer RG. Does enteral glutamine supplementation decrease infectious morbidity? Surg Infect (Larchmt) 2006; 7:29-35. [PMID: 16509783] [DOI: 10.1089/sur.2006.7.29]
Abstract
BACKGROUND Although some studies have demonstrated lower infectious morbidity in patients receiving supplemental glutamine, there remains no consensus on the utility of such treatment. This study was designed to investigate the effects of supplemental enteral glutamine on the rate and outcomes of infection in critically ill surgical patients. METHODS All 185 surgical and trauma patients admitted to a single university surgical trauma intensive care unit (STICU) over an approximately three-year period who were to receive enteral nutrition support were assigned sequentially to one of three diets: standard 1-kcal/mL feedings with added protein (Group 1), standard feedings with glutamine 0.6 g/kg per day (Group 2), or immune-modulated feedings with a similar amount of glutamine (Group 3). Group compositions and patient characteristics were similar at baseline. Data were collected prospectively on infections acquired during hospitalization. RESULTS A total of 119 patients had at least one infection: 59% of the patients in Group 1, 64% of Group 2, and 69% of Group 3 (p = NS). There were no differences among the groups in the mean number of infections. The most common sites in all groups were the lungs, blood, and urine, and the frequencies of these infections did not differ between groups. Minor differences were found between groups in the organisms isolated. Antibiotic usage did not differ. CONCLUSION Supplemental enteral glutamine in the dose studied does not appear to influence the acquisition or characteristics of infection in patients admitted to a mixed STICU.
84. Schulman AS, Willcutts KF, Claridge JA, Evans HL, Radigan AE, O'Donnell KB, Camden JR, Chong TW, McElearney ST, Smith RL, Gazoni LM, Farinholt HMA, Heuser CC, Lowson SM, Schirmer BD, Young JS, Sawyer RG. Does the addition of glutamine to enteral feeds affect patient mortality? Crit Care Med 2005; 33:2501-6. [PMID: 16276173] [DOI: 10.1097/01.ccm.0000185643.02676.d3]
Abstract
OBJECTIVE Studies have failed to consistently demonstrate improved survival in intensive care unit (ICU) patients receiving immune-modulating nutrient-enhanced enteral feeds when compared with standard enteral feeds. The objective was to study in a prospective fashion the effects of adding glutamine to standard or immune-modulated (supplemented with omega-3 fatty acids, beta-carotene, and amino acids such as glutamine and arginine) tube feeds. DESIGN Prospective, unblinded study using sequential allocation. SETTING A university surgical trauma ICU. PATIENTS All surgical and trauma patients admitted to the surgical trauma ICU at a university hospital over a 3-yr period who were to receive enteral feeds (n = 185). INTERVENTIONS Sequential assignment to three isocaloric, isonitrogenous diets was performed as follows: standard 1-kcal/mL feeds with added protein (group 1), standard feeds with the addition of 20-40 g/day (0.6 g/kg/day) glutamine (group 2), or an immune-modulated formula with similar addition of glutamine (group 3). The goal for all patients was 25-30 kcal/kg/day and 2 g/kg/day protein. MEASUREMENTS AND MAIN RESULTS Patients were followed until discharge from the hospital. The primary end point was in-hospital mortality, and multiple secondary end points were recorded. In-hospital mortality for group 1 was 6.3% (four of 64) vs. 16.9% (ten of 59, p = .09) for group 2 and 16.1% (ten of 62, p = .09) for group 3. After controlling for age and severity of illness, the difference in mortality between patients receiving standard tube feeds and all patients receiving glutamine was not significant (p ≤ .11). There were no statistically significant differences between the groups for secondary end points. CONCLUSIONS The addition of glutamine to standard enteral feeds or to an immunomodulatory formula did not improve outcomes. These findings suggest that enteral glutamine should not be routinely administered to patients with surgical critical illness.
85. Evans HL, Milburn ML, Hughes MG, Smith RL, Chong TW, Raymond DP, Pelletier SJ, Pruett TL, Sawyer RG. Nature of gram-negative rod antibiotic resistance during antibiotic rotation. Surg Infect (Larchmt) 2005; 6:223-31. [PMID: 16128629] [DOI: 10.1089/sur.2005.6.223]
Abstract
PURPOSE The aim of this study was to characterize the evolution of gram-negative antibiotic resistance during a study of empiric antibiotic rotation. METHODS We showed previously that quarterly rotation of a single antibiotic class is inferior to cycling two antibiotics per quarter for empiric treatment of gram-negative rod (GNR) infections, as evidenced by an increased incidence of antibiotic-resistant GNR (rGNR) infections. Resistance patterns were examined by quantifying GNRs resistant to one or more of the following drug classes: aminoglycosides, cephalosporins, carbapenems, fluoroquinolones, or piperacillin-tazobactam. For all rGNR isolates, the mean number of antibiotic classes to which an organism was resistant was calculated per quarter, as was the number of rGNR species. RESULTS Single-antibiotic rotation (SAR) was associated with significant increases in the incidence of piperacillin-tazobactam (p < 0.0005) and cephalosporin (p = 0.003) resistance, reaching nearly 25% and 30% of rGNR isolates, respectively, most notably during the quarter of designated cephalosporin use (VI). Multi-drug resistance emerged over time; the mean number of resistant classes per resistant GNR isolate ranged from 1.2 in the dual-antibiotic rotation (DAR) period to 1.9 in the SAR period (p = 0.02). Resistance was evident in an increasing number of unique GNR species: on average, 1.3 species were isolated per month in the DAR period and 3.0 per month in the SAR period (p = 0.004), but proportionally, no single GNR species became significantly more resistant across time. Noncompliance was 29% in the SAR period, compared with only 5.8% in the DAR period, with a six-fold increase in the use of nonscheduled empiric antibiotics because of the presence of an organism resistant to the scheduled rotation drug. CONCLUSIONS A single-antibiotic rotation is associated with increased incidence and heterogeneity of resistant GNR isolates, as well as increased multiple-drug-class resistance. The attenuation of resistance observed in the single-antibiotic rotation may reflect unintended antibiotic heterogeneity driven by increasing resistance to the antibiotic class recommended for use each quarter. This suggests that reliance on a single antibiotic class for empiric treatment of GNR infection exerts sufficient pressure within the environment to encourage the development of diversified resistance, as well as cross-resistance across antibiotic classes, thus narrowing the availability of effective antibiotic treatment.
86. Hughes MG, Chong TW, Smith RL, Evans HL, Iezzoni JC, Sawyer RG, Rudy CK, Pruett TL. HCV infection of the transplanted liver: changing CD81 and HVR1 variants immediately after liver transplantation. Am J Transplant 2005; 5:2504-13. [PMID: 16162201] [DOI: 10.1111/j.1600-6143.2005.01060.x]
Abstract
Hypervariable region 1 (HVR1) of the second envelope protein has been implicated in hepatitis C virus (HCV)-host cell interactions, and CD81, a multifunctional protein, has been demonstrated to act as a cell surface receptor for HCV and may interact directly with HVR1. The purpose of the current study was to determine whether certain HVR1 quasispecies variants associate with and infect allografts after liver transplantation more effectively than other HVR1 variants, and whether CD81 receptor expression changes after transplantation. Blood and allograft samples were obtained during the peritransplant period in seven patients. Clones of RT-PCR product were directly sequenced to identify HVR1 quasispecies variants. Explanted livers and serial allograft biopsies in recipients with HCV were examined by immunohistochemistry (IHC) for CD81 expression. Examination of HVR1 sequences demonstrated that only a fraction of the quasispecies variants recovered from each patient's blood sampled immediately prior to transplantation associated with and infected the allografts. Genetic diversity at HVR1 decreased with reperfusion but did not significantly decrease with infection. Expression of CD81 varied during the immediate post-transplant period. In conclusion, HVR1 quasispecies variants differentially associate with and infect allografts after liver transplantation. Additionally, allografts express variable amounts of CD81 after transplantation.
87. Chong TW, Smith RL, Hughes MG, Camden J, Rudy CK, Evans HL, Sawyer RG, Pruett TL. Primary human hepatocytes in spheroid formation to study hepatitis C infection. J Surg Res 2005; 130:52-7. [PMID: 16154152] [DOI: 10.1016/j.jss.2005.04.043]
Abstract
BACKGROUND Hepatitis C virus (HCV) infection is a worldwide health problem, affecting nearly 170 million people. Current models for studying hepatitis C have focused primarily on the use of poorly permissive cell lines and viral constructs because of the lack of a suitable animal model or in vitro system for studying functional infection. As hepatocytes are the primary reservoir for the virus in vivo, we report on a model using primary human hepatocytes cultured in spheroid formation. MATERIALS AND METHODS The hepatocytes were harvested from uninfected liver resections and cultured as spheroids (a format that promotes a differentiated phenotype) or monolayers. Spheroids expressed the putative receptors CD81 and human scavenger receptor B1 in a variable pattern throughout the culture period. Samples were inoculated with infectious HCV serum, and HCV RNA was detected using RT-PCR. RNA was detected in the cells and culture medium by 3 days and 5 days after inoculation, respectively. Selection of HVR1 variants occurred in a differential pattern based on culture technique, suggesting that viral selection was dependent on host phenotype. Detection of NS5A by Western blot analysis of infected samples and immunofluorescence for HCV core protein were seen only in infected spheroids. CONCLUSION The use of spheroid formation to study hepatitis C is associated with the establishment of HVR1 selection and functional infection. This represents a promising alternative model for studying hepatitis C.
88. Hughes MG, Chong TW, Smith RL, Evans HL, Pruett TL, Sawyer RG. Comparison of fungal and nonfungal infections in a broad-based surgical patient population. Surg Infect (Larchmt) 2005; 6:55-64. [PMID: 15865551] [DOI: 10.1089/sur.2005.6.55]
Abstract
BACKGROUND Our aim was to compare fungal and nonfungal infections among a diverse surgical patient population. METHODS Data on all hospital-acquired infectious episodes among surgical intensive care unit and ward patients were collected prospectively over six years at a single university hospital. The relationships between fungal and nonfungal infection and over 100 variables were examined using univariate and multiple logistic regression analysis. RESULTS During the study period, 3,980 infectious episodes were identified; 554 were associated with fungal infection. Multiple logistic regression analysis demonstrated that markers of severity of acute illness (higher APACHE II scores and white blood cell counts, greater transfusion of cellular blood products, mechanical ventilator dependency, and prior infection) predicted fungal infection, whereas markers of chronic illness (comorbidities) did not independently predict either fungal or nonfungal infection. Patients with fungal infection were treated with more antibiotics for longer periods, had prolonged lengths of stay, and died more often than patients with nonfungal infection. A separate multiple logistic regression analysis demonstrated that both fungal infection and the number of fungal infection sites independently predicted mortality. Of all fungal isolates, only Candida albicans and Aspergillus spp. independently predicted mortality. CONCLUSIONS Fungal infections differ significantly in character and outcomes from nonfungal infections among surgical patients.
89. Saalwachter AR, Evans HL, Willcutts KF, O'Donnell KB, Radigan AE, McElearney ST, Smith RL, Chong TW, Schirmer BD, Pruett TL, Sawyer RG. A nutrition support team led by general surgeons decreases inappropriate use of total parenteral nutrition on a surgical service. Am Surg 2004; 70:1107-11. [PMID: 15663055]
Abstract
The purpose of this study was to decrease the number of inappropriate orders for total parenteral nutrition (TPN) in surgical patients. From February 1999 through November 2000 and between July 2001 and June 2002, the surgeon-guided adult nutrition support team (NST) at a university hospital monitored new TPN orders for appropriateness and specific indication. In April 1999, the NST was given authority to discontinue inappropriate TPN orders. Indications, based on the American Society for Parenteral and Enteral Nutrition (ASPEN) standards, included short gut, severe pancreatitis, severe malnutrition/catabolism with inability to feed enterally for ≥5 days, inability to meet >50% of nutritional needs enterally for ≥9 days, enterocutaneous fistula, intra-abdominal leak, bowel obstruction, chylothorax, ischemic bowel, hemodynamic instability, massive gastrointestinal bleed, and lack of abdominal wall integrity. The number of inappropriate TPN orders declined from 62/194 (32.0%) in the first 11 months of the study to 22/168 (13.1%) in the second 11 months (P < 0.0001). This number further declined to 17/215 (7.9%) in the final 12 months of data collection, but compared with the second 11 months, this decrease was not statistically significant (P = 0.1347). The involvement of a surgical NST was associated with a reduction in inappropriate TPN orders without a change in overall use.
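The stepwise decline in inappropriate orders reported above (62/194 to 22/168 to 17/215) can be checked with a two-proportion z-test. A rough stdlib-only sketch, without continuity correction, so it approximates rather than reproduces the paper's exact p-values:

```python
from math import erf, sqrt

def two_proportion_p(x1, n1, x2, n2):
    """Two-sided z-test p-value for a difference between two proportions
    x1/n1 and x2/n2, using the pooled-variance normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    # Two-sided tail probability via the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

# Inappropriate-TPN rates from the abstract
print(two_proportion_p(62, 194, 22, 168))  # first drop: clearly significant
print(two_proportion_p(22, 168, 17, 215))  # second drop: not significant
```

The approximation agrees qualitatively with the abstract: the first decline is highly significant, the second is not at the 0.05 level.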
90. Askari R, Evans HL, Sawyer RG, May AK. Sex hormones, gender differences and mortality in the critically ill patient. Crit Care Med 2004. [DOI: 10.1097/00003246-200412001-00556]
91. Smith RL, Chong TW, Hughes MG, Hedrick TL, Evans HL, McElearney ST, Saalwachter AR, Raymond DP, Du K, Rudy CK, Pruett TL, Sawyer RG. Impact of immunomodulatory oligodeoxynucleotides on cytokine production in the lipopolysaccharide-stimulated human whole blood model. Surgery 2004; 136:464-72. [PMID: 15300216] [DOI: 10.1016/j.surg.2004.05.026]
Abstract
BACKGROUND In studying potential immunotherapeutics for sepsis, we used a lipopolysaccharide (LPS)-stimulated whole blood model to test the immunomodulating capacity of cytosine-phospho-guanine oligodeoxynucleotides (CpG ODNs). We hypothesized that CpG ODNs would have considerable counterinflammatory effects on LPS-induced cytokine production. METHODS We administered 4 micromol/L of CpG ODNs (2216, D19, 2006, K3, or 1668) or guanine-phospho-cytosine (GpC) ODNs (control D or control K) immediately after LPS (10 ng/mL) stimulation of heparinized human whole blood. Samples were incubated for 1, 3, 6, 12, and 24 hours. Media and LPS were used as negative and positive controls. Cell-free supernatants were obtained and evaluated for interferon gamma (IFN-gamma), interleukin (IL)-12(p40), tumor necrosis factor alpha, IL-6, IL-10, IFN-alpha, IL-8, and IL-1beta by ELISA. RESULTS Compared to LPS alone, significantly reduced levels of IFN-gamma, IL-12(p40), tumor necrosis factor alpha, and IL-6 were associated with administration of both CpG and GpC ODNs to LPS-stimulated whole blood. IL-10 levels were concomitantly increased. However, IFN-alpha generation was CpG-specific, as were increased IL-8 levels. Lastly, only 2216 was associated with decreased IL-1beta levels. CONCLUSIONS CpG ODNs and GpC ODNs in the LPS-stimulated whole blood model demonstrate differential counterinflammatory effects, but only CpG ODNs were associated with proinflammatory cytokine production. With further examination, we may find that these observed immunomodulatory differences could potentially be exploited for therapeutic benefit.
92
Hughes MG, Evans HL, Lightfoot L, Chong TW, Smith RL, Raymond DP, Pelletier SJ, Claridge JA, Pruett TL, Sawyer RG. Does prior transfusion worsen outcomes from infection in surgical patients? Surg Infect (Larchmt) 2004; 4:335-43. [PMID: 15012860 DOI: 10.1089/109629603322761391]
Abstract
BACKGROUND Controversy continues to exist regarding the immunomodulatory effects of cellular blood transfusions in the fields of oncology, transplantation, and infectious diseases. Numerous studies have correlated transfusion with hospital-acquired infection, but the impact of transfusion on infection-related mortality has not been addressed. The objective of this study was to determine the effect of transfusion on outcomes among infected surgical patients. METHODS Data on all hospital-acquired infectious episodes among surgical intensive care unit and ward patients were collected prospectively over 39 months at a single university hospital. The relationships between prior transfusion (defined as the receipt of allogeneic red blood cells or platelets during the index hospitalization but prior to the development of infection) and over 100 variables were examined using univariate and multiple logistic regression analysis, with inclusion of a propensity score for prior transfusion to attempt to account for treatment selection bias. RESULTS During the study period, 1,228 infectious episodes occurred; 641 were associated with the transfusion of packed red blood cells or platelets. Univariate analysis revealed that patients whose infectious episodes followed transfusion had higher Acute Physiology and Chronic Health Evaluation II (APACHE II) scores (18.3 +/- 0.3 vs. 10.9 +/- 0.3, p < 0.0001) and higher mortality (26.0% vs. 8.9%, p < 0.0001). Those patients who died received a greater number of transfusions prior to infection than those who lived (7.7 +/- 0.8 vs. 4.5 +/- 0.3, p = 0.0002).
Multiple logistic regression analysis revealed that the propensity score for prior transfusion independently predicted mortality (odds ratio 9.45, 95% confidence interval 3.71-24.09, p < 0.001), but that neither the presence or absence of transfusion (OR 0.89, 95% CI 0.54-1.46, p = 0.6) nor the absolute number of units of red blood cells or platelets transfused (OR for red blood cells 1.00, 95% CI 0.98-1.02, p = 0.7; OR for platelets 0.97, 95% CI 0.90-1.05, p = 0.5) independently predicted mortality. CONCLUSIONS The transfusion of packed red blood cells or platelets prior to infection is associated with more severe disease among surgical patients, but once corrected for treatment selection bias does not appear to worsen outcomes from infection.
93
Hughes MG, Rudy CK, Chong TW, Smith RL, Evans HL, Iezzoni JC, Sawyer RG, Pruett TL. E2 quasispecies specificity of hepatitis C virus association with allografts immediately after liver transplantation. Liver Transpl 2004; 10:208-16. [PMID: 14762858 DOI: 10.1002/lt.20060]
Abstract
It is unknown whether all hepatitis C virus (HCV) quasispecies variants found within patient serum have equal capacity to associate with the liver after transplantation; however, in vitro models of HCV infection suggest that variations in the hypervariable region 1 (HVR1) of the second envelope protein (E2) may be important in infectivity. The hypothesis of the current study is that the two hypervariable regions (HVR1 and HVR2) within E2 are important in the initial virus-liver interaction, and, therefore, certain HCV quasispecies variants will be isolated from the liver after reperfusion. In 8 patients with end-stage liver disease secondary to HCV infection, HCV envelope quasispecies were determined from intraoperative serum samples obtained before the anhepatic phase of transplantation and from liver biopsies 1.5 to 2.5 hours after the transplanted liver was perfused. Explanted (native) liver biopsies were taken as a control. Sequence analysis was performed on clones of specific HCV reverse transcriptase-polymerase chain reaction products spanning HVR1 and HVR2 of the E2 protein. HVR1 was more variable than HVR2 for all samples. Quasispecies isolated from postperfusion liver differed more from serum than did explanted liver quasispecies at HVR1 (P = 0.03) but not at HVR2 (P = 0.2). Comparison of HVR1 sequences from postperfusion liver versus serum revealed significantly less HVR1 genetic complexity and diversity (P = 0.02 and P = 0.04, respectively). Immediately after transplantation but before actual infection, liver allografts select out from the infecting serum inoculum a less heterogeneous, more closely related population of quasispecies variants.
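Quasispecies "complexity" is commonly quantified with Shannon entropy over clone frequencies (often normalized by the log of the number of distinct variants). The abstract above does not specify its exact metric, so both the metric choice and the clone counts below are purely illustrative, not the study's data.

```python
import math

def shannon_entropy(counts, normalize=False):
    """Shannon entropy of a clone-frequency distribution.

    counts: observations of each distinct sequence variant.
    normalize=True divides by ln(number of variants), mapping to [0, 1].
    """
    n = sum(counts)
    freqs = [c / n for c in counts if c > 0]
    h = -sum(f * math.log(f) for f in freqs)
    if normalize and len(freqs) > 1:
        h /= math.log(len(freqs))
    return h

# Hypothetical clone counts from two compartments (not the study's data):
serum_counts = [5, 3, 1, 1]   # more heterogeneous population
liver_counts = [8, 2]         # fewer, more closely related variants

print(round(shannon_entropy(serum_counts), 3))  # 1.168
print(round(shannon_entropy(liver_counts), 3))  # 0.5
```

A lower entropy for the liver-derived population would correspond to the reduced complexity the study reports for postperfusion allografts.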
94
Hughes MG, Evans HL, Chong TW, Smith RL, Raymond DP, Pelletier SJ, Pruett TL, Sawyer RG. Effect of an intensive care unit rotating empiric antibiotic schedule on the development of hospital-acquired infections on the non–intensive care unit ward. Crit Care Med 2004; 32:53-60. [PMID: 14707559 DOI: 10.1097/01.ccm.0000104463.55423.ef]
Abstract
OBJECTIVE We have previously shown that a rotating empirical antibiotic schedule could reduce infectious mortality in an intensive care unit (ICU). We hypothesized that this intervention would decrease infectious complications in the non-ICU ward to which these patients were transferred. DESIGN Prospective cohort study. SETTING An ICU and the ward to which the ICU patients were transferred at a university medical center. PATIENTS All patients treated on the general, transplant, or trauma surgery services who developed hospital-acquired infection while on the non-ICU wards. INTERVENTIONS A 2-yr study consisting of 1 yr of non-protocol-driven antibiotic use and 1 yr of quarterly rotating empirical antibiotic assignment for patients treated in the ICU from which a portion of the patients were transferred. MEASUREMENTS AND MAIN RESULTS There were 2,088 admissions to the non-ICU wards during the nonrotation year and 2,183 during the ICU rotation year. Among these patients, 407 hospital-acquired infections were treated during the nonrotation year and 213 during the ICU rotation year (19.7 vs. 9.8 infections/100 admissions, p <.0001). During the ICU rotation year, a decrease in the rate of resistant Gram-positive and resistant Gram-negative infections on the non-ICU wards occurred (2.5 vs. 1.6 infections/100 admissions, p =.04; 1.0 vs. 0.4 infections/100 admissions, p =.03). Subgroup analysis revealed that the decrease in resistant infections on the wards was due to a reduction in resistant Gram-positive and resistant Gram-negative infections among non-ICU ward patients admitted initially from areas other than the ICU implementing the antibiotic rotation (e.g., home, other ward, or a different ICU) (1.8 vs. 0.5 infections/100 admissions, p =.0001; 0.7 vs. 0.2 infections/100 admissions, p =.02), not due to differences for those transferred to the ward from the rotation ICU (10.4 vs. 9.7 infections/100 admissions, p = 1.0; 4.3 vs. 1.9 infections/100 admissions, p =.3).
No differences in infection-related mortality were detected. CONCLUSIONS An effective rotating empirical antibiotic schedule in an ICU is associated with a reduction in infectious morbidity (hospital-acquired and resistant hospital-acquired infection rates) on the non-ICU wards to which patients are transferred.
95
Evans HL, Shaffer MM, Hughes MG, Smith RL, Chong TW, Raymond DP, Pelletier SJ, Pruett TL, Sawyer RG. Contact isolation in surgical patients: a barrier to care? Surgery 2003; 134:180-8. [PMID: 12947316 DOI: 10.1067/msy.2003.222]
Abstract
BACKGROUND Contact isolation is commonly used to prevent transmission of resistant organisms. We hypothesized that contact isolation negatively impacts the amount of direct patient care. METHODS For 2 hours per day over a 5-week period, a single observer recorded provider/patient contact in adjacent isolated and nonisolated patient rooms on both the surgical intensive care unit (ICU) and surgical wards of a university hospital. Number of visits, contact time, and compliance with isolation were recorded, as was illness severity as assessed by APACHE II score. RESULTS Isolated patients were visited fewer times than nonisolated patients (5.3 vs 10.9 visits/h, P <.0001) and had less contact time overall (29 +/- 5 vs 37 +/- 3 min/h, P =.008), in the ICU (41 +/- 10 vs 47 +/- 5 min/h, P =.03), and on the floor (17 +/- 3 vs 28 +/- 4 min/h, P =.039), in spite of higher mean APACHE II scores in the isolated group (10.1 +/- 1.0 vs 7.6 +/- 0.8, P =.05). Among floor patients with APACHE II scores greater than 10, patients in the isolated group had nearly 40% less contact time per hour than patients in the nonisolated group (19 +/- 4 vs 34 +/- 7 min/h, P =.05). CONCLUSIONS Because of the significantly lower contact time observed, particularly among the most severely ill of floor patients, we propose a reexamination of the risk-benefit ratio of this infection control method.
96
Evans HL, Krag DN, Teates CD, Patterson JW, Meijer S, Harlow SP, Tanabe KK, Loggie BW, Whitworth PW, Kusminsky RE, Carp NZ, Gadd MA, Slingluff CL. Lymphoscintigraphy and sentinel node biopsy accurately stage melanoma in patients presenting after wide local excision. Ann Surg Oncol 2003; 10:416-25. [PMID: 12734091 DOI: 10.1245/aso.2003.05.009]
Abstract
BACKGROUND Patients have traditionally been considered candidates for sentinel node biopsy (SNBx) only at the time of wide local excision (WLE). We hypothesized that patients with prior WLE may also be staged accurately with SNBx. METHODS Seventy-six patients, including 18 patients from the University of Virginia and 58 from a multicenter study of SNBx led by investigators at the University of Vermont, who had previous WLE for clinically localized melanoma underwent lymphoscintigraphy with SNBx. Median follow-up time was 38 months. RESULTS Intraoperative identification of at least 1 sentinel node was accomplished in 75 patients (98.6%). The mean number of sentinel nodes removed per patient was 2.0. Eleven patients (15%) had positive sentinel nodes. Among the 64 patients with negative SNBx, 3 (4%) developed nodal recurrences in a sentinel node-negative basin simultaneous with systemic metastasis, and 1 (1%) developed an isolated first recurrence in a lymph node. CONCLUSIONS This multicenter study more than doubles the published experience with SNBx after WLE and provides much-needed outcome data on recurrence after SNBx in these patients. These outcomes compare favorably with the reported literature for patients with SNBx at the time of WLE, suggesting that accurate staging of the regional lymph node bed is possible in patients after WLE.
97
Raymond DP, Pelletier SJ, Crabtree TD, Evans HL, Pruett TL, Sawyer RG. Impact of antibiotic-resistant Gram-negative bacilli infections on outcome in hospitalized patients. Crit Care Med 2003; 31:1035-41. [PMID: 12682469 DOI: 10.1097/01.ccm.0000060015.77443.31]
Abstract
OBJECTIVE The impact of resistant (vs. nonresistant) Gram-negative infections on mortality remains unclear. We sought to define risk factors for and excess mortality from these infections. DESIGN Prospective cohort study. SETTING Inpatient surgical wards at a university hospital. PATIENTS All patients in the general, transplant, and trauma surgery services diagnosed with Gram-negative rod (GNR) infection. MEASUREMENTS AND MAIN RESULTS All culture-proven GNR infections (n = 924) from December 1996 to September 2000 were studied. Characteristics and outcomes were compared between GNR infections with and without antibiotic resistance. Univariate and logistic regression analysis identified factors associated with antibiotic-resistant GNR (rGNR) infection and mortality. rGNR infection (n = 203) was associated with increased Acute Physiology and Chronic Health Evaluation (APACHE) II scores (17.8 +/- 0.5), multiple comorbidities, pneumonia and catheter infection, coexistent infection with antibiotic-resistant Gram-positive cocci and fungi, and high mortality (27.1%). Only seven isolates were resistant in vitro to all available antibiotics. Logistic regression demonstrated that rGNR infection was an independent predictor of mortality (odds ratio, 2.23; 95% confidence interval, 1.35-3.67; p =.002). Analysis of rGNR infection with controls matched by organism, age, APACHE II score, and site of infection, however, revealed that antibiotic resistance was not associated with increased mortality (23.6% vs. 29.2%, p =.35). Furthermore, analysis of all Pseudomonas aeruginosa infections demonstrated no significant difference in mortality between resistant and sensitive strains (18.9% vs. 20.0%, p =.85). CONCLUSION rGNRs are associated with prolonged hospital stay and increased mortality. 
Infection with rGNRs independently predicts mortality; however, this may be more closely related to selection of certain bacterial species with a high frequency of resistance rather than actual resistance to antibiotic therapy. Therefore, altering infection-control practices to limit the dissemination of certain bacterial species may be more effective than attempts to control only antibiotic-resistant isolates.
98
Evans HL, Raymond DP, Pelletier SJ, Crabtree TD, Pruett TL, Sawyer RG. Tertiary peritonitis (recurrent diffuse or localized disease) is not an independent predictor of mortality in surgical patients with intraabdominal infection. Surg Infect (Larchmt) 2003; 2:255-63; discussion 264-5. [PMID: 12593701 DOI: 10.1089/10962960152813296]
Abstract
BACKGROUND It is well documented that tertiary peritonitis is associated with different microbiological flora and worse outcomes than secondary peritonitis. It is unknown, however, if these differences can be explained simply by the nosocomial nature of tertiary peritonitis and underlying severity of illness. METHODS We reviewed all episodes of intraabdominal infection on the inpatient surgical services at a university hospital over a 46-month period. Univariate analysis and logistic regression were used to compare 91 episodes of secondary peritonitis that progressed to tertiary peritonitis (recurrent diffuse or localized intraabdominal infection) to all episodes of secondary peritonitis (n = 453) to identify predictors for developing tertiary peritonitis. Logistic regression was also used to identify predictors of mortality among patients with secondary (n = 473) or tertiary peritonitis (n = 129). RESULTS Of 602 episodes of intraabdominal infection identified, there were 473 episodes of secondary peritonitis, including 20 patients who died within seven days of diagnosis. A total of 129 episodes of tertiary peritonitis were identified, of which 91 were preceded by a single episode of secondary peritonitis, and 38 were preceded by an episode of secondary peritonitis and at least one prior episode of tertiary peritonitis. Tertiary peritonitis was associated with a high APACHE II score (14.9 +/- 0.7), pancreatic or small bowel source, drainage only at initial intervention, gram-positive and fungal pathogens, and a high mortality rate (19%). Increasing APACHE II score (OR 1.07, 95% CI 1.03-1.16, p = 0.0009) independently predicted progression from secondary to tertiary peritonitis while increasing age (OR 0.98, 95% CI 0.97-0.99, p = 0.01) and appendiceal source (OR 0.12, 95% CI 0.02-0.68, p = 0.02) predicted non-progression to tertiary peritonitis. 
Independent predictors of mortality in this population included increasing age (OR 1.06, 95% CI 1.03-1.1, p < 0.001), increasing APACHE II score (OR 1.18, 95% CI 1.11-1.3, p < 0.001), and four comorbidities: cerebrovascular disease (OR 4.3, 95% CI 1.4-13.1, p = 0.01), malignant disease (OR 2.9, 95% CI 1.3-6.5, p = 0.01), hemodialysis dependency (OR 3.8, 95% CI 1.3-11.2, p = 0.02), and liver disease (OR 4.2, 95% CI 1.6-15.1, p = 0.03). Tertiary peritonitis was not an independent predictor of mortality. CONCLUSIONS We were unable to demonstrate that tertiary peritonitis, compared with secondary peritonitis, is a significant independent predictor of mortality when other variables are taken into account. This suggests that the high mortality associated with tertiary peritonitis is more a function of the patient population in which it occurs than of the severity of the pathologic process itself.
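As a reading aid for the per-unit odds ratios above: a logistic-regression OR of 1.18 per APACHE II point compounds multiplicatively across points, so a 10-point difference corresponds to roughly five times the odds of death. The per-point OR is taken from the abstract; the 10-point comparison is an illustration, not a figure from the study.

```python
import math

or_per_point = 1.18            # APACHE II odds ratio reported in the abstract
beta = math.log(or_per_point)  # underlying logistic-regression coefficient

# OR for a 10-point APACHE II difference: exp(10 * beta) == or_per_point ** 10
or_10_points = math.exp(10 * beta)
print(round(or_10_points, 1))  # 5.2
```

The same exponentiation applies to any covariate: the reported OR is exp(beta) for a one-unit change, and a k-unit change scales the odds by OR**k.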
99
Evans HL, Sawyer RG. Cycling chemotherapy: A promising approach to reducing the morbidity and mortality of nosocomial infections. Drugs Today (Barc) 2003; 39:733-8. [PMID: 14586487 DOI: 10.1358/dot.2003.39.9.799480]
Abstract
In the past several decades, nosocomial infections have emerged as one of the most serious contributors to hospital morbidity and mortality, particularly amongst patients who require intensive care. Resistant organisms, both Gram-negative and Gram-positive, are now to blame for a significant portion of hospital-acquired infections. Efforts to prevent nosocomial infection historically focused on infection control measures, such as patient isolation. However, numerous reports of the increasing prevalence of antibiotic resistance, and of the dramatic negative impact of resistant infections in terms of both patient outcomes and attributable costs, demand new methods to halt this growing epidemic. The increasing threat of resistance may be attributed in part to the widespread and increasingly inappropriate use of antimicrobials, which inadvertently exert sufficient selective pressure on the hospital (and now community) environment to allow the preferential selection of resistant microbes. This idea of selective antibiotic pressure is supported by data showing volume of antibiotic use and inadequate antimicrobial coverage to be risk factors for increased morbidity and mortality. Accordingly, the focus of nosocomial infection control has now largely shifted towards the judicious use of antibiotic therapy. There have been numerous attempts to curtail antibiotic usage through various forms of antibiotic stewardship: formulary restriction, computerized decision support, and abbreviated-course empiric therapy. Aside from the inherent difficulty of effecting change in physician practice, we are burdened, particularly in the setting of empiric therapy, with the need to balance adequate therapy for the individual against prudent drug selection, so as not to endanger other patients in the environment through the selection of resistant organisms. Cycling chemotherapy for empiric treatment of suspected infection is a method uniquely designed to address these challenges.