1. Elderly Patients Benefit From Enhanced Recovery Protocols After Colorectal Surgery. J Surg Res 2021; 266:54-61. [PMID: 33984731] [DOI: 10.1016/j.jss.2021.01.050]
Abstract
BACKGROUND Enhanced recovery after surgery (ERAS) protocols aim to decrease the physiological stress response to surgery and maintain postoperative physiological function. Proponents of ERAS state these protocols decrease lengths of stay (LOS) and complication rates. Our aim was to assess whether elderly patients receive the same benefit from ERAS protocols as younger patients. METHODS We queried patients from 2015 to 2017 at our institution with Enhanced Recovery In NSQIP (ERIN) variables from the targeted colectomy NSQIP database. The patients were divided into sextiles and analyzed for readmission, LOS, return of bowel function, tolerance of diet, mobilization, and multimodal pain management, comparing the youngest sextile to the oldest sextile. RESULTS Two hundred sixty-two patients (73% colectomies) were enrolled in ERAS. When compared with the youngest sextile (age 19-43.8), the oldest sextile (age 71.4-92.5) had similar readmission rates at 9.8% versus 9.5% (P = 0.87), quicker return of bowel function, average 1.9 d versus 3.7 d (P < 0.01), and tolerated diet sooner, average POD 2.4 versus 5.1 (P < 0.01). There was a slight decrease in the use of multimodal pain management, 88% versus 100% (P = 0.07), but mobilization on POD 1 was slightly better in the elderly at 80% versus 78% (P = 0.76). Elderly patients enrolled in ERAS had an average LOS of 4.9 days versus 7.8 days in the younger patients (P = 0.08). Among elderly non-ERAS patients, average LOS was 14.6 days. CONCLUSION Overall, elderly patients fared the same as or better than the younger cohort on the ERIN variables analyzed. ERAS protocols are beneficial and applicable to elderly patients undergoing colorectal surgery.
2. Hospitalization Costs After Surgery in High-Risk Patients With Early Stage Lung Cancer. Ann Thorac Surg 2018; 105:263-270. [DOI: 10.1016/j.athoracsur.2017.08.038]
3. Does length of intubation before tracheostomy affect intensive care unit length of stay? Oral Surg Oral Med Oral Pathol Oral Radiol 2017; 124:525-528. [PMID: 29097138] [DOI: 10.1016/j.oooo.2017.09.009]
Abstract
OBJECTIVE The purpose of this study was to determine whether length of intubation before tracheotomy (LIT) affects length of stay in the intensive care unit (ICU). STUDY DESIGN This was a retrospective case series of patients who underwent open tracheotomies at Grady Memorial Hospital by the Oral and Maxillofacial Surgery (OMS) service. Medical records were reviewed to document patient demographic characteristics, etiology of ventilator dependence, and complications. The primary predictor variable was LIT, and the primary outcome variable was length of stay in the ICU after tracheotomy. Statistical analysis was performed (significance P < .05). RESULTS There were 115 patients (mean age 54 years) included in the study. The majority received tracheotomies because of prolonged mechanical ventilation secondary to a medical comorbidity. Intraoperative complications were cardiac arrest and difficulty accessing the trachea; the postoperative complication was bleeding. Postoperatively, most patients were discharged from the ICU or weaned off mechanical ventilation within 5 days. The correlation between LIT and ICU stay was not statistically significant, but the trend was positive. CONCLUSIONS The results of this study suggest that patients undergoing an earlier tracheotomy were more likely to have an earlier discharge from the ICU.
4. Operating Room Efficiency in Bariatric Surgery: The Effect of Team Member Experience on Operative Times in Laparoscopic Roux-en-Y Gastric Bypass. Bariatr Surg Pract Patient Care 2017. [DOI: 10.1089/bari.2017.0010]
5. Development and Validation of a Risk Calculator for Renal Complications after Colorectal Surgery Using the National Surgical Quality Improvement Program Participant Use Files. Am Surg 2016. [DOI: 10.1177/000313481608201234]
Abstract
Postoperative acute renal failure is a major cause of morbidity and mortality in colon and rectal surgery. Our objective was to identify preoperative risk factors that predispose patients to postoperative renal failure and renal insufficiency, and subsequently develop a risk calculator. Using the National Surgical Quality Improvement Program Participant Use Files database, all patients who underwent colorectal surgery in 2009 were selected (n = 21,720). We identified renal complications during the 30-day period after surgery. Using multivariate logistic regression analysis, a predictive model was developed. The overall incidence of renal complications among colorectal surgery patients was 1.6 per cent. Significant predictors include male gender (adjusted odds ratio [OR]: 1.8), dependent functional status (OR: 1.5), preoperative dyspnea (OR: 1.5), hypertension (OR: 1.6), preoperative acute renal failure (OR: 2.0), American Society of Anesthesiologists class ≥3 (OR: 2.2), preoperative creatinine >1.2 mg/dL (OR: 2.8), albumin <3.5 g/dL (OR: 1.8), and emergency operation (OR: 1.5). This final model has an area under the curve (AUC) of 0.79 and was validated with similar excellent discrimination (area under the curve: 0.76). Using this model, a risk calculator was developed with excellent predictive ability for postoperative renal complications in colorectal patients and can be used to aid clinical decision-making, patient counseling, and further research on measures to improve patient care.
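The abstract above describes a multivariate logistic model whose adjusted odds ratios can be combined into a patient-level risk estimate. The sketch below is a hypothetical illustration of that mechanic only: the paper does not report the model's intercept, so the intercept used here is a placeholder, and the factor names are illustrative labels rather than the study's actual variable names.

```python
import math

# Adjusted odds ratios reported in the abstract (multivariate logistic model).
ODDS_RATIOS = {
    "male": 1.8,
    "dependent_functional_status": 1.5,
    "preop_dyspnea": 1.5,
    "hypertension": 1.6,
    "preop_acute_renal_failure": 2.0,
    "asa_class_3_or_higher": 2.2,
    "creatinine_over_1_2": 2.8,
    "albumin_under_3_5": 1.8,
    "emergency_operation": 1.5,
}

# NOTE: hypothetical intercept -- the paper does not report one. Chosen only
# so the no-risk-factor probability lands below the 1.6% overall incidence.
INTERCEPT = -5.0

def predicted_risk(factors: set) -> float:
    """Probability of a postoperative renal complication for a patient
    with the given risk factors, via the logistic function."""
    log_odds = INTERCEPT + sum(math.log(ODDS_RATIOS[f]) for f in factors)
    return 1.0 / (1.0 + math.exp(-log_odds))

low = predicted_risk(set())
high = predicted_risk({"male", "asa_class_3_or_higher", "creatinine_over_1_2"})
```

Each additional risk factor adds its log odds ratio to the linear predictor, so risk rises multiplicatively on the odds scale, which is how such calculators typically translate a regression table into a bedside score.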
6. Development and Validation of a Risk Calculator for Renal Complications after Colorectal Surgery Using the National Surgical Quality Improvement Program Participant Use Files. Am Surg 2016; 82:1244-1249. [PMID: 28234192]
7. Solving the Value Equation: Assessing Surgeon Performance Using Risk-Adjusted Quality-Cost Diagrams and Surgical Outcomes. Am J Med Qual 2016; 32:532-540. [DOI: 10.1177/1062860616662704]
Abstract
Quality-cost diagrams have been used previously to assess interventions and their cost-effectiveness. This study explores the use of risk-adjusted quality-cost diagrams to compare the value provided by surgeons by presenting cost and outcomes simultaneously. Colectomy cases from a single institution captured in the National Surgical Quality Improvement Program database were linked to hospital cost-accounting data to determine costs per encounter. Risk adjustment models were developed and observed average cost and complication rates per surgeon were compared to expected cost and complication rates using the diagrams. Surgeons were surveyed to determine if the diagrams could provide information that would result in practice adjustment. Of 55 surgeons surveyed on the utility of the diagrams, 92% of respondents believed the diagrams were useful. The diagrams seemed intuitive to interpret, and making risk-adjusted comparisons accounted for patient differences in the evaluation.
8. Premature T Cell Senescence in Pediatric CKD. J Am Soc Nephrol 2016; 28:359-367. [PMID: 27413076] [DOI: 10.1681/asn.2016010053]
Abstract
An individual's immune function, susceptibility to infection, and response to immunosuppressive therapy are influenced in part by his/her T cell maturation state. Although childhood is the most dynamic period of immune maturation, scant information regarding the variability of T cell maturation in children with renal disease is available. In this study, we compared the T cell phenotype in children with renal failure (n=80) with that in healthy children (n=20) using multiparameter flow cytometry to detect markers of T cell maturation, exhaustion, and senescence known to influence immune function. We correlated data with the degree of renal failure (dialysis or nondialysis), prior immunosuppression use, and markers of inflammation (C-reactive protein and inflammatory cytokines) to assess the influence of these factors on T cell phenotype. Children with renal disease had highly variable and often markedly skewed maturation phenotypes, including CD4/CD8 ratio reversal, increased terminal effector differentiation in CD8+ T cells, reduction in the proportion of naïve T cells, evidence of T cell exhaustion and senescence, and variable loss of T cell CD28 expression. These findings were most significant in patients who had experienced major immune insults, particularly prior immunosuppressive drug exposure. In conclusion, children with renal disease have exceptional heterogeneity in the T cell repertoire. Cognizance of this heterogeneity might inform risk stratification with regard to the balance between infectious risk and response to immunosuppressive therapy, such as that required for autoimmune disease and transplantation.
9. Utility of Prostate Cancer Screening in Kidney Transplant Candidates. J Am Soc Nephrol 2015; 27:2157-2163. [PMID: 26701982] [DOI: 10.1681/asn.2014121182]
Abstract
Screening recommendations for prostate cancer remain controversial, and no specific guidelines exist for screening in renal transplant candidates. To examine whether the use of prostate-specific antigen (PSA)-based screening in patients with ESRD affects time to transplantation and transplant outcomes, we retrospectively analyzed 3782 male patients ≥18 years of age undergoing primary renal transplant evaluation during a 10-year period. Patients were grouped by age per American Urological Association screening guidelines: group 1, patients <55 years; group 2, patients 55-69 years; and group 3, patients >69 years. A positive screening test result was defined as a PSA level >4 ng/ml. We used univariate analysis and Cox proportional hazards models to identify the independent effect of screening on transplant waiting times, patient survival, and graft survival. Screening was performed in 63.6% of candidates, and 1198 candidates (31.7%) received kidney transplants. PSA screening was not associated with improved patient survival after transplantation (P=0.24). However, it did increase the time to listing and transplantation for candidates in groups 1 and 2 who had a positive screening result (P<0.05). Furthermore, compared with candidates who were not screened, PSA-screened candidates had a reduced likelihood of receiving a transplant regardless of the screening outcome (P<0.001). These data strongly suggest that PSA screening for prostate cancer may be more harmful than protective in renal transplant candidates because it does not appear to confer a survival benefit to these candidates and may delay listing and decrease transplantation rates.
10. Abdominal MRI without Enteral Contrast Accurately Detects Intestinal Fibrostenosis in Patients with Inflammatory Bowel Disease. Am Surg 2015. [DOI: 10.1177/000313481508101123]
Abstract
Patients with inflammatory bowel disease (IBD) presenting for surgical evaluation require thorough small bowel surveillance, as it improves accuracy of diagnosis (ulcerative colitis versus Crohn's) and differentiates those who may respond to nonoperative therapy, preserving bowel length. MRI has not been validated conclusively against histopathology in IBD, and most protocols require enteral contrast. This study aimed to 1) evaluate the accuracy of MRI for inflammation, fibrosis, and extraluminal complications and 2) compare MRI without enteral contrast to standard magnetic resonance enterography. Adults with Crohn's disease or ulcerative colitis who underwent abdominal MRI and surgery were retrospectively reviewed. Of 65 patients evaluated, 55 met inclusion criteria. Overall sensitivity and specificity of MRI for disease involvement localized by segment were 93 per cent (95% confidence interval = 89.4-95.0) and 95 per cent (95% confidence interval = 92.3-97.0), respectively (positive predictive value 86%, negative predictive value 98%). Sensitivity and specificity were similar between MRI with and without oral and rectal contrast (96% vs 91% and 99% vs 94%, P > 0.10), as were positive and negative predictive values (85% vs 96%, P = 0.16; 97% vs 99%, P = 0.42). Magnetic resonance is highly sensitive and specific for localized disease involvement and extraluminal abdominal sequelae of IBD. It accurately differentiates patients who have chronic transmural (fibrotic) disease and thus may require an operation from those with acute inflammation, whose symptoms may improve with aggressive medical therapy alone. MRI without contrast had comparable diagnostic yield to standard magnetic resonance enterography.
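Sensitivity and specificity are prevalence-independent, but the predictive values quoted in this abstract are not; PPV and NPV follow from sensitivity, specificity, and the prevalence of diseased segments via Bayes' rule. A minimal sketch of that relationship; the per-segment prevalence is not reported in the abstract, so the 25% figure below is an assumption chosen only because it roughly reproduces the reported predictive values.

```python
def predictive_values(sens: float, spec: float, prevalence: float):
    """Derive (PPV, NPV) from sensitivity, specificity, and disease
    prevalence using expected true/false positive and negative rates."""
    tp = sens * prevalence              # true positives
    fp = (1 - spec) * (1 - prevalence)  # false positives
    tn = spec * (1 - prevalence)        # true negatives
    fn = (1 - sens) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Sensitivity and specificity reported in the abstract; prevalence assumed.
ppv, npv = predictive_values(sens=0.93, spec=0.95, prevalence=0.25)
```

At an assumed 25% prevalence the computed values come out near the reported 86% PPV and 98% NPV, which is an internal-consistency check rather than a result from the paper.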
11. Abdominal MRI without Enteral Contrast Accurately Detects Intestinal Fibrostenosis in Patients with Inflammatory Bowel Disease. Am Surg 2015; 81:1118-1124. [PMID: 26672581]
12. Renal safety of intravenous gadolinium-enhanced magnetic resonance imaging in patients awaiting liver transplantation. Liver Transpl 2015; 21:1340-1346. [PMID: 25786913] [DOI: 10.1002/lt.24118]
Abstract
Renal dysfunction in cirrhosis carries high morbidity and mortality. Given the potential risk of contrast-induced nephropathy associated with the iodinated intravenous contrast used in computed tomography (CT), alternate contrast modalities for abdominal imaging in liver transplant candidates need to be examined. The purpose of this study was to examine the renal safety of magnetic resonance imaging (MRI) with gadolinium in patients awaiting liver transplantation. The study was a retrospective analysis of 352 patients who underwent abdominal MRI with low-dose gadobenate dimeglumine (MultiHance) (0.05 mmol/kg), all with cirrhosis and not on renal replacement therapy, at a single center from 2007 to 2013. For each case, serum creatinine values before and within a few days after the MRI were compared. In addition, the patients were assessed for the development of nephrogenic systemic fibrosis (NSF), a reported complication of gadolinium in chronic kidney disease. The pre-MRI serum creatinine values ranged from 0.36 to 4.86 mg/dL, with 70 patients (20%) having values ≥ 1.5 mg/dL. A comparison of the pre- and post-MRI serum creatinine values did not demonstrate a clinically significant difference (mean change = 0.017 mg/dL; P = 0.38), including in patients with a pre-MRI serum creatinine ≥ 1.5 mg/dL. In addition, no cases of NSF were noted. In conclusion, our findings suggest that MRI with low-dose gadobenate dimeglumine (MultiHance) is a nonnephrotoxic imaging modality in liver transplant candidates, and its use can be cautiously expanded to candidates with concomitant renal insufficiency.
13. Does the amount of time medical students spend in the operating room during the general surgery core clerkship affect their career decision? Am J Surg 2015; 210:167-172. [DOI: 10.1016/j.amjsurg.2014.10.031]
14. Predicting Length of Stay Following Radical Nephrectomy Using the National Surgical Quality Improvement Program Database. J Urol 2015; 194:923-928. [PMID: 25986510] [DOI: 10.1016/j.juro.2015.04.112]
Abstract
PURPOSE Length of stay is frequently used to measure the quality of health care, although its predictors are not well studied in urology. We created a predictive model of length of stay after nephrectomy, focusing on preoperative variables. MATERIALS AND METHODS We used the NSQIP database to evaluate patients older than 18 years who underwent nephrectomy without concomitant procedures from 2007 to 2011. Preoperative factors with univariate significance in relation to actual length of stay were included in a multivariable linear regression model. Backward elimination of nonsignificant variables resulted in a final model that was validated in an institutional external patient cohort. RESULTS Of the 1,527 patients in the NSQIP database, 864 were included in the training cohort after exclusions for concomitant procedures or lack of data. Median length of stay was 3 days in the training and validation sets. Univariate analysis revealed 27 significant variables. Backward selection left a final model including age, laparoscopic vs open approach, and preoperative hematocrit and albumin. For every additional year of age, point decrease in hematocrit, and point decrease in albumin, length of stay increased by 0.7%, 2.5%, and 17.7%, respectively. If an open approach was performed, length of stay increased by 61%. The R² value was 0.256. The model was validated in a 427-patient external cohort, which yielded an R² value of 0.214. CONCLUSIONS Age, preoperative hematocrit, preoperative albumin, and surgical approach have significant effects on length of stay for patients undergoing nephrectomy. Similar predictive models could prove useful in patient education as well as quality assessment.
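Percentage effects per unit change, as reported above, describe a log-linear model: each additional year of age or point drop in hematocrit or albumin multiplies the expected stay by a constant factor. The sketch below shows that mechanic only; the baseline stay and reference patient values are hypothetical, since the abstract does not report the model's intercept or reference levels.

```python
# Multiplicative effects reported in the abstract (log-linear LOS model).
PCT_PER_YEAR_AGE = 0.007        # +0.7% per additional year of age
PCT_PER_POINT_HCT_DROP = 0.025  # +2.5% per point decrease in hematocrit
PCT_PER_POINT_ALB_DROP = 0.177  # +17.7% per point decrease in albumin
OPEN_APPROACH_FACTOR = 1.61     # +61% for an open (vs laparoscopic) approach

# Hypothetical reference patient and baseline LOS, chosen only to
# illustrate how predictions are generated -- not values from the paper.
BASELINE_LOS_DAYS = 3.0
REF_AGE, REF_HCT, REF_ALB = 60, 40.0, 4.0

def predicted_los(age: float, hematocrit: float, albumin: float,
                  open_approach: bool) -> float:
    """Expected length of stay in days under the multiplicative model."""
    los = BASELINE_LOS_DAYS
    los *= (1 + PCT_PER_YEAR_AGE) ** (age - REF_AGE)
    los *= (1 + PCT_PER_POINT_HCT_DROP) ** (REF_HCT - hematocrit)
    los *= (1 + PCT_PER_POINT_ALB_DROP) ** (REF_ALB - albumin)
    if open_approach:
        los *= OPEN_APPROACH_FACTOR
    return los
```

Because the effects compound multiplicatively, a low-albumin patient undergoing an open operation accumulates a much longer predicted stay than either factor alone would suggest.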
15. Medical students impact laparoscopic surgery case time. J Surg Res 2015; 197:277-282. [PMID: 25963166] [DOI: 10.1016/j.jss.2015.04.021]
Abstract
BACKGROUND Medical students (MS) are increasingly assuming active roles in the operating room. Laparoscopic cases offer unique opportunities for MS participation. The aim of this study was to examine associations between the presence of MS in laparoscopic cases and operation time and postoperative complication rates. MATERIALS AND METHODS Data from the American College of Surgeons National Surgical Quality Improvement Program were linked to operative records for nonemergent, inpatient, and laparoscopic general surgery cases at our institution from January, 2009-January, 2013. Cases were grouped into eight distinct procedure categories. Hospital records provided information on the presence of MS. Demographics, comorbidities, intraoperative variables, and postoperative complication rates were analyzed. RESULTS Seven hundred laparoscopic cases were included. Controlling for wound class, procedure group, and surgeon, MS were associated with an additional 28 min of total operative time. The most significant increase occurred between the skin incision and skin closure. No significant association between the presence of MS and postoperative complications was observed. CONCLUSIONS This is the first retrospective analysis to examine the effect of MS presence during laparoscopic procedures. Increase in the operation time associated with the presence of MS should be examined further, to optimize the educational experience without incurring increased cost due to increased operation time.
16. Do medical students in the operating room affect patient care? An analysis of one institution's experience over the past five years. J Surg Educ 2014; 71:817-824. [PMID: 24931415] [DOI: 10.1016/j.jsurg.2014.04.011]
Abstract
BACKGROUND Medical students are active learners in operating rooms during medical school. This observational study seeks to investigate the effect of medical students on operative time and complications. METHODS Data from the American College of Surgeons National Surgical Quality Improvement Program was linked to operative records for nonemergent, inpatient general surgery cases at our institution from 1 January 2009 to 1 January 2013. Cases were grouped into 13 distinct procedure groups. Hospital records provided information on the presence of medical students. Demographics, comorbidities, intraoperative variables, and postoperative complications were analyzed. RESULTS Overall, 2481 cases were included. Controlling for wound class, procedure group, and surgeon, medical students were associated with an additional 14 minutes of operative time. No association between medical students and postoperative complications was observed. CONCLUSIONS The educational benefits gained by the presence of medical students do not appear to jeopardize the quality of patient care.
17. Cost comparison analysis of open versus laparoscopic distal pancreatectomy. HPB (Oxford) 2014; 16:907-914. [PMID: 24931314] [PMCID: PMC4238857] [DOI: 10.1111/hpb.12288]
Abstract
BACKGROUND In comparison with open distal pancreatectomy (ODP), laparoscopic distal pancreatectomy (LDP) is associated with fewer complications and shorter hospital stays, but comparative cost data for the two approaches are limited. METHODS Records of all distal pancreatectomies carried out from January 2009 to June 2013 were reviewed and stratified according to operative complexity. Patient factors and outcomes were recorded. Total variable costs (TVCs) were tabulated for each patient, and stratified by category [e.g. 'floor', 'operating room' (OR), 'radiology']. Costs for index admissions and 30-day readmissions were compared between LDP and ODP groups. RESULTS Of 153 procedures, 115 (70 LDP, 45 ODP) were selected for analysis. The TVC of the index admission was US$3420 less per patient in the LDP group (US$10 480 versus US$13 900; P = 0.06). Although OR costs were significantly greater in the LDP cohort (US$5756 versus US$4900; P = 0.02), the shorter average hospitalization in the LDP group (5.2 days versus 7.7 days; P = 0.01) resulted in a lower overall cost. The total cost of index hospitalization combined with readmission was significantly lower in the LDP cohort (US$11 106 versus US$14 803; P = 0.05). CONCLUSIONS In appropriately selected patients, LDP is more cost-effective than ODP. The increased OR cost associated with LDP is offset by the shorter hospitalization. These data clarify targets for further cost reductions.
18. Video-assisted thoracic surgery lobectomy cost variability: implications for a bundled payment era. Ann Thorac Surg 2014; 97:1686-1692; discussion 1692-1693. [PMID: 24792254] [DOI: 10.1016/j.athoracsur.2014.01.021]
Abstract
BACKGROUND In 2013, the Centers for Medicare and Medicaid Services began its Bundled Payments for Care Improvement Initiative. If payments are to be bundled, surgeons must be able to predict which patients are at risk for more costly care. We aim to identify factors driving variability in hospital costs after video-assisted thoracic surgery (VATS) lobectomy for lung cancer. METHODS Our institutional Society of Thoracic Surgeons data were queried for patients undergoing VATS lobectomy for lung cancer during fiscal years 2010 to 2011. Clinical outcomes data were linked with hospital financial data to determine operative and postoperative costs. Linear regression models were created to identify the impact of preoperative risk factors and perioperative outcomes on cost. RESULTS One hundred forty-nine VATS lobectomies for lung cancer were reviewed. The majority of patients had clinical stage IA lung cancer (67.8%). Median length of stay was 4 days, with 30-day mortality and morbidity rates of 0.7% and 37.6%, respectively. Mean operative and postoperative costs per case were $8,492.31 (±$2,238.76) and $10,145.50 (±$7,004.71), respectively, resulting in an average overall hospital cost of $18,637.81 (±$8,244.12) per patient. Patients with chronic obstructive pulmonary disease and coronary artery disease, as well as postoperative urinary tract infections and blood transfusions, were associated with statistically significant variability in cost. CONCLUSIONS Variability in cost associated with VATS lobectomy is driven by assorted patient and clinical variables. Awareness of such factors can help surgeons implement quality improvement initiatives and focus resource utilization. Understanding risk-adjusted clinical-financial data is critical to designing payment arrangements that include financial and performance accountability, and thus ultimately increasing the value of health care.
19. The allo- and viral-specific immunosuppressive effect of belatacept, but not tacrolimus, attenuates with progressive T cell maturation. Am J Transplant 2014; 14:319-332. [PMID: 24472192] [PMCID: PMC3906634] [DOI: 10.1111/ajt.12574]
Abstract
Tacrolimus impairs allo- and viral-specific T cell responses. Belatacept, a costimulation-based alternative to tacrolimus, has emerged with a paradoxical picture of less complete control of alloimmunity with concomitant impaired viral immunity limited to viral-naïve patients. To reconcile these signatures, bulk population and purified memory and naïve lymphocytes from cytomegalovirus (CMV)-seropositive (n=10) and CMV-seronegative (n=10) volunteers were studied using flow cytometry, interrogating proliferation (carboxyfluorescein succinimidyl ester dilution) and function (intracellular cytokine staining) in response to alloantigens or CMV-pp-65 peptides. As anticipated, T cells from CMV-experienced, but not naïve, individuals responded to pp-65 with a small percentage of their repertoire (<2.5%) consisting predominantly of mature, polyfunctional (expressing interferon gamma, tumor necrosis factor alpha and IL-2) T effector memory cells. Both CMV naïve and experienced individuals responded similarly to alloantigen with a substantially larger percentage of the repertoire (up to 48.2%) containing proportionately fewer polyfunctional cells. Tacrolimus completely inhibited responses of CMV- and allo-specific T cells regardless of their maturation. However, belatacept's effects were decreasingly evident in increasingly matured cells, with minimal effect on viral-specific triple cytokine producers and CD28-negative allo-specific cells. These data indicate that belatacept's immunosuppressive effect, unlike tacrolimus's, wanes on progressively developed effector responses, and may explain the observed clinical effects of belatacept.
|
20
|
Endovascular vs open repair of renal artery aneurysms: outcomes of repair and long-term renal function. J Am Coll Surg 2013; 217:263-9. [PMID: 23769185 DOI: 10.1016/j.jamcollsurg.2013.03.021] [Citation(s) in RCA: 60] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/05/2012] [Revised: 03/14/2013] [Accepted: 03/28/2013] [Indexed: 10/26/2022]
Abstract
BACKGROUND Endovascular treatment (ER) of renal artery aneurysms (RAA) has been widely used recently due to its assumed lower morbidity and mortality compared with open surgery (OS). The purpose of this study was to investigate the outcomes of OS and ER, and compare long-term renal function. STUDY DESIGN Data from 2000 to 2012 were retrospectively collected to identify patients who were treated for RAA in a single institution. Morbidity, mortality, freedom from reinterventions, and renal function were compared between OS and ER for RAA. RESULTS Forty-four RAA repairs were identified in 40 patients (28 women; mean age ± SD, 54 ± 13 years). Twenty RAA (45%) were repaired with OS and 24 RAA (55%) with ER. Mean aneurysm sizes were 2.5 ± 1.5 cm (OS) and 2.2 ± 2.2 cm (ER; p = 0.66). Endovascular repair included coil embolization with or without stent placement in 19 patients (79%) and stent grafts in 4 (17%). Open surgery included excision or aneurysmorrhaphy of the aneurysm in 11 kidneys (55%), graft interposition or bypass in 4 (20%), and nephrectomy in 4 (20%). There was 1 technical failure in each group. Comorbidities were similar between the 2 groups (American Society of Anesthesiologists III-IV: OS, 40%; ER, 58%; p = 0.44). Endovascular repair and OS had equivalent perioperative morbidity (any complication: OS, 15%; ER, 17%; p = 1.0) and no mortality (OS, 0%; ER, 0%). Endovascular repair was associated with shorter hospitalization (OS, 6.3 ± 2.5 days; ER, 2.0 ± 3.4 days; p < 0.001). Mean follow-ups were 21 ± 32 months (OS) and 27 ± 36 months (ER). A 30% reduction in glomerular filtration rate occurred in 12.5% of OS patients and 9.1% of ER patients (p = 1.00). Freedom from reintervention at 12 and 24 months was 82%/82% for OS and 82%/74% for ER (log-rank test, p = 0.23). CONCLUSIONS Endovascular repair of RAA is as safe and effective as open repair in selected patients with appropriate anatomy. There was no difference in decline in renal function between OS and ER.
|
21
|
Evaluation of clinical outcomes of prophylactic versus preemptive cytomegalovirus strategy in liver transplant recipients. Transpl Int 2013; 26:592-600. [PMID: 23590709 DOI: 10.1111/tri.12101] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/10/2012] [Revised: 10/30/2012] [Accepted: 03/18/2013] [Indexed: 12/17/2022]
Abstract
Cytomegalovirus (CMV) is a major cause of morbidity and mortality following solid organ transplantation (SOT). Two strategies, prophylactic and preemptive, have emerged for the prevention of CMV infection and disease after SOT. This retrospective chart review of two liver transplant cohorts (prophylactic and preemptive) compares the clinical impact of transitioning from a prophylactic to a preemptive strategy. The primary outcome is the incidence of CMV viremia at 3 and 6 months post-transplant. Secondary outcomes include: incidence of CMV tissue-invasive disease, acute cellular rejection, leukopenia and neutropenia, opportunistic infection rates, hospital readmission rates, and mortality at 3 and 6 months post-transplant. A total of 109 patients were included in the analysis. The incidence of CMV viremia was 4.9% in the prophylactic cohort versus 50.0% in the preemptive cohort (P < 0.001) at 3 months post-transplant, and 24.6% versus 8.3%, respectively (P = 0.026), at 6 months post-transplant. There were no statistically significant differences in the secondary outcomes between the two cohorts. In conclusion, there is a statistically significant difference in time to onset of CMV viremia; however, the use of either a prophylactic or a preemptive strategy was not associated with significant negative clinical outcomes of CMV.
|
22
|
|
23
|
Impact of surgical care improvement project inf-9 on postoperative urinary tract infections: do exemptions interfere with quality patient care? Arch Surg 2012; 147:946-53. [PMID: 23070409 DOI: 10.1001/archsurg.2012.1485] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Abstract
BACKGROUND The Surgical Care Improvement Project (SCIP) Inf-9 guideline promotes removal of indwelling urinary catheters (IUCs) within 48 hours of surgery. OBJECTIVES To determine whether a correlation exists between SCIP Inf-9 compliance and postoperative urinary tract infection (UTI) rates and whether an association exists between UTI rates and SCIP Inf-9 exemption status. DESIGN Retrospective case-control study. SETTING Southeastern academic medical center. PATIENTS American College of Surgeons National Surgical Quality Improvement Program (NSQIP) and SCIP Inf-9 compliance data were collected prospectively on randomly selected general and vascular surgery inpatients. Monthly UTI rates and SCIP Inf-9 compliance scores were tested for correlation. Complete NSQIP data for all the inpatients with postoperative UTIs were compared with a group of 100 random controls to determine whether an association exists between UTI rates and SCIP Inf-9 exemption status. MAIN OUTCOME MEASURE Postoperative UTI. RESULTS In 2459 patients reviewed, SCIP Inf-9 compliance increased over time, but this was not correlated with improved monthly UTI rates. Sixty-one of the 69 UTIs (88.4%) were compliant with SCIP Inf-9; however, 49 (71.0%) of these were considered exempt from the guideline and, therefore, the IUC was not removed within 48 hours of surgery. Retrospective review of 100 random controls showed a similar compliance rate (84.0%, P = .43) but a lower rate of exemption (23.5%, P < .001). The odds of developing a postoperative UTI were 8 times higher in patients deemed exempt from SCIP Inf-9 (odds ratio [OR], 7.99; 95% CI, 3.85-16.61). After controlling for differences between the 2 groups, the adjusted OR slightly increased (OR, 8.34; 95% CI, 3.70-18.76). CONCLUSIONS Most UTIs occurred in patients deemed exempt from SCIP Inf-9. Although compliance rates remain high, practices are not actually improving. SCIP Inf-9 guidelines should be modified with fewer exemptions to facilitate earlier removal of IUCs.
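As an illustration of how the reported crude odds ratio can be checked, the sketch below rebuilds an approximate 2x2 table from the percentages given in the abstract (49 of 69 UTI cases were exempt; 23.5% of 100 controls were exempt, so the control counts are back-calculated from a rounded percentage and are not the authors' exact data) and recomputes the OR with a Woolf-type confidence interval:

```python
import math

# Reconstructed 2x2 table (counts for controls are back-calculated
# from the reported 23.5% exemption rate, so they are approximate):
#                 exempt   not exempt
# UTI cases         49         20
# controls          23.5       76.5
a, b = 49, 20        # cases: exempt / not exempt
c, d = 23.5, 76.5    # controls: exempt / not exempt (approximate)

odds_ratio = (a * d) / (b * c)            # cross-product odds ratio

# Woolf's method: 95% CI on the log-odds scale
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(round(odds_ratio, 2), round(ci_low, 2), round(ci_high, 2))
```

Because the control counts are reconstructed from a rounded percentage, the result lands close to, but not exactly on, the published OR 7.99 (95% CI, 3.85-16.61).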
|
24
|
Risk factors for 30-day hospital readmission among general surgery patients. J Am Coll Surg 2012; 215:322-30. [PMID: 22726893 DOI: 10.1016/j.jamcollsurg.2012.05.024] [Citation(s) in RCA: 461] [Impact Index Per Article: 38.4] [Reference Citation Analysis] [Abstract] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/23/2012] [Revised: 05/17/2012] [Accepted: 05/17/2012] [Indexed: 02/08/2023]
Abstract
BACKGROUND Hospital readmission within 30 days of an index hospitalization is receiving increased scrutiny as a marker of poor-quality patient care. This study identifies factors associated with 30-day readmission after general surgery procedures. STUDY DESIGN Using standard National Surgical Quality Improvement Project protocol, preoperative, intraoperative, and postoperative outcomes were collected on patients undergoing inpatient general surgery procedures at a single academic center between 2009 and 2011. Data were merged with our institutional clinical data warehouse to identify unplanned 30-day readmissions. Demographics, comorbidities, type of procedure, postoperative complications, and ICD-9 coding data were reviewed for patients who were readmitted. Univariate and multivariate analyses were used to identify risk factors associated with 30-day readmission. RESULTS One thousand four hundred forty-two general surgery patients were reviewed. One hundred sixty-three (11.3%) were readmitted within 30 days of discharge. The most common reasons for readmission were gastrointestinal problem/complication (27.6%), surgical infection (22.1%), and failure to thrive/malnutrition (10.4%). Comorbidities associated with risk of readmission included disseminated cancer, dyspnea, and preoperative open wound (p < 0.05 for all variables). Surgical procedures associated with higher rates of readmission included pancreatectomy, colectomy, and liver resection. Postoperative occurrences leading to increased risk of readmission were blood transfusion, postoperative pulmonary complication, wound complication, sepsis/shock, urinary tract infection, and vascular complications. Multivariable analysis demonstrated that the most significant independent risk factor for readmission was the occurrence of any postoperative complication (odds ratio = 4.20; 95% CI, 2.89-6.13). CONCLUSIONS Risk factors for readmission after general surgery procedures are multifactorial; however, postoperative complications appear to drive readmissions in surgical patients. Taking appropriate steps to minimize postoperative complications will decrease postoperative readmissions.
|
25
|
Locoregional recurrence after mastectomy with immediate transverse rectus abdominis myocutaneous (TRAM) flap reconstruction. Ann Surg Oncol 2012; 19:2679-84. [PMID: 22476750 DOI: 10.1245/s10434-012-2329-z] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/03/2010] [Indexed: 01/29/2023]
Abstract
BACKGROUND The locoregional recurrence (LRR) rate after mastectomy is reported to be similar with immediate reconstruction. We aimed to identify characteristics of LRR after transverse rectus abdominis myocutaneous (TRAM) reconstruction. METHODS We retrospectively reviewed patients undergoing immediate TRAM reconstruction for breast cancer who were diagnosed with LRR. RESULTS We identified 18 LRR (4.6%) among 390 patients who underwent immediate TRAM reconstruction for breast cancer from 1998 to 2008. The median follow-up was 69.2 months. The mean age at the time of mastectomy was 49.5 years. All LRR were detected by physical examination. The LRR occurred in the TRAM subcutaneous tissue in nine patients, in the ipsilateral axillary lymph nodes in five, and in the supraclavicular lymph nodes in four. Of the 18 patients who developed LRR, 14 (77.7%) presented with stage 0-2 and 4 (22.2%) with stage 3 disease at the time of the original mastectomy. The average time to presentation of LRR was 35.8 months after initial mastectomy and reconstruction. For patients who initially presented with stage 3 disease, the average time to LRR was shorter (22.9 months). Nine patients (50.0%) were found to have metastatic disease at the time of the LRR, and 6 (33.3%) died of disease. CONCLUSIONS All TRAM LRR were detected by routine physical examination by the patient or the surgeon. Our findings suggest that routine history and clinical breast examination of the breast reconstructed with a TRAM flap, along with patient self-awareness, are reliable in the diagnosis of LRR.
|
26
|
Diversity of release patterns for jail detainees: implications for public health interventions. Am J Public Health 2011; 101 Suppl 1:S347-52. [PMID: 22039042 PMCID: PMC3222492 DOI: 10.2105/ajph.2010.300004] [Citation(s) in RCA: 42] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/09/2023]
Abstract
OBJECTIVES We sought to develop a detailed description of the variety of jail release patterns and to learn what factors affect the length of stay (LOS). METHODS The main data set for the study came from a biennial Bureau of Justice Statistics survey on felony defendants in large urban counties. RESULTS The median LOS for the felony defendants was 7 days. One quarter of the jails had a median LOS of less than 2 days; median LOS for 75% of the jails was less than 15 days. Median regression showed that male gender, previous arrests, and violent charges were predictive of longer LOS. CONCLUSIONS The diversity in release patterns among jails has not been previously described. A public health intervention feasible in one jail may not be feasible in another because of the heterogeneity of release patterns. Individual inmate characteristics could predict a slower rate of release.
|
27
|
Abstract
Renal transplant recipients require periodic surveillance for immune-based complications such as rejection and infection. Noninvasive monitoring methods are preferred, particularly for children, for whom invasive testing is problematic. We performed a cross-sectional analysis of adult and pediatric transplant recipients to determine whether a urine-based chemokine assay could noninvasively identify patients with rejection among other common clinical diagnoses. Urine was collected from 110 adults and 46 children with defined clinical conditions: healthy volunteers, stable renal transplant recipients, and recipients with clinical or subclinical acute rejection (AR), BK infection (BKI), calcineurin inhibitor (CNI) toxicity, or interstitial fibrosis and tubular atrophy (IFTA). Urine was analyzed using a solid-phase bead-array assay for the interferon gamma-induced chemokines CXCL9 and CXCL10. We found that urine CXCL9 and CXCL10 were markedly elevated in adults and children experiencing either AR or BKI (p = 0.0002), but not in stable allograft recipients or recipients with CNI toxicity or IFTA. The sensitivity and specificity of these chemokine assays exceeded those of serum creatinine. Neither chemokine distinguished between AR and BKI. These data show that urine chemokine monitoring identifies patients with renal allograft inflammation. This assay may be useful for noninvasively distinguishing those allograft recipients requiring more intensive surveillance from those with benign clinical courses.
|
28
|
Abstract
The life expectancy of persons cycling through the prison system is unknown. The authors sought to determine the 15.5-year survival of 23,510 persons imprisoned in the state of Georgia on June 30, 1991. After linking prison and mortality records, they calculated standardized mortality ratios (SMRs). The cohort experienced 2,650 deaths during follow-up, which were 799 more than expected (SMR = 1.43, 95% confidence interval (CI): 1.38, 1.49). Mortality during incarceration was low (SMR = 0.85, 95% CI: 0.77, 0.94), while postrelease mortality was high (SMR = 1.54, 95% CI: 1.48, 1.61). SMRs varied by race, with black men exhibiting lower relative mortality than white men. Black men were the only demographic subgroup to experience significantly lower mortality while incarcerated (SMR = 0.66, 95% CI: 0.58, 0.76), while white men experienced elevated mortality while incarcerated (SMR = 1.28, 95% CI: 1.10, 1.48). Four causes of death (homicide, transportation, accidental poisoning, and suicide) accounted for 74% of the decreased mortality during incarceration, while 6 causes (human immunodeficiency virus infection, cancer, cirrhosis, homicide, transportation, and accidental poisoning) accounted for 62% of the excess mortality following release. Adjustment for compassionate releases eliminated the protective effect of incarceration on mortality. These results suggest that the low mortality inside prisons can be explained by the rarity of deaths unlikely to occur in the context of incarceration and compassionate releases of moribund patients.
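The overall SMR arithmetic in this abstract can be reproduced directly: 2,650 observed deaths were 799 more than expected, giving 1,851 expected deaths, and the usual log-normal (Poisson) approximation for the confidence interval recovers the published 1.38-1.49. A minimal sketch (the CI formula here is the standard approximation, not necessarily the authors' exact method):

```python
import math

# Figures reported for the full cohort: 2,650 observed deaths,
# which were 799 more than expected.
observed = 2650
expected = observed - 799  # 1,851 expected deaths

smr = observed / expected  # standardized mortality ratio = O/E

# Approximate 95% CI via the log-normal (Poisson) approximation:
# SMR * exp(+/- 1.96 / sqrt(observed))
factor = math.exp(1.96 / math.sqrt(observed))
ci_low, ci_high = smr / factor, smr * factor

print(round(smr, 2), round(ci_low, 2), round(ci_high, 2))  # 1.43 1.38 1.49
```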
|
29
|
Pairing HIV-positive prisoners with volunteer life coaches to maintain health-promoting behavior upon release: a mixed-methods needs analysis and pilot study. AIDS Educ Prev 2009; 21:552-69. [PMID: 20030499 DOI: 10.1521/aeap.2009.21.6.552] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/28/2023]
Abstract
Drawing on individuals who volunteer in US prisons to mentor HIV-infected inmates returning to the community may promote successful transitions. Published evaluations of such community linkage programs are scant. Our mixed-methods needs analysis and pilot study entailed interviewing convenience samples of 24 HIV-positive persons recently released from Georgia correctional facilities and 12 potential volunteer mentors. Both releasees and potential mentors were open to the establishment of a mentoring program. Releasees wanted nonjudgmental mentors. Releasees and volunteers had statistically significant differences in marital status, education, current employment, and possession of a driver's license, but not in degree of religious involvement or attitudes toward condom use. A volunteer-staffed program, perhaps more aptly named "life coaching" than mentoring, to help HIV-infected persons transition from prison to the community may be feasible. Success will require adequately trained volunteers and a straightforward program.
|