1. Thoracic irrigation for prevention of secondary intervention after thoracostomy tube drainage for hemothorax: A Western Trauma Association multi-center study. J Trauma Acute Care Surg 2024:01586154-990000000-00744. [PMID: 38764139] [DOI: 10.1097/ta.0000000000004364]
Abstract
BACKGROUND Retained hemothorax (rHTX) requiring intervention occurs in up to 20% of patients who undergo tube thoracostomy (TT) placement for a hemothorax (HTX). Thoracic irrigation at the time of TT placement decreases the need for secondary intervention in this patient group, but those findings are limited by their single-center design. A multi-center study was conducted to evaluate the effectiveness of thoracic irrigation. METHODS A multi-center, prospective, observational study was conducted between June 2018 and July 2023. Eleven sites contributed patients. Patients were included if they had a TT placed for a HTX and were excluded if: age <18 years, TT for pneumothorax, thoracotomy or VATS performed within 6 hours of TT, TT placed >24 hours after injury, TT removed <24 hours after placement, or death within 48 hours. Thoracic irrigation was performed at the discretion of the attending. Each hemithorax was considered separately in bilateral HTX. The primary outcome was secondary intervention for HTX-related complications (rHTX, effusion, or empyema). Secondary intervention was defined as: TT placement, instillation of thrombolytics, VATS, or thoracotomy. Irrigated and non-irrigated hemithoraces were compared using a propensity-weighted analysis with age, sex, mechanism of injury, Abbreviated Injury Scale (AIS) chest, and TT size as predictors. RESULTS 493 patients with 462 treated hemothoraces were included; 123 (25%) had thoracic irrigation at TT placement. There were no significant demographic differences between the cohorts. Fifty-seven secondary interventions were performed, 10 (8%) and 47 (13%) in the irrigated and non-irrigated groups, respectively (p = 0.015). Propensity-weighted analysis demonstrated a reduction in secondary interventions in the irrigated cohort (odds ratio 0.56 (0.34-0.85); p = 0.005). CONCLUSION This Western Trauma Association multi-center study demonstrates a benefit of thoracic irrigation at the time of TT placement for a HTX. Thoracic irrigation reduces the odds of a secondary intervention for rHTX-related complications by 44%. LEVEL OF EVIDENCE Therapeutic Study, Level II.
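A minimal sketch of an inverse-probability-of-treatment-weighted (IPTW) comparison of the kind described above. The column names, the numeric coding of the predictors, and the use of stabilized weights are illustrative assumptions, not details taken from the study:

```python
# Hedged sketch of a propensity-weighted comparison; column names
# (age, sex, mechanism, ais_chest, tt_size, irrigated, secondary_interv)
# are illustrative placeholders, assumed to be numerically coded.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def iptw_odds_ratio(df: pd.DataFrame) -> tuple[float, float]:
    """Return the IPTW-weighted odds ratio and p-value for irrigation."""
    X = sm.add_constant(df[["age", "sex", "mechanism", "ais_chest", "tt_size"]])
    # Propensity model: probability of receiving irrigation given predictors.
    ps = sm.Logit(df["irrigated"], X).fit(disp=0).predict(X)
    # Stabilized inverse-probability-of-treatment weights.
    p_treat = df["irrigated"].mean()
    w = np.where(df["irrigated"] == 1, p_treat / ps, (1 - p_treat) / (1 - ps))
    # Weighted logistic outcome model: secondary intervention on treatment.
    # freq_weights is used here for simplicity; robust (sandwich) standard
    # errors are typical in practice.
    outcome = sm.GLM(
        df["secondary_interv"],
        sm.add_constant(df["irrigated"]),
        family=sm.families.Binomial(),
        freq_weights=w,
    ).fit()
    return np.exp(outcome.params["irrigated"]), outcome.pvalues["irrigated"]
```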
2. Prospective validation of a hospital triage predictive model to decrease undertriage: an EAST multicenter study. Trauma Surg Acute Care Open 2024; 9:e001280. [PMID: 38737811] [PMCID: PMC11086287] [DOI: 10.1136/tsaco-2023-001280]
Abstract
Background Tiered trauma team activation (TTA) allows systems to optimally allocate resources to an injured patient. Target undertriage and overtriage rates of <5% and <35% are difficult for centers to achieve, and performance varies widely. The objective of this study was to optimize and externally validate a previously developed hospital trauma triage prediction model for the need for emergent intervention within 6 hours (NEI-6), an indicator of the need for a full TTA. Methods The model was previously developed and internally validated using data from 31 US trauma centers. Data were collected prospectively at five sites using a mobile application that hosted the NEI-6 model. A weighted multiple logistic regression model was used to retrain and optimize the model using the original data set and a portion of the data from one of the prospective sites. The remaining data from the five sites were reserved for external validation. The area under the receiver operating characteristic curve (AUROC) and the area under the precision-recall curve (AUPRC) were used to assess the validation cohort. Subanalyses were performed for age, race, and mechanism of injury. Results 14,421 patients were included in the training data set and 2,476 patients in the external validation data set across five sites. On validation, the model had an overall undertriage rate of 9.1% and an overtriage rate of 53.7%, with an AUROC of 0.80 and an AUPRC of 0.63. Blunt injury had an undertriage rate of 8.8%, whereas penetrating injury had a rate of 31.2%. For patients aged ≥65 years, the undertriage rate was 8.4%, and for Black or African American patients it was 7.7%. Conclusion The optimized and externally validated NEI-6 model approaches the recommended undertriage and overtriage rates while significantly reducing variability of TTA across centers for blunt trauma patients. The model performs well for populations that traditionally have high rates of undertriage. Level of evidence 2.
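For readers who want to reproduce the kind of validation metrics quoted above, a minimal sketch follows; the variable names and the 0.5 activation threshold are illustrative assumptions, not values from the study:

```python
# Hedged sketch: AUROC, AUPRC, and triage rates at a chosen activation
# threshold. y_true = 1 if the patient truly needed emergent intervention
# (the NEI-6 outcome); y_prob = model-predicted probability of that need.
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score

def triage_metrics(y_true: np.ndarray, y_prob: np.ndarray, thresh: float = 0.5):
    auroc = roc_auc_score(y_true, y_prob)
    auprc = average_precision_score(y_true, y_prob)
    activate = y_prob >= thresh
    # Undertriage: truly positive patients the model did not activate for.
    undertriage = np.mean(~activate[y_true == 1])
    # Overtriage: activations that turned out not to need intervention.
    overtriage = np.mean(y_true[activate] == 0)
    return {"AUROC": auroc, "AUPRC": auprc,
            "undertriage": undertriage, "overtriage": overtriage}
```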
3. Early venous thromboembolism chemoprophylaxis in traumatic brain injury requiring neurosurgical intervention: Safe and effective. Surgery 2024; 175:1439-1444. [PMID: 38388229] [DOI: 10.1016/j.surg.2024.01.026]
Abstract
BACKGROUND Traumatic brain injury patients who require neurosurgical intervention are at the highest risk of worsening intracranial hemorrhage. This subgroup has frequently been excluded from prior research on the timing of venous thromboembolism chemoprophylaxis. This study aims to assess the efficacy and safety of early venous thromboembolism chemoprophylaxis in patients with traumatic brain injuries requiring neurosurgical interventions. METHODS This is a single-center retrospective review (2016-2020) of traumatic brain injury patients requiring neurosurgical intervention admitted to a level I trauma center. Interventions included intracranial pressure monitoring, subdural drain, external ventricular drain, craniotomy, and craniectomy. Exclusion criteria included neurosurgical intervention after chemoprophylaxis initiation, death within 5 days of admission, and absence of chemoprophylaxis. The total population was stratified into Early (≤72 hours after intervention) versus Late (>72 hours after intervention) chemoprophylaxis initiation. RESULTS A total of 351 patients met the inclusion criteria, of whom 204 (58%) had early chemoprophylaxis initiation. Overall, there were no significant differences in baseline and admission characteristics between cohorts. The Early chemoprophylaxis cohort had a significantly lower venous thromboembolism rate (5% vs 13%, P < .001) with no increased risk of worsening intracranial hemorrhage (10% vs 13%, P = .44) or neurosurgical reintervention (8% vs 10%, P = .7). On subgroup analysis, 169 patients required either a craniotomy or a craniectomy before chemoprophylaxis. Among them, the Early chemoprophylaxis cohort had significantly lower venous thromboembolism rates (2% vs 11%, P < .001) with no increase in intracranial hemorrhage (8% vs 11%, P = .6) or repeat neurosurgical intervention (8% vs 10%, P = .77). CONCLUSION Venous thromboembolism prophylaxis initiation within 72 hours of neurosurgical intervention is safe and effective. Further prospective research is warranted to validate these results.
4. Thoracic Cavity Irrigation Prevents Retained Hemothorax and Decreases Surgical Intervention in Trauma Patients. J Trauma Acute Care Surg 2024:01586154-990000000-00673. [PMID: 38523131] [DOI: 10.1097/ta.0000000000004324]
Abstract
INTRODUCTION Retained hemothorax (HTX) is a common complication following thoracic trauma. Small studies demonstrate the benefit of thoracic cavity irrigation at the time of tube thoracostomy for the prevention of retained HTX. We sought to assess the effectiveness of chest irrigation in preventing retained HTX leading to a secondary surgical intervention. METHODS We performed a single-center retrospective study from 2017 to 2021 at a Level I trauma center comparing bedside thoracic cavity irrigation via tube thoracostomy (TT) versus no irrigation. Patients with traumatic HTX were identified from the trauma registry. Exclusion criteria were TT placement at an outside hospital, no TT within 24 hours of admission, thoracotomy or video-assisted thoracoscopic surgery (VATS) prior to or within 6 hours after TT placement, VATS as part of rib fixation or diaphragmatic repair, and death within 96 hours of admission. Bivariate and multivariable analyses were conducted. RESULTS A total of 370 patients met the inclusion criteria, of whom 225 (61%) were irrigated. Irrigated patients were more likely to have suffered a penetrating injury (41% vs 30%, p = 0.03) and less likely to have a flail chest (10% vs 21%, p = 0.01). On bivariate analysis, irrigation was associated with lower rates of VATS (6% vs 19%, p < 0.001) and retained HTX (10% vs 21%, p < 0.001). The irrigated cohort had a shorter TT duration (4 vs 6 days, p < 0.001) and hospital length of stay (7 vs 9 days, p = 0.04). On multivariable analysis, thoracic cavity irrigation was associated with lower odds of VATS (aOR: 0.37, 95%CI: 0.30-0.54), lower odds of retained HTX (aOR: 0.42, 95%CI: 0.25-0.74), and a shorter TT duration (β: -1.58, 95%CI: -2.52 to -0.75). CONCLUSION Our 5-year experience with thoracic irrigation confirms findings from smaller studies that irrigation prevents retained HTX and decreases the need for surgical intervention. LEVEL OF EVIDENCE Level III, Therapeutic/Care Management.
5. A Collaborative Multidisciplinary Trauma Program Improvement Team Improves VTE Chemoprophylaxis Guideline Compliance in Non-Operative Stable TBI. J Trauma Acute Care Surg 2024:01586154-990000000-00646. [PMID: 38437527] [DOI: 10.1097/ta.0000000000004294]
Abstract
BACKGROUND Delays in initiating venous thromboembolism (VTE) prophylaxis in patients with traumatic brain injury (TBI) persist despite guidelines recommending early initiation. We hypothesized that expansion of a Trauma Program Performance Improvement (PI) team would improve compliance with early (24-48 hour) initiation of VTE prophylaxis and decrease VTE events in TBI patients. METHODS We performed a single-center retrospective review of all TBI patients admitted to a Level I trauma center before (2015-2016) and after (2019-2020) the expansion of the Trauma Performance Improvement and Patient Safety (PIPS) team and the creation of trauma process and outcome dashboards. Exclusion criteria included discharge or death within 48 hours of admission, expanding intracranial hemorrhage on CT scan, and a neurosurgical intervention (craniotomy, pressure monitor, or drains) prior to chemoprophylaxis initiation. RESULTS A total of 1,112 patients met the inclusion criteria, of whom 54% (n = 604) were admitted after Trauma PIPS expansion. Following the addition of a dedicated PIPS nurse to the trauma program and the creation of process dashboards, the time from stable CT to VTE prophylaxis initiation decreased (52 hours to 35 hours; p < 0.001) and more patients received chemoprophylaxis 24-48 hours after stable head CT (59% from 36%, p < 0.001). There was no significant difference in time from first head CT to stable CT (9 vs 9 hours; p = 0.15). The Contemporary group had a lower rate of VTE events (1% vs 4%; p < 0.001) with no increase in bleeding events (2% vs 2%; p = 0.97). On multivariable analysis, being in the pre-expansion (Early) cohort was an independent predictor of VTE events (aOR: 3.74; 95%CI: 1.45-6.16). CONCLUSION A collaborative multidisciplinary Trauma PIPS team improves guideline compliance. Initiation of VTE chemoprophylaxis within 24-48 hours of stable head CT is safe and effective. LEVEL OF EVIDENCE Level III, Therapeutic/Care Management.
6. Early career acute care surgeons' priorities and perspectives: A mixed-methods analysis to better understand full-time employment. J Trauma Acute Care Surg 2023; 95:935-942. [PMID: 37418689] [DOI: 10.1097/ta.0000000000004037]
Abstract
BACKGROUND Understanding the expectations of early career acute care surgeons will help clarify the practice and employment models that will attract and retain high-quality surgeons, thereby sustaining our workforce. This study aimed to outline the clinical and academic preferences and priorities of early career acute care surgeons and to better define full-time employment. METHODS A survey on clinical responsibilities, employment preferences, work priorities, and compensation was distributed to early career acute care surgeons in the first 5 years of practice. A subset of agreeable respondents underwent virtual semistructured interviews. Both quantitative and thematic analyses were used to describe current responsibilities, expectations, and perspectives. RESULTS Of 471 surgeons, 167 responded (35%), the majority of whom were assistant professors within the first 3 years of practice (80%). The median desired clinical volume was 24 clinical weeks and 48 call shifts per year, 4 weeks less than their median current clinical volume. Most respondents (61%) preferred a service-based model. The top priorities cited in choosing a job were geography, work schedule, and compensation. Qualitative interviews identified themes related to defining full-time employment, first job expectations and realities, and the often-misaligned system and surgeon. CONCLUSION Understanding the perspectives of early career surgeons entering the workforce is important, particularly in the field of acute care surgery, where no standard workload or practice model exists. The wide variety of expectations, practice models, and schedule preferences may lead to a mismatch between surgeon desires and employment expectations. Consistent employment standards across our specialty would provide a framework for sustainability. LEVEL OF EVIDENCE Prognostic and Epidemiological; Level III.
7. The Variation of Withdrawal of Life Sustaining Therapy in Older Adults With Traumatic Brain Injury. J Surg Res 2023; 291:34-42. [PMID: 37331190] [DOI: 10.1016/j.jss.2023.05.020]
Abstract
INTRODUCTION The decision to withdraw life sustaining treatment (WDLST) in older adults with traumatic brain injury is subject to wide variability, leading to nonbeneficial interventions and unnecessary use of hospital resources. We hypothesized that patient and hospital factors are associated with WDLST and its timing. METHODS All traumatic brain injury patients aged ≥65 years with Glasgow coma scores (GCS) of 4-11 from 2018 to 2019 at level I and II centers were selected from the National Trauma Data Bank. Patients with head abbreviated injury scores of 5-6 or death within 24 h were excluded. Bayesian additive regression tree analysis was performed to identify the cumulative incidence function (CIF) and the relative risks (RR) over time for withdrawal of care, discharge to hospice (DH), and death. Death alone (no WDLST or DH) served as the comparator group for all analyses. A subanalysis of the composite outcome WDLST/DH (defined as end-of-life care), with death (no WDLST or DH) as the comparator cohort, was performed. RESULTS We included 2126 patients, of whom 1957 (57%) underwent WDLST, 402 (19%) died, and 469 (22%) were DH. 60% of patients were male, and the mean age was 80 y. The majority of patients were injured by fall (76%, n = 1644). Patients who were DH were more often female (51% DH versus 39% WDLST), more often had a history of dementia (45% DH versus 18% WDLST), and had a lower admission injury severity score (14 DH versus 18.6 WDLST) (P < 0.001). Compared with those who were DH, those who underwent WDLST had a lower GCS (9.8 versus 8.4, P < 0.001). The CIF of WDLST and DH increased with age, stabilizing by day 3. At day 3, patients aged ≥90 y had an increased RR of DH compared with WDLST (RR 2.5 versus 1.4). As GCS increased, the CIF and RR of WDLST decreased, while the CIF and RR of DH increased (RR on day 3 for GCS 12: WDLST 0.42 versus DH 1.31). Patients at nonprofit institutions were more likely to undergo WDLST (RR 1.15) than DH (RR 0.68). Compared with patients of White race, patients of Black race had a lower RR of WDLST at all time points. CONCLUSIONS Patient and hospital factors influence end-of-life care (WDLST, DH, and death), highlighting the need to better understand this variability to target palliative care interventions and standardize care across populations and trauma centers.
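The study estimated cumulative incidence with Bayesian additive regression trees; as a simpler, hedged illustration of the same competing-risks quantity, the Aalen-Johansen estimator below computes a cumulative incidence function when WDLST, DH, and death compete. Event codes and data are illustrative, not from the study:

```python
# Hedged sketch: cumulative incidence function (CIF) under competing risks
# via the Aalen-Johansen estimator. The paper used BART; this is a simpler
# stand-in for the same quantity, with invented example data.
import pandas as pd
from lifelines import AalenJohansenFitter

# durations: days from admission; event: 0=censored, 1=WDLST, 2=DH, 3=death
df = pd.DataFrame({
    "days":  [1, 2, 3, 3, 4, 5, 6, 7, 8, 10],
    "event": [1, 3, 1, 2, 0, 1, 2, 3, 1, 0],
})
ajf = AalenJohansenFitter()
# CIF for WDLST (event_of_interest=1); DH and death act as competing events.
ajf.fit(df["days"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_)  # cumulative incidence of WDLST over time
```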
8. Japanese encephalitis virus: epidemiology and risk-based surveillance approaches for New Zealand. N Z Vet J 2023; 71:283-294. [PMID: 37621178] [DOI: 10.1080/00480169.2023.2248054]
Abstract
The introduction and subsequent rapid spread of Japanese encephalitis virus genotype IV across all Australian mainland states and the Northern Territory since late 2021 has increased the risk of an incursion of this mosquito-transmitted zoonotic virus disease into New Zealand, with serious implications for both animal and human health. The potential modes of entry are through introduction of infected mosquitoes as hitchhikers on ships or aircraft, windborne transfer of mosquitoes, or arrival of infected reservoir bird species. A competent vector mosquito, Culex quinquefasciatus, is endemic in New Zealand, and other mosquito species may also become involved. If infection becomes established in New Zealand, the scale of transmission may be considerably less than has occurred in Australia because climatic and epidemiological factors are not so favourable. Early evidence of an incursion could come from detection of clinical disease in horses or pigs, or from human cases. Targeted surveillance to confirm or refute indications of an incursion could be undertaken by antibody detection in a number of species. Dogs have been shown to be a particularly valuable sentinel species due to their cohabitation with people and high seroconversion rate. Other novel methods of surveillance could include reverse transcriptase PCR (RT-PCR) on oronasal secretions of pigs. Should evidence of the disease be detected, prompt action would be required to vaccinate at-risk human populations and clarify the epidemiological situation with respect to mammalian hosts and mosquito vector species, including whether a new mosquito species had arrived in the country. Abbreviations: AHL: Animal Health Laboratory; JE: Japanese encephalitis disease; JEV: Japanese encephalitis virus; RT-PCR: Reverse transcriptase PCR.
9. Early Shared Decision-Making for Older Adults with Traumatic Brain Injury: Using Time-Limited Trials and Understanding Their Limitations. Neurocrit Care 2023; 39:284-293. [PMID: 37349599] [DOI: 10.1007/s12028-023-01764-8]
Abstract
Older adults account for a disproportionate share of the morbidity and mortality after traumatic brain injury (TBI). Predicting functional and cognitive outcomes for individual older adults after TBI is challenging in the acute phase of injury. Given that neurologic recovery is possible and uncertain, life-sustaining therapy may be pursued initially, even if for some, there is a risk of survival to an undesired level of disability or dependence. Experts recommend early conversations about goals of care after TBI, but evidence-based guidelines for these discussions or for the optimal method for communicating prognosis are limited. The time-limited trial (TLT) model may be an effective strategy for managing prognostic uncertainty after TBI. TLTs can provide a framework for early management: specific treatments or procedures are used for a defined period of time while monitoring for an agreed-upon outcome. Outcome measures, including signs of worsening and improvement, are defined at the outset of the trial. In this Viewpoint article, we discuss the use of TLTs for older adults with TBI, their potential benefits, and current challenges to their application. Three main barriers limit the implementation of TLTs in these scenarios: inadequate models for prognostication; cognitive biases faced by clinicians and surrogate decision-makers, which may contribute to prognostic discordance; and ambiguity regarding appropriate endpoints for the TLT. Further study is needed to understand clinician behaviors and surrogate preferences for prognostic communication and how to optimally integrate TLTs into the care of older adults with TBI.
10. Undertriage of Severely Injured Trauma Patients. Am Surg 2023; 89:4129-4134. [PMID: 37259503] [DOI: 10.1177/00031348231177939]
Abstract
INTRODUCTION The American College of Surgeons (ACS) delineates trauma team activation (TTA) criteria to identify seriously injured trauma patients in the field. Patients are deemed severely undertriaged (SU), placing them at risk for adverse outcomes, when they do not meet TTA criteria but nonetheless sustain significant injuries (Injury Severity Score [ISS] ≥25). OBJECTIVES To delineate patient demographics, injuries, and outcomes after SU. PARTICIPANTS Trauma patients presenting to our ACS-verified Level 1 trauma center with ISS ≥25 were included (11/2015-03/2022). Transfers and private vehicle transports were excluded. Patients were dichotomized and compared by trauma arrival level: TTA (appropriately triaged, AT) vs routine consults (SU). RESULTS Study criteria were satisfied by 1653 patients: 1375 (83%) AT and 278 (17%) SU. SU patients were older than AT patients (47 vs 36 years, P < .001). SU occurred almost exclusively following blunt trauma (96% vs 71%, P < .001). Injury Severity Score was lower following SU than AT (29 vs 32, P < .001). The most common severe injuries (Abbreviated Injury Scale score [AIS] ≥3) among the SU group were in the chest (n = 179, 64%). SU patients required emergent intubation (n = 34, 12%), surgery (n = 59, 21%), and angioembolization (n = 22, 8%) at high rates. SU mortality was 14% (n = 40). CONCLUSION SU occurred in a substantial proportion of ISS ≥25 patients, predominately following blunt trauma. Severe chest injuries were the most likely to evade capture. Rates of intubation, emergent intervention, and in-hospital mortality were high after SU. Efforts should be made to identify such patients in the field, as they may benefit from TTA.
11. The efficacy of various Enoxaparin dosing regimens in general surgery patients: A systematic review. Surgery 2023:S0039-6060(23)00208-8. [PMID: 37198037] [DOI: 10.1016/j.surg.2023.04.032]
Abstract
BACKGROUND Patients undergoing surgical procedures are at an increased risk of venous thromboembolism events. A fixed Enoxaparin dosing regimen is the standard of care for chemoprophylaxis in most institutions; however, breakthrough venous thromboembolism events are still reported. We aimed to systematically review the literature to determine the ability of various Enoxaparin dosing regimens to achieve adequate prophylactic anti-Xa levels for venous thromboembolism prevention in hospitalized general surgery patients. Additionally, we aimed to assess the correlation between subprophylactic anti-Xa levels and the development of clinically significant venous thromboembolism events. METHODS A systematic review was conducted using major databases from January 1, 1993, to February 17, 2023. Two independent researchers screened titles and abstracts, followed by a full-text review. Articles were included if Enoxaparin dosing regimens were evaluated by anti-Xa levels. Exclusion criteria included systematic reviews, pediatric populations, nongeneral surgery (defined as trauma, orthopedics, plastics, and neurosurgery), and non-Enoxaparin chemoprophylaxis. The primary outcome was peak anti-Xa level measured at steady-state concentration. The risk of bias was assessed using the Risk of Bias in Non-randomized Studies of Interventions (ROBINS-I) tool. RESULTS A total of 6,760 articles were extracted, of which 19 were included in the review. Nine studies included bariatric patients, whereas five studies explored abdominal surgical oncology patients. Three studies assessed thoracic surgery patients, and two studies included patients undergoing "general surgery" procedures. A total of 1,502 patients were included. The mean age was 47 years, and 38% were males. The percentages of patients reaching adequate prophylactic anti-Xa levels were 39%, 61%, 15%, 50%, and 78% across the 40 mg daily, 40 mg twice daily, 30 mg twice daily, weight-tiered, and body mass index-based groups, respectively. The overall risk of bias was low to moderate. CONCLUSION Fixed Enoxaparin dosing regimens do not reliably achieve adequate anti-Xa levels in general surgery patients. Additional research is warranted to assess the efficacy of dosing regimens based on novel physiologic parameters (such as estimated blood volume).
12. Anti-Factor Xa Monitoring of Enoxaparin Thromboembolism Prophylaxis in Emergency General Surgery Patients. J Am Coll Surg 2023:00019464-990000000-00610. [PMID: 37039364] [DOI: 10.1097/xcs.0000000000000709]
Abstract
BACKGROUND Rates of venous thromboembolism (VTE) remain high in emergency general surgery (EGS) patients despite chemical VTE prophylaxis. Emerging literature supports anti-factor Xa (AFXa) monitoring for patients on enoxaparin (LMWH), though a significant knowledge gap remains regarding optimal dosing and monitoring in EGS patients. We hypothesized that standard-dose VTE prophylaxis regimens provide inadequate VTE prophylaxis in EGS patients. STUDY DESIGN A prospective cohort study of all adult EGS patients at a single institution between August 2021 and February 2022 receiving standard-dose LMWH for VTE prophylaxis was performed. AFXa levels were obtained 4 hours following the third dose of enoxaparin, with a target range of 0.3-0.5 IU/mL. Dosing was adjusted as needed, and a repeat AFXa measurement was obtained after the third dose of the adjusted regimen. RESULTS A total of 81 patients underwent AFXa monitoring, the majority (75%) of whom were started on 40 mg LMWH daily. The initial peak AFXa measurement was low in 87.7% of patients (mean 0.16 IU/mL). Of patients who had an initial low AFXa, remained admitted, and underwent dosing adjustment and AFXa reassessment (27%), the majority were adjusted to either 30 mg or 40 mg LMWH twice daily (23.7% and 55%, respectively), with 82% of patients remaining low. There were no significant differences in demographics or body mass index between those with low vs adequate AFXa levels at either initial or subsequent measurement. CONCLUSION Standard LMWH dosing provides inadequate AFXa inhibition for adequate VTE prophylaxis. These findings highlight the importance of ongoing AFXa monitoring and the need to establish clinical protocols to improve VTE prophylaxis in EGS patients.
13. The Geriatric Nutritional Risk Index as a predictor of complications in geriatric trauma patients. J Trauma Acute Care Surg 2022; 93:195-199. [PMID: 35293374] [PMCID: PMC9329178] [DOI: 10.1097/ta.0000000000003588]
Abstract
BACKGROUND Malnutrition is associated with increased morbidity and mortality after trauma. The Geriatric Nutritional Risk Index (GNRI) is a validated scoring system used to predict the risk of complications related to malnutrition in nontrauma patients. We hypothesized that GNRI is predictive of worse outcomes in geriatric trauma patients. METHODS This was a single-center retrospective study of trauma patients 65 years or older admitted in 2019. Geriatric Nutritional Risk Index was calculated based on admission albumin level and ratio of actual body weight to ideal body weight. Groups were defined as major risk (GNRI <82), moderate risk (GNRI 82-91), low risk (GNRI 92-98), and no risk (GNRI >98). The primary outcome was mortality. Secondary outcomes included ventilator days, intensive care unit length of stay (LOS), hospital LOS, discharge home, sepsis, pneumonia, and acute respiratory distress syndrome. Bivariate and multivariable logistic regression analyses were performed to determine the association between GNRI risk category and outcomes. RESULTS A total of 513 patients were identified for analysis. Median age was 78 years (71-86 years); 24 patients (4.7%) were identified as major risk, 66 (12.9%) as moderate risk, 72 (14%) as low risk, and 351 (68.4%) as no risk. Injury Severity Scores and Charlson Comorbidity Indexes were similar between all groups. Patients in the no risk group had decreased rates of death, and after adjusting for Injury Severity Score, age, and Charlson Comorbidity Index, the no risk group had decreased odds of death (odds ratio, 0.13; 95% confidence interval, 0.04-0.41) compared with the major risk group. The no risk group also had fewer infectious complications including sepsis and pneumonia, and shorter hospital LOS and were more likely to be discharged home. CONCLUSIONS Major GNRI risk is associated with increased mortality and infectious complications in geriatric trauma patients. Further studies should target interventional strategies for those at highest risk based on GNRI. LEVEL OF EVIDENCE Prognostic and Epidemiologic; Level III.
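A minimal sketch of the GNRI calculation and the study's risk bands. The published GNRI formula (Bouillanne et al.: 1.489 × albumin in g/L + 41.7 × weight/ideal body weight, with the weight ratio capped at 1) is used here with the Lorentz ideal-body-weight convention; the choice of that convention is an assumption, since the abstract does not specify one:

```python
# Hedged sketch of the GNRI and the study's four risk bands.
def gnri(albumin_g_per_l: float, weight_kg: float, height_cm: float,
         male: bool) -> float:
    # Lorentz formula for ideal body weight: one common convention,
    # assumed here rather than taken from the study.
    ibw = height_cm - 100 - ((height_cm - 150) / (4 if male else 2.5))
    ratio = min(weight_kg / ibw, 1.0)  # capped at 1 when weight >= IBW
    return 1.489 * albumin_g_per_l + 41.7 * ratio

def gnri_risk_band(score: float) -> str:
    """Risk bands as defined in the study."""
    if score < 82:
        return "major risk"
    if score <= 91:
        return "moderate risk"
    if score <= 98:
        return "low risk"
    return "no risk"

# Example: albumin 32 g/L, 60 kg, 170 cm male -> GNRI ~86 -> moderate risk.
print(gnri_risk_band(gnri(32, 60, 170, male=True)))
```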
14. Predicting outcomes after traumatic brain injury: A novel hospital prediction model for a patient reported outcome. Am J Surg 2022; 224:1150-1155. [DOI: 10.1016/j.amjsurg.2022.05.016]
15. A national study defining 1.0 full-time employment in trauma and acute care surgery. J Trauma Acute Care Surg 2022; 92:648-655. [PMID: 34936589] [DOI: 10.1097/ta.0000000000003504]
Abstract
BACKGROUND Trauma and acute care surgery (ACS) staffing models vary widely across the United States, resulting in large discrepancies in staffing, compensation, schedule, and clinical/nonclinical expectations. An urgent need exists to define the clinical, academic, and schedule expectations of full-time employment (FTE) for a trauma and ACS surgeon in the United States. METHODS A survey was distributed to departmental leaders at Level I, II, and III trauma centers across the United States regarding current workload. Variables concerning the responsibilities of surgeons, compensation models, and clinical expectations were collected. This was followed by virtual semistructured interviews of agreeable respondents. A thematic analysis was used to describe current staffing challenges and the "ideal" staffing and compensation models of trauma centers. RESULTS Sixty-eight of 483 division chiefs/medical directors responded (14%), the majority (66%) representing Level I centers. There were differences in clinical responsibilities, elective surgery coverage, and the number of and reimbursement for call shifts. The median description of an FTE was 26 weeks (interquartile range, 13 weeks) with a median of 8 (interquartile range, 8) 12-hour call shifts per month. Level III centers were more likely to perform elective surgery and covered more call shifts, typically from home. In our qualitative interviews, we identified numerous themes, including inconsistent models and staffing of services, surgeon-administration conflict, and elective surgery driven by productivity and desire. CONCLUSION Defining the workload of a full-time trauma and ACS surgeon is nuanced and requires consideration of local volume, acuity, and culture. Between the quantitative and qualitative analyses, a reasonable workload for a 1.0 FTE acute care surgeon at a Level I center is 24 to 28 service weeks per year and four to five in-house calls per month. Nighttime and daytime staffing needs can be divergent and may lead to conflict with administration. Future research should consider the individual surgeon's perspective on the definition of an FTE. LEVEL OF EVIDENCE Prognostic and epidemiological, Level III.
16. Improved Prediction of Older Adult Discharge After Trauma Using a Novel Machine Learning Paradigm. J Surg Res 2021; 270:39-48. [PMID: 34628162] [DOI: 10.1016/j.jss.2021.08.021]
Abstract
BACKGROUND The ability to reliably predict outcomes after trauma in older adults (age ≥65 y) is critical for clinical decision making. Using novel machine-learning techniques, we sought to design a nonlinear, competing-risks paradigm for prediction of older adult discharge disposition following injury. MATERIALS AND METHODS The National Trauma Data Bank (NTDB) was used to identify patients aged 65+ y between 2007 and 2014. Training was performed on an enriched cohort of diverse patients. Factors included age, comorbidities, length of stay, and physiologic parameters to predict in-hospital mortality and discharge disposition (home versus skilled nursing/long-term care facility). Length of stay and discharge status were analyzed via competing-risks survival analysis with Bayesian additive regression trees and a multinomial mixed model. RESULTS The resulting sample size was 47,037 patients. Admission GCS and age were important in predicting mortality and discharge disposition. As GCS decreased, patients were more likely to die (risk ratio increased by an average of 1.4 per 2-point drop in GCS, P < 0.001). As GCS decreased, patients were also more likely to be discharged to a skilled nursing or long-term care facility (risk ratio decreased by 0.08 per 2-point decrease in GCS, P < 0.001). The area under the curve for prediction of discharge home improved to 0.73 in the competing-risks model versus 0.43 in the traditional multinomial mixed model. CONCLUSIONS Prediction of older adult discharge disposition after trauma is improved using machine learning over traditional regression analysis. We confirmed that a nonlinear, competing-risks paradigm enhances prediction on any given hospital day post injury.
17. Field-Triage, Hospital-Triage and Triage-Assessment: A Literature Review of the Current Phases of Adult Trauma Triage. J Trauma Acute Care Surg 2021; 90:e138-e145. [PMID: 33605709] [DOI: 10.1097/ta.0000000000003125]
Abstract
ABSTRACT Despite major improvements in the United States trauma system over the past two decades, prehospital trauma triage remains a significant challenge. Undertriage is associated with increased mortality, and overtriage results in significant resource overuse. The American College of Surgeons Committee on Trauma benchmarks for undertriage and overtriage are not being met. Many barriers to appropriate field triage exist, including the lack of a formal definition for major trauma, the absence of a simple and widely applicable triage model, and inconsistent emergency medical service adherence to triage protocols. Modern trauma triage systems should ideally be based on the need for intervention rather than injury severity. Future studies should focus on identifying the ideal definition for major trauma and creating triage models that can be easily deployed. This narrative review article presents challenges and potential solutions for prehospital trauma triage.
18. Bowel Ischemia Score Predicts Early Operation in Patients With Adhesive Small Bowel Obstruction. Am Surg 2021; 88:205-211. [PMID: 33502222] [DOI: 10.1177/0003134820988820]
Abstract
BACKGROUND Nonoperative management of adhesive small bowel obstruction (SBO) is successful in up to 80% of patients. Current recommendations advocate for computed tomography (CT) scanning in all patients with SBO to supplement surgical decision-making. The hypothesis of this study was that cumulative findings on CT would predict the need for operative intervention in the setting of SBO. METHODS This is an analysis of a retrospectively and prospectively collected adhesive SBO database over a 6-year period. A Bowel Ischemia Score (BIS) was developed based on the Eastern Association for the Surgery of Trauma guidelines' CT findings suggestive of bowel ischemia. One point was assigned for each of the six variables. Early operation was defined as surgery within 6 hours of CT scan. RESULTS Of the 275 patients in the database, 249 (90.5%) underwent CT scan. The operative rate was 28.3%, with a median time from CT to operation of 21 hours (interquartile range 5.2-59.2 hours). Most patients (166/217, 76.4%) with a BIS of 0 or 1 were successfully managed nonoperatively, whereas the majority of those with a BIS of 3 required operative intervention (5/6, 83.3%). The discrimination (area under the receiver operating characteristic curve) of BIS for early surgery, any operative intervention, and small bowel resection was 0.83, 0.72, and 0.61, respectively. CONCLUSION The cumulative signs of bowel ischemia on CT scan represented by BIS, rather than the presence or absence of any single finding, correlate with the need for early operative intervention.
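A minimal sketch of a cumulative CT-findings score of the kind described. The abstract does not name the six EAST findings, so the field names below are hypothetical placeholders:

```python
# Hedged sketch of a one-point-per-finding score like the BIS; the six
# specific CT findings are placeholders, not the study's actual variables.
from dataclasses import dataclass

@dataclass
class CTFindings:
    finding_1: bool = False  # e.g., reduced wall enhancement (placeholder)
    finding_2: bool = False  # e.g., mesenteric edema (placeholder)
    finding_3: bool = False
    finding_4: bool = False
    finding_5: bool = False
    finding_6: bool = False

def bowel_ischemia_score(ct: CTFindings) -> int:
    """One point for each of six CT findings suggestive of ischemia."""
    return sum(vars(ct).values())

# In the series above, a BIS of 0-1 was mostly managed nonoperatively,
# while a BIS of 3 usually preceded operative intervention.
print(bowel_ischemia_score(CTFindings(finding_1=True, finding_3=True)))  # 2
```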
19. mTOR inhibition in COVID-19: A commentary and review of efficacy in RNA viruses. J Med Virol 2020; 93:1843-1846. [PMID: 33314219] [DOI: 10.1002/jmv.26728]
Abstract
In this commentary, we shed light on the role of the mammalian target of rapamycin (mTOR) pathway in viral infections. The mTOR pathway has been demonstrated to be modulated in numerous RNA viruses. Frequently, inhibiting mTOR results in suppression of virus growth and replication. Recent evidence points towards modulation of mTOR in severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection. We discuss the current literature on mTOR in SARS-CoV-2 and highlight evidence in support of a role for mTOR inhibitors in the treatment of coronavirus disease 2019.
20. Immunomodulation in COVID-19. Lancet Respir Med 2020; 8:544-546. [PMID: 32380023] [PMCID: PMC7198187] [DOI: 10.1016/s2213-2600(20)30226-5]
21. Evaluation of post-exposure prophylaxis practices to improve the cost-effectiveness of rabies control in human cases potentially exposed to rabies in southern Bhutan. BMC Infect Dis 2020; 20:203. [PMID: 32143641] [PMCID: PMC7060656] [DOI: 10.1186/s12879-020-4926-y]
Abstract
BACKGROUND Rabies is endemic in southern Bhutan, associated with 1-2 human deaths and high post-exposure prophylaxis (PEP) costs annually. Evaluation of clinicians' management of human cases potentially exposed to rabies could contribute to improving PEP prescribing practices, both to reduce unnecessary costs associated with PEP and to reach the target of zero human deaths due to rabies by 2023. METHODS A cross-sectional survey of 50 clinicians' management of human cases potentially exposed to rabies was conducted in 13 health centers in high-rabies-risk areas of Bhutan during February-March 2016. RESULTS Data were collected on clinicians' management of 273 human cases potentially exposed to rabies. The 50 clinicians comprised health assistants or clinical officers (55%) and medical doctors (45%), with medians of 19, 21, and 2 years' experience, respectively. There was poor agreement between clinicians' rabies risk assessments and an independent assessment of each case based on criteria in the National Rabies Management Guidelines (NRMG). Of the 194 cases for which clinicians recorded a rabies risk category, only 53% were correctly classified when compared with the NRMG. Clinicians were more likely to underestimate the risk of exposure to rabies and appeared to prescribe PEP independently of their risk classification. Male health assistants performed the most accurate risk assessments, while female health assistants performed the least accurate. Clinicians in Basic Health Units performed less accurate risk assessments than those in hospitals. CONCLUSIONS This study highlights important discrepancies between clinicians' management of human cases potentially exposed to rabies and the recommendations in the NRMG. In particular, clinicians were not accurately assessing rabies risk in potentially exposed cases and were not basing their PEP treatment on their risk assessment. This has significant implications for achieving the national goal of eliminating dog-mediated human rabies by 2030 and may result in unnecessary costs associated with PEP. Recommendations to improve clinicians' management of human cases potentially exposed to rabies include: reviewing and updating the NRMG, providing clinicians with regular and appropriately targeted training on rabies risk assessment and PEP prescription, and regularly reviewing clinicians' practices.
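The abstract reports percent agreement between clinicians and the NRMG reference. A chance-corrected statistic such as Cohen's kappa is a common companion measure for this kind of comparison; the sketch below (with invented labels) shows both, and the kappa is an illustrative addition, not a statistic the paper necessarily reports:

```python
# Hedged sketch: percent agreement and Cohen's kappa between clinician
# risk categories and a guideline-based reference. Labels are invented.
from sklearn.metrics import cohen_kappa_score

clinician = ["high", "low", "medium", "low", "high", "low"]
reference = ["high", "medium", "medium", "low", "high", "high"]

percent_agreement = sum(c == r for c, r in zip(clinician, reference)) / len(clinician)
kappa = cohen_kappa_score(clinician, reference)  # chance-corrected agreement
print(f"agreement={percent_agreement:.0%}, kappa={kappa:.2f}")
```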
22. The re-emergence of Mycobacterium bovis infection in brushtail possums (Trichosurus vulpecula) after localised possum eradication. N Z Vet J 2012; 51:73-80. [PMID: 16032303] [DOI: 10.1080/00480169.2003.36343]
Abstract
AIM To examine the spatial and temporal pattern of Mycobacterium bovis (bovine tuberculosis) infection in a population of brushtail possums (Trichosurus vulpecula) after localised possum eradication. METHODS Possums on a 36 ha site were eradicated and re-population from the surrounding area studied using population surveys conducted approximately every 2 months for 40 months from the cessation of eradication activity (month zero), using a capture-release programme. At each trapping session, all possums were examined for clinical signs of tuberculosis. The diagnosis of tuberculosis was confirmed by the isolation of M. bovis, and restriction endonuclease analysis (REA) was used to type the isolates. Infected possums were categorised as residents (present on the site for at least 6 months before diagnosis), range expanders (adult possums which had extended their nearby home ranges to become trappable within the study site), or juvenile immigrants (sub-adult possums which had dispersed into the site from an unknown distance away). This classification was used to identify the location where possums became infected. Capture locations and denning site locations were used to examine the spatial pattern of disease occurrence. RESULTS Thirty cases of tuberculosis were diagnosed among the 370 possums identified on the study site. Four different REA types (Types 2, 3, 8 and 10) were identified. The first two cases of tuberculosis were diagnosed in Month 4, in mature male possums categorised as range expanders, the third case was diagnosed in Month 6 and the fourth case at Month 9. Each of the first four cases was infected with a different REA type. The subsequent temporal pattern of infection was consistent with transmission from range expander cases and dispersing juvenile immigrants to resident possums. Clinical incidence remained low but persistent until the third year, when the incidence of Types 2, 8 and 10 escalated. Type 3 infections showed an earlier incidence peak, but disappeared from the site when the last known case died at Month 20. Of the dispersing juvenile possums entering the site, four became clinically tuberculous and represented a source of re-infection of other possums. CONCLUSIONS Re-emergence of tuberculosis after localised possum eradication was due to the continuing reintroduction of infection in mature and immature diseased possums, and not the survival of M. bovis in the environment.
23. Likely effectiveness of pharmaceutical and non-pharmaceutical interventions for mitigating influenza virus transmission in Mongolia. Bull World Health Organ 2012; 90:264-71. [PMID: 22511822] [DOI: 10.2471/blt.11.093419]
Abstract
OBJECTIVE To assess the likely benefit of the interventions under consideration for use in Mongolia during future influenza pandemics. METHODS A stochastic, compartmental patch model of susceptibility, exposure, infection and recovery was constructed to capture the key effects of several interventions (travel restrictions, school closure, generalized social distancing, quarantining of close contacts, treatment of cases with antivirals, and prophylaxis of contacts) on the dynamics of influenza epidemics. The likely benefit and optimal timing and duration of each of these interventions were assessed using Latin-hypercube sampling techniques, averaging across many possible transmission and social mixing parameters. FINDINGS Timely interventions could substantially alter the time-course and reduce the severity of pandemic influenza in Mongolia. In a moderate pandemic scenario, early social distancing measures decreased the mean attack rate from around 10% to 7-8%. Similarly, in a severe pandemic scenario such measures cut the mean attack rate from approximately 23% to 21%. In both moderate and severe pandemic scenarios, a suite of non-pharmaceutical interventions proved as effective as the targeted use of antivirals. Targeted antiviral campaigns generally appeared more effective in severe pandemic scenarios than in moderate pandemic scenarios. CONCLUSION A mathematical model of pandemic influenza transmission in Mongolia indicated that, to be successful, interventions to prevent transmission must be triggered when the first cases are detected in border regions. If social distancing measures are introduced at this stage and implemented over several weeks, they may have a notable mitigating impact. In low-income regions such as Mongolia, social distancing may be more effective than the large-scale use of antivirals.
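As a hedged illustration of the susceptibility-exposure-infection-recovery dynamics underlying such a model (the paper's version is stochastic, spatially patched, and includes intervention effects), here is a deterministic SEIR skeleton with purely illustrative parameters:

```python
# Hedged sketch: deterministic SEIR dynamics; the attack rate is the final
# recovered fraction. All parameter values are illustrative assumptions.
import numpy as np

def seir(beta: float, sigma: float, gamma: float, n: float,
         i0: float, days: int, dt: float = 0.1) -> np.ndarray:
    """Integrate S, E, I, R with forward Euler; returns daily states."""
    s, e, i, r = n - i0, 0.0, i0, 0.0
    out = []
    for step in range(int(days / dt)):
        new_exposed = beta * s * i / n * dt      # S -> E (transmission)
        new_infectious = sigma * e * dt          # E -> I (end of latency)
        new_recovered = gamma * i * dt           # I -> R (recovery)
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_recovered
        r += new_recovered
        if step % int(1 / dt) == 0:
            out.append((s, e, i, r))
    return np.array(out)

# Illustrative run: R0 ~ 1.5 (beta/gamma), 2-day latency, 3-day infectiousness.
traj = seir(beta=0.5, sigma=0.5, gamma=1 / 3, n=1e6, i0=10, days=180)
print(f"attack rate ≈ {traj[-1, 3] / 1e6:.1%}")
```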
24. Biosecurity risks to New Zealand: Lessons from the South African experience with porcine reproductive and respiratory syndrome. N Z Vet J 2012; 60:84-5. [DOI: 10.1080/00480169.2011.641155]
25. Observations on the haematopoietic hormone (addisin) in pernicious anaemia. Br Med J 2011; 2:1050-1. [PMID: 20777223] [DOI: 10.1136/bmj.2.3753.1050]
27. Risk factors for musculoskeletal injuries of the lower limbs in Thoroughbred racehorses in New Zealand. N Z Vet J 2011; 53:171-83. [PMID: 16012587] [DOI: 10.1080/00480169.2005.36502]
Abstract
AIM To investigate risk factors for injury to musculoskeletal structures of the lower fore- and hind-limbs of Thoroughbred horses training and racing in New Zealand. METHODS A case-control study analysed by logistic regression was used to compare explanatory variables for musculoskeletal injuries (MSI) in racehorses. The first dataset, termed the Training dataset, involved 459 first-occurrence cases of lower-limb MSI in horses in training, and the second, the Starting dataset, comprised a subset of those horses that had started in at least one trial or race in the training preparation that ended with MSI (n=294). All training preparations for horses that did not suffer from MSI for which complete data were available were used in the analyses as controls, and provided 2,181 and 1,639 preparations for the Training and Starting datasets, respectively. Multivariate logistic regression was used to evaluate risk factors, and results were reported as odds ratios (OR) and 95% confidence intervals (CI). RESULTS Horses aged ≥5 years were at higher risk of injury than 2-year-olds. Elevated odds of MSI occurred in horses in the Starting dataset that were training in the 1997-1998 year compared with the 1999-2000 year, and in those horses where trials comprised >20% of all starts in a preparation. Training preparations that ended in winter, and horses in their third or later training preparation, had lower odds of MSI compared with those ending in other seasons or the first preparation, respectively. Reduced odds of MSI were observed in preparations in which starts occurred compared with those that had no starts, and in the Starting dataset, preparations that included more than one start had a reduced likelihood of MSI compared with preparations that had only one start. In the Training dataset, preparations longer than 20 weeks were associated with reduced odds of MSI compared with those shorter than 20 weeks. Cumulative racing distance in the last 30 days of a training preparation was best modelled with linear and quadratic terms. Results indicated that increasing cumulative racing distances were associated with an initial reduction in the odds of MSI that then levelled out and finally appeared to increase again as the explanatory variable continued to increase. The risk of MSI varied significantly between trainers. CONCLUSION This study identified intrinsic (age) and extrinsic risk factors for MSI in training and racing Thoroughbreds in New Zealand. The risk of MSI initially decreased, then increased, as cumulative racing distance increased. Significant variation between trainers indicated management and training methods influence the risk of MSI.
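A minimal sketch of a logistic regression with linear and quadratic terms for cumulative racing distance, mirroring the modelling strategy described; the data and column names are simulated placeholders, not the study's variables:

```python
# Hedged sketch: case-control logistic regression with a quadratic term,
# reported on the odds-ratio scale with 95% CIs. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "msi": rng.integers(0, 2, 500),          # 1 = injured (case)
    "age": rng.integers(2, 9, 500),
    "race_km_30d": rng.uniform(0, 12, 500),  # cumulative racing distance
})
# The quadratic term lets odds fall, flatten, then rise with distance.
model = smf.logit("msi ~ age + race_km_30d + I(race_km_30d ** 2)", df).fit(disp=0)
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())          # 95% CI on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```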
28.
Abstract
In reviewing the literature, no description of a lipemia occurring in relation to simple hemorrhage was found, so that the observation of the phenomenon here recorded would seem to be new. Very high percentages of fat have been found in the blood of diabetics. Fischer's case showed 18.1 per cent total ether extract. Of this very little was free fat (0.0018 gm. potassium hydroxide per gram of fat); iodine absorption was 60.6 per cent.; cholesterin, 2.6 per cent. Chatin's case, cited by Fischer, showed 1.2 per cent. cholesterin, 66.5 per cent. olein, 32.2 per cent. margarin in the fat. Neisser and Derlin in the ether extract of blood from a patient with diabetic coma found 19.7 per cent. fat, with melting point of from 39° to 41° C.; iodine absorption was 53.6 per cent. Javal in a similar case found 25.4 per cent. of fat in ether extract of dry serum (perhaps by Soxhlet method); 21 per cent. of the fat was lecithin. Bleibtreu produced alimentary lipemia in geese by feeding barley and butter. Ether extract of serum showed 6 per cent. of fat. The serum was milky with invisible droplets. Iodine absorption was 57 to 58 per cent. The fat was quite different, chemically, from the fat in the food. Lipemia disappeared a few days after discontinuing the forced feeding. Our experiments suggest, by analogy, the possible occurrence of lipemia in human anemias. In this connection it is of interest to note that we have recently demonstrated a moderate lipemia in a case of marked secondary anemia from hemorrhoids. The emaciation in such cases, as contrasted with the well-recognized conservation of the fat in pernicious anemia, suggests in human pathology a still further analogy which we now have under investigation. The fat in our lipemic rabbits differs from fats described above in its insolubility, as well as in its "constants." The change after precipitation of calcium from the serum suggests that the fat may be present in the serum as a protein-calcium-lecithin combination which is decomposed by decalcifying. While we are not prepared to offer an explanation of the mechanism of this lipemia, it is possible that the great loss of tissue proteins may have some influence on the abnormal fat metabolism. That the fat is derived from the tissues is a fair inference when its occurrence in connection with the loss of weight and the previous disappearance of the body fat are taken into consideration. A more careful study of the lipase in the blood and tissues is desirable. It may be that lowered oxidation following great loss of red cells plays a part.
29. Quantitative risk assessment for the annual risk of exposure to Trichinella spiralis in imported chilled pork meat from New Zealand to Singapore. N Z Vet J 2009; 57:269-77. [PMID: 19802040] [DOI: 10.1080/00480169.2009.58620]
Abstract
AIMS To determine the annual likelihood of exposure to an infectious dose of Trichinella spiralis from consuming pork meat imported from New Zealand into Singapore. METHODS Input values specific for chilled pork meat imported into Singapore from New Zealand were used in a quantitative risk-assessment model. The model, designed to allow any combination of importing and exporting countries, was divided into two components, viz the release assessment and the exposure assessment, the latter estimating the annual risk of exposure to the consumer (ARC). The release assessment estimated the likelihood that a contaminated fresh meat product from New Zealand would arrive at Singapore's border, taking into consideration the prevalence of disease on different types of farms. The exposure assessment determined the likelihood over a year that a person in Singapore would consume one or more servings of imported fresh meat from New Zealand containing ≥1 larva of T. spiralis per gram after preparation for consumption. RESULTS The ARC for offal was 2.41 × 10⁻⁷, below the pre-selected safety threshold of 1.00 × 10⁻⁶. The ARC for lean meat was 2.39 × 10⁻⁵, above the acceptable safety threshold. CONCLUSIONS The study demonstrated that continued routine testing at slaughter is unnecessary for pig offal produced commercially, and provided a model with which to further assess management of the risk of exposure to T. spiralis in lean meat. CLINICAL RELEVANCE The potential of Trichinella species to cause disease in humans is a public health concern, and has created adverse effects on the international trade of fresh lean meat without regard to the surveillance measures employed by particular pork-producing countries.
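A hedged Monte Carlo sketch of the two-stage structure described (a release assessment feeding an exposure assessment, compared against the 1.00 × 10⁻⁶ threshold); every input distribution below is an illustrative assumption, not a value from the study:

```python
# Hedged sketch: two-stage quantitative risk assessment by Monte Carlo.
# Inputs are invented placeholders chosen only to show the structure.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo iterations

# Release assessment: uncertain prevalence and probability a contaminated
# consignment escapes detection and arrives at the border.
prevalence = rng.beta(1, 5000, N)
p_arrives_contaminated = prevalence * rng.uniform(0.1, 0.5, N)

# Exposure assessment: servings per year and per-serving probability that
# an infectious dose survives preparation.
servings_per_year = 50
p_infectious_serving = p_arrives_contaminated * rng.uniform(0.001, 0.01, N)
annual_risk = 1 - (1 - p_infectious_serving) ** servings_per_year

threshold = 1.0e-6  # safety threshold used in the study
print(f"median annual risk: {np.median(annual_risk):.2e}")
print(f"P(risk > threshold): {np.mean(annual_risk > threshold):.2%}")
```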
Collapse
|
30
|
Association between human cases and poultry outbreaks of highly pathogenic avian influenza in Vietnam from 2003 to 2007: a nationwide study. Transbound Emerg Dis 2009; 56:311-20. [PMID: 19548896 DOI: 10.1111/j.1865-1682.2009.01086.x] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
This study quantifies the spatio-temporal association between outbreaks of highly pathogenic avian influenza H5N1 in domestic poultry (n = 3050) and human cases (n = 99) in Vietnam during 2003-2007, using rare events logistic regression. After adjusting for the effect of known confounders, the odds of a human case being reported to authorities increased by a factor of 6.17 [95% confidence interval (CI) 3.33-11.38] and 2.48 (95% CI 1.20-5.13) if poultry outbreaks were reported in the same district 1 week and 4 weeks later, respectively. When jointly considering poultry outbreaks in the same and neighbouring districts, occurrence of poultry outbreaks in the same week, 1 week later, and 4 weeks later increased the odds of a human case by a factor of 2.75 (95% CI 1.43-5.30), 2.56 (95% CI 1.31-5.00) and 2.70 (95% CI 1.56-4.66), respectively. Our study found evidence of different levels of association between human cases and poultry outbreaks in the North and the South of the country. When considering the 9-week interval extending from 4 weeks before to 4 weeks after the week of reporting a human case, in the South poultry outbreaks were recorded in 58% of cases in the same district and 83% of cases in either the same or neighbouring districts, whereas in the North the equivalent results were only 23% and 42%. The strength of the association between human and poultry cases declined over the study period. We conclude that owner reporting of clinical disease in poultry needs to be enhanced by targeted agent-specific surveillance integrated with preventive and other measures, if human exposure is to be minimized.
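As a rough illustration of the modelling approach, the sketch below fits an ordinary logistic regression of simulated weekly human-case indicators on indicators of poultry outbreaks reported one and four weeks later. The data, sample size, and variable names are invented, and the rare-events correction used in the paper is omitted.

```python
# Minimal stand-in for the paper's rare-events logistic regression: a plain
# logit of human-case occurrence on indicators of poultry outbreaks reported
# 1 and 4 weeks after the case week. Data are simulated; the rare-events
# correction applied in the paper is not reproduced here.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 20_000                                 # district-weeks (simulated)
outbreak_lead1 = rng.binomial(1, 0.05, n)  # poultry outbreak 1 week later
outbreak_lead4 = rng.binomial(1, 0.05, n)  # poultry outbreak 4 weeks later

# Simulate rare human cases with odds raised by the outbreak indicators.
logit = -5.0 + np.log(6.17) * outbreak_lead1 + np.log(2.48) * outbreak_lead4
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([outbreak_lead1, outbreak_lead4]))
fit = sm.Logit(y, X).fit(disp=False)
print(np.exp(fit.params[1:]))  # odds ratios; should be near 6.17 and 2.48
```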
Collapse
|
31
|
Re: Re: Analysis of the risk of introduction and spread of porcine reproductive and respiratory syndrome virus through importation of raw pigmeat into New Zealand. N Z Vet J 2008; 56:149-50. [PMID: 18536776 DOI: 10.1080/00480169.2008.36825] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
|
32
|
Descriptive summary of an outbreak of porcine post-weaning multisystemic wasting syndrome (PMWS) in New Zealand. N Z Vet J 2008; 55:346-52. [PMID: 18059655 DOI: 10.1080/00480169.2007.36792] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
CASE HISTORY Investigations were conducted to determine the cause of an acute, multi-farm outbreak of porcine respiratory disease that included diarrhoea and subsequent loss of body condition in affected pigs. A definition for post-weaning multisystemic wasting syndrome (PMWS) including both clinical and pathological features, previously developed for the pig industry in New Zealand, was applied to the current outbreak. In addition to self-reporting by owners of affected farms, local veterinarians, disease and epidemiology consultants, and animal health officials from the Ministry of Agriculture and Forestry (MAF) were involved in conducting farm visits and submission of diagnostic specimens. CLINICAL FINDINGS AND DIAGNOSIS Pathogens known to be endemic in the pig industry in New Zealand as well as likely exotic diseases were excluded as causative agents of the outbreak. Clinical signs including dyspnoea, diarrhoea, and rapid loss of body condition were consistent with the New Zealand case definition for PMWS. Interstitial pneumonia, pulmonary oedema, generalised lymph-node enlargement, and presence of porcine circovirus type 2 (PCV2) inclusion bodies were consistently identified in affected pigs. Classical swine fever virus (CSFv), Porcine reproductive and respiratory syndrome virus (PRRSv), and Influenza virus were ruled out, using molecular and traditional virological techniques. Spread of the disease between farms was hypothesised to be facilitated by locally migrating flocks of black-backed seagulls. The original source of the disease incursion was not identified. DIAGNOSIS Based on the consistent presence of circovirus-associated lesions in lymphoid tissues in combination with generalised enlargement of lymph nodes, histiocytic interstitial pneumonia, clinical wasting, and poor response to antibiotic therapy, a diagnosis of PMWS was made. CLINICAL RELEVANCE PMWS should be considered in the differential diagnoses of sudden onset of respiratory dyspnoea, diarrhoea, and rapid loss of body condition in young pigs in New Zealand pig herds.
Collapse
|
33
|
Analysis of the risk of introduction and spread of porcine reproductive and respiratory syndrome virus through importation of raw pigmeat into New Zealand. N Z Vet J 2007; 55:326-36. [DOI: 10.1080/00480169.2007.36789] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
|
34
|
Decision support systems for monitoring and maintaining health in food animal populations. N Z Vet J 2007; 55:264-72. [DOI: 10.1080/00480169.2007.36780] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
|
35
|
A model (BSurvE) for evaluating national surveillance programs for bovine spongiform encephalopathy. Prev Vet Med 2007; 81:225-35. [PMID: 17517443 DOI: 10.1016/j.prevetmed.2007.03.006] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2005] [Revised: 03/28/2007] [Accepted: 03/28/2007] [Indexed: 11/16/2022]
Abstract
Our BSurvE spreadsheet model estimates the BSE prevalence in a national cattle population, and can be used to evaluate and compare alternative strategies for a national surveillance program. Each individual surveillance test has a point value (based on demographic and epidemiological information) that reflects the likelihood of detecting BSE in an animal of a given age leaving the population via the stated surveillance stream. A target total point value for the country is calculated according to a user-defined design prevalence and confidence level, the number of cases detected in animals born after the selected starting date, and the national adult-herd size. Surveillance tests carried out on different sub-populations of animals are ranked according to the number of points gained per unit cost, and the results can be used in designing alternative surveillance programs.
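A hedged sketch of the points arithmetic described above (not BSurvE itself): one common freedom-from-disease formulation sets the target point sum from the design prevalence and confidence level via a Poisson approximation, and streams are then ranked by points per unit cost. All stream names, point values, and costs below are invented for illustration.

```python
# Hedged sketch of the points logic described above (not the BSurvE code):
# each test earns points proportional to its chance of detecting BSE, a
# national target is set from a design prevalence and confidence level, and
# sub-populations are ranked by points per unit cost.
import math

def target_points(design_prevalence: float, confidence: float) -> float:
    # Poisson approximation commonly used in freedom-from-disease
    # surveillance: accumulate enough expected detections at the design
    # prevalence to reach the stated confidence level.
    return -math.log(1.0 - confidence) / design_prevalence

# Illustrative surveillance streams: (name, points per test, cost per test).
streams = [
    ("fallen stock, 4-9 yr", 0.8, 40.0),
    ("casualty slaughter",   0.4, 25.0),
    ("healthy slaughter",    0.01, 10.0),
]

needed = target_points(design_prevalence=1e-5, confidence=0.95)
print(f"target point sum: {needed:,.0f}")
for name, pts, cost in sorted(streams, key=lambda s: s[1] / s[2], reverse=True):
    print(f"{name}: {pts / cost:.4f} points per unit cost")
```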
Collapse
|
36
|
A model (BSurvE) for estimating the prevalence of bovine spongiform encephalopathy in a national herd. Prev Vet Med 2007; 80:330-43. [PMID: 17507106 DOI: 10.1016/j.prevetmed.2007.03.007] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2005] [Revised: 03/28/2007] [Accepted: 03/28/2007] [Indexed: 10/23/2022]
Abstract
We developed the BSurvE spreadsheet model to estimate the true prevalence of bovine spongiform encephalopathy (BSE) in a national cattle population, and evaluate national BSE surveillance programs. BSurvE uses BSE surveillance data and demographic information about the national cattle population. The proportion of each cohort infected with BSE is found by equating the observed number of infected animals with the number expected, following a series of probability calculations and assuming a binomial distribution for the number of infected animals detected in each surveillance stream. BSurvE has been used in a series of international workshops, where analysis of national datasets demonstrated patterns of cohort infection that were consistent with infection-control activities within the country. The results also reflected the timing of known events that were high-risk for introduction of the infectious agent.
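The estimation step, as described, reduces to a root-finding problem: choose the cohort infection proportion at which the expected number of detections across surveillance streams equals the number observed. The sketch below is a minimal stand-in with invented stream sizes and detection sensitivities, not BSurvE's actual probability calculations.

```python
# Hedged sketch of the estimation idea above: find the infection proportion
# p at which the expected number of detections across surveillance streams
# equals the number observed. Stream counts and sensitivities are invented.
from scipy.optimize import brentq

# (animals tested, P(detect | infected)) for each surveillance stream.
streams = [(200_000, 0.05), (5_000, 0.60), (1_000, 0.90)]
observed_cases = 12

def expected_detections(p: float) -> float:
    # E[detections] under a binomial model in each stream.
    return sum(n * p * sens for n, sens in streams)

p_hat = brentq(lambda p: expected_detections(p) - observed_cases, 1e-9, 0.5)
print(f"estimated cohort prevalence: {p_hat:.2e}")
```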
Collapse
|
37
|
Application of portfolio theory to risk-based allocation of surveillance resources in animal populations. Prev Vet Med 2007; 81:56-69. [PMID: 17509705 DOI: 10.1016/j.prevetmed.2007.04.009] [Citation(s) in RCA: 30] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
Distribution of finite levels of resources between multiple competing tasks can be a challenging problem. Resources need to be distributed across time periods and geographic locations to increase the probability of detection of a disease incursion or significant change in disease pattern. Efforts should focus primarily on areas and populations where risk factors for a given disease reach relatively high levels. In order to target resources into these areas, the overall risk level can be evaluated periodically across locations to create a dynamic national risk landscape. Methods are described to integrate the levels of various risk factors into an overall risk score for each area, to account for the certainty or variability around those measures and then to allocate surveillance resources across this risk landscape. In addition to targeting resources into high risk areas, surveillance continues in lower risk areas where there is a small yet positive chance of disease occurrence. In this paper we describe the application of portfolio theory concepts, routinely used in finance, to design surveillance portfolios for a series of examples. The appropriate level of resource investment is chosen for each disease or geographical area and time period given the degree of disease risk and uncertainty present.
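A minimal sketch of how a mean-variance, portfolio-style allocation might look, assuming each area is summarized by a risk score and a variance around it; the precision weighting and the floor share that keeps low-risk areas under surveillance are design choices of this illustration, not necessarily the paper's method.

```python
# Hedged sketch of the allocation idea above: treat each area as an "asset"
# with an expected risk score and an uncertainty (variance), split a fixed
# surveillance budget precision-weighted, and keep a floor allocation so
# low-risk areas are never left unsurveyed. Numbers are illustrative.
import numpy as np

risk = np.array([0.8, 0.5, 0.2, 0.05])    # periodic risk scores per area
var = np.array([0.10, 0.05, 0.02, 0.01])  # uncertainty around each score
budget = 1.0
floor = 0.05                              # minimum share per area

raw = risk / var                          # precision-weighted attractiveness
w = raw / raw.sum()
w = np.maximum(w, floor)                  # retain surveillance everywhere
w = w / w.sum() * budget                  # renormalise to the budget
print(np.round(w, 3))
```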
Collapse
|
38
|
A description of cattle movements in two departments of Buenos Aires province, Argentina. Prev Vet Med 2006; 76:109-20. [PMID: 16777252 DOI: 10.1016/j.prevetmed.2006.04.010] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2005] [Revised: 04/06/2006] [Accepted: 04/25/2006] [Indexed: 10/24/2022]
Abstract
We present a descriptive analysis of cattle movement information retrieved from the Argentinean animal movement database for two departments in the province of Buenos Aires during 2004. For each quarter of the year (January to March, April to June, July to September, and October to December) we report the number of on- and off-farm movement events for the purpose of finishing. Our analyses show that the distribution of the number of finishing-related movement events per farm was skewed, with the majority of farms reporting at least one, and fewer than 5% of farms reporting more than 15, finishing-related movement events throughout the year. The frequency of finishing-related movement events varied over time, with a 1.2-1.8-fold increase in reported movement events from April to September, compared with the rest of the year. These analyses indicate that cattle movement patterns in these departments are dependent on the relative mix of constituent cattle enterprise types. Departments with a mixture of breeding and finishing enterprises behave as potential recipients and distributors of infectious disease, whereas departments comprised primarily of finishing enterprises are predominantly recipients of infectious disease, rather than distributors. Data integrity audits of the Argentinean animal movement database, on a regular or intermittent basis, should allow the presence of bias in these data to be quantified in greater detail.
Collapse
|
39
|
Simulation analyses to evaluate alternative control strategies for the 2002 foot-and-mouth disease outbreak in the Republic of Korea. Prev Vet Med 2006; 74:212-25. [PMID: 16423417 DOI: 10.1016/j.prevetmed.2005.12.002] [Citation(s) in RCA: 35] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/15/2004] [Revised: 11/10/2005] [Accepted: 12/02/2005] [Indexed: 11/23/2022]
Abstract
Using InterSpread Plus, a stochastic, spatial simulation model of between-farm disease spread, we evaluated the effect of alternative strategies for controlling the 2002 epidemic of foot-and-mouth disease (FMD) in the Republic of Korea. InterSpread Plus was parameterised to simulate epidemics of FMD in the population of farms containing susceptible animal species in the Korean counties of Yongin, Icheon, Pyongtaek, Anseong, Eumseong, Asan, Cheonan, and Jincheon. The starting point of our analyses was the simulation of a reference strategy, which approximated the real epidemic. The results of simulations of alternative epidemic-control strategies were compared with this reference strategy. Ring vaccination (when used with either limited or extended pre-emptive depopulation) reduced both the size and variability of the predicted number of infected farms. Reducing the time between disease incursion and commencement of controls had the greatest effect on reducing the predicted number of infected farms.
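A toy stand-in for this kind of strategy comparison: a non-spatial stochastic branching model of between-farm spread in which controls cut the effective reproduction number after a delay, run many times per strategy. Parameters are invented; the point is only the shape of the comparison, with an earlier response shrinking both the size and the variability of the predicted epidemic.

```python
# Toy stand-in for the InterSpread Plus comparison described above: a simple
# stochastic branching model of between-farm spread, run many times per
# control strategy. The real model is spatial and far more detailed.
import numpy as np

rng = np.random.default_rng(1)

def outbreak_size(r_before: float, r_after: float, delay: int,
                  generations: int = 12) -> int:
    infected, total = 1, 1
    for g in range(generations):
        r = r_before if g < delay else r_after  # controls cut transmission
        infected = rng.poisson(r * infected)
        total += infected
        if infected == 0:
            break
    return total

def summarise(name: str, delay: int, runs: int = 2000) -> None:
    sizes = [outbreak_size(1.8, 0.7, delay) for _ in range(runs)]
    print(f"{name}: median {np.median(sizes):.0f}, "
          f"95th pct {np.percentile(sizes, 95):.0f} infected farms")

summarise("reference (slow response)", delay=4)
summarise("early response", delay=2)
```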
Collapse
|
40
|
Abstract
AIMS A survey of lung lesions and risk factors for respiratory diseases was conducted in order to estimate the prevalence of respiratory diseases in the New Zealand pig population and to identify influential management practices. METHODS Eighty-nine New Zealand pig farms with a minimum herd size of 50 sows participated in the survey, and risk factor data were collected using a mailed questionnaire. Abattoir data were recorded once in winter 1995 and once in summer 1996. A total of 6887 lungs was inspected. RESULTS The prevalence of enzootic pneumonia, pleuropneumonia and pleurisy in winter was 63.4%, 2.7% and 19.1%, respectively. Enzootic pneumonia was significantly less frequent in summer. Pleuropneumonia/pleurisy was found to be more prevalent in the South Island. The univariate risk factor analysis was consistent with earlier published evidence on the importance of environmental factors related to housing and management of the farm. The multivariate models for enzootic pneumonia and pleuropneumonia or pleurisy had a reasonably good predictive power of 81-91% for farms with high disease prevalence. CONCLUSION The results are useful to model the disease process on high-risk farms, which account for a considerable proportion of the New Zealand pig population.
Collapse
|
41
|
Spatial epidemiology of the Asian honey bee mite (Varroa destructor) in the North Island of New Zealand. Prev Vet Med 2005; 71:241-52. [PMID: 16143412 DOI: 10.1016/j.prevetmed.2005.07.007] [Citation(s) in RCA: 16] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
Abstract
We describe the spatial epidemiology of Varroa destructor infestation among honey bee apiaries in the greater Auckland area of the North Island of New Zealand. The study population comprised 641 apiaries located within the boundaries of the study area on 11 April 2000. Cases were those members of the study population declared Varroa-infested on the basis of testing conducted between April and June 2000. The odds of Varroa infestation were highest in apiaries in the area surrounding transport and storage facilities in the vicinity of Auckland International Airport. A mixed-effects geostatistical model, accounting for spatial extra-binomial variation in Varroa prevalence, showed a 17% reduction in the odds of an apiary being Varroa-infested for each kilometre increase in the squared distance from the likely site of incursion (95% Bayesian credible interval 7-28%). The pattern of spatially autocorrelated risk that remained after controlling for the effect of distance from the likely incursion site identified areas thought to be 'secondary' foci of Varroa infestation initiated by beekeeper-assisted movement of infested bees. Targeted investigations within these identified areas indicated that the maximum rate of local spread of Varroa was in the order of 12 km/year (interquartile range 10-15 km/year).
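The fitted distance effect can be read off directly: the model multiplies an apiary's odds of infestation by about 0.83 for each unit increase in squared distance from the likely incursion site. The baseline odds in this worked example are hypothetical.

```python
# Worked reading of the distance effect above: odds multiply by ~0.83 (a 17%
# reduction) per unit increase in squared distance from the incursion site.
# The baseline odds adjacent to the site are a made-up illustration.
import math

OR_PER_UNIT = 0.83
baseline_odds = 2.0  # hypothetical odds at the incursion site itself

for d_km in (0, 1, 2, 3, 5):
    odds = baseline_odds * OR_PER_UNIT ** (d_km ** 2)
    prob = odds / (1.0 + odds)
    print(f"{d_km} km: odds {odds:.3f}, P(infested) {prob:.3f}")
```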
Collapse
|
42
|
Ranging behaviour and duration of survival of wild brushtail possums (Trichosurus vulpecula) infected with Mycobacterium bovis. N Z Vet J 2005; 53:293-300. [PMID: 16220120 DOI: 10.1080/00480169.2005.36563] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
AIM To quantify the duration of survival of possums (Trichosurus vulpecula) infected with Mycobacterium bovis, and identify aspects of their behaviour which may influence the likelihood of disease transmission to domestic stock or wildlife. METHODS Capture and den locations of 14 naturally infected tuberculous possums, eight possums experimentally infected with M. bovis and eight non-infected possums were recorded between May 1998 and February 2000 at a study site near Castlepoint on the Wairarapa coast of the North Island in New Zealand. Denning behaviour was observed weekly using radiotelemetry, and possums were captured, examined and released bi-monthly. Data were used to estimate survival period; create denning, activity, and total ranges; and to identify extended forays by possums as individuals and groups. RESULTS Seventeen tuberculous possum carcasses were recovered, of which 14 (82%) were close to or within their activity range. Denning ranges were known for 10/17 possums that died. Four tuberculous possums were found dead within their denning range. Three possums made extended forays in the 3 weeks before death. Twelve possums were found dead in dense scrub, three in long grass in open woodland and two on pasture. Mean duration of survival of naturally infected possums following detection of clinical signs was 3.4 months (95% CI=2.1-5.4) and the instantaneous mortality rate was 0.293 per month (95% CI=0.184-0.470). Signs of disease were obvious for about 3 weeks prior to death. Tuberculous possums were commonly trapped on only part of the area where the total non-infected population was trapped. CONCLUSION Most tuberculous possums died within their activity range and in scrub, representing a risk of transmission of M. bovis to wildlife and livestock that forage in scrub. Smaller proportions dying on pasture represent a less frequent, but highly visible risk. Tuberculous possums were clustered on the study site, and localised possum control operations would be more effective if focussed on such areas.
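As a worked check of the survival figures, under a simple exponential survival model the instantaneous mortality rate is the reciprocal of the mean survival time, and 1/3.4 months per month agrees closely with the reported 0.293 per month:

```python
# Worked check of the survival figures above: under an exponential survival
# model the hazard is the reciprocal of the mean survival time, matching the
# reported pair (mean 3.4 months, rate 0.293 per month).
import math

mean_survival_months = 3.4
hazard = 1.0 / mean_survival_months
print(f"hazard: {hazard:.3f} per month")  # ~0.294, vs reported 0.293

# Implied survival curve S(t) = exp(-hazard * t):
for t in (1, 2, 3.4, 6):
    print(f"S({t}) = {math.exp(-hazard * t):.2f}")
```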
Collapse
|
43
|
Abstract
All cases of lameness that occurred in cows from three dairy herds between August 1983 and July 1990 were examined every 2 weeks from the onset of lameness until the lesions resolved. The incidences of herd lameness were 38%, 22% and 2%. Some 186 clinical lesions were identified in 134 cases of lameness in 120 cows. Sole bruising (42%) and white line separation (39%) were the most frequently diagnosed conditions. Lateral digits of the hind limbs were the most affected. The mean time from the onset of lameness to clinical recovery was 27 days and to lesion recovery was 35 days. The peak incidence of lameness occurred during winter for autumn-calving cows and during the late spring for spring-calving cows. The onset of lameness was associated with the stage of lactation and wet weather conditions. Survival analysis revealed that the probability of an individual cow lasting in the milking herd for any specified period of time without becoming lame was highly associated with both her herd environment and her age. Total lactation yields of milk, milk fat and milk protein were lower for cows suffering from lameness than for herd-mates matched on age and proximity of calving date (P<0.05). Reproductive performance was also poorer in lame cows than in their herd-mates.
Collapse
|
44
|
Effects of air temperature, air movement and artificial rain on the heat production of brushtail possums (Trichosurus vulpecula): an exploratory study. N Z Vet J 2005; 43:328-32. [PMID: 16031874 DOI: 10.1080/00480169.1995.35914] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
Two groups of six mature brushtail possums (Trichosurus vulpecula) were housed in two respiration chambers, and their heat production, whole-body conductance and lower critical temperatures were measured under a variety of simulated weather patterns. The possums were subjected to ambient temperatures of 30, 20 and 3 degrees C. At 20 and 3 degrees C, the animals were exposed to near-still air and light winds (wind speeds of 0.8 and 6.7 km/h), both with and without simulated rain every 8 hours. The lower critical temperature in near-still air lies between 7 and 10 degrees C. This temperature increases by about 2, 6 and 8 degrees C, respectively, for a wind velocity of 6.7 km/h, simulated rain, and a combination of the two factors. Weather in New Zealand, especially in the cooler part of the year, will often produce conditions below the lower critical temperature of the thermoneutral zone of possums. This will necessitate significant increases in metabolic rate and hence food consumption or mobilisation of body fat reserves, which if not sustainable will result in the death of possums. Field studies have shown that this is often the case in the wild. It is proposed that this stress may be sufficient to decrease the resistance (especially cell-mediated immunity) of some possums and allow acceleration of the disease process in those infected with Mycobacterium bovis.
Collapse
|
45
|
Epidemiology of Mycobacterium bovis infection in feral ferrets (Mustela furo) in New Zealand: I. Pathology and diagnosis. N Z Vet J 2005; 45:140-50. [PMID: 16031974 DOI: 10.1080/00480169.1997.36014] [Citation(s) in RCA: 31] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
Necropsies from 228 ferrets captured from eight areas in the North and South Islands provided material for an investigation into the epidemiology of tuberculosis in feral ferrets. Mycobacterial culture of pooled lymph nodes (retropharyngeal, respiratory and jejunal) showed the prevalence of infection to be much higher than that estimated from gross lesions only. Seventy-three of the 228 animals examined (32%) were diagnosed as tuberculous. Fifty-three culture-positive ferrets and 18 seemingly uninfected animals were subjected to detailed histopathological examination. The outcomes of these investigations, including the characteristics of the disease, distribution of lesions and aids to diagnosis, are presented. Of the feral carnivores found in New Zealand, the disease persists at high prevalence only in ferrets, and is probably maintained principally by ingestion of tuberculous carrion. The course of the disease may be prolonged in some ferrets, but tuberculosis eventually causes the death of many infected animals. Microscopic hepatic granulomas may be considered pathognomonic of the disease, and have potential to be used as a rapid diagnostic tool in ferrets with no gross lesions.
Collapse
|
46
|
Abstract
OBJECTIVE To determine the temporal pattern of Yersinia infections in three goat flocks and examine the influence of management and seasonal factors on the incidence of those infections over a 1-year period. METHODS A longitudinal study involving monthly culture of faeces for Yersinia spp. from age groups of randomly selected goats on three farms in the Manawatu region of New Zealand. RESULTS The incidence of excretion of potentially pathogenic Yersinia (Yersinia pseudotuberculosis and Y enterocolitica biotypes 2, 3 and 5) peaked in winter and fell in summer. In contrast, environmental Yersinia (Y enterocolitica biotype 1A, Y frederiksenii, Y intermedia and Y rohdei) showed no clear pattern of seasonal variation. Pathogenic Yersinia were more prevalent in young animals than in adults, while environmental Yersinia were more prevalent in adults. The same type was isolated from the same animal in two or more successive months in about 20 to 25% of cases, and in the remaining cases there was a gap of at least one month between successive isolations, with many animals yielding a particular type on only a single occasion. A notable difference was that with the potentially pathogenic types, no animal had more than one period of time when it was found to be excreting a particular type, suggesting that immunity develops following exposure. In contrast, it was common for environmental types to be isolated from the same animal throughout the study period. Two goats were suspected to have developed clinical yersiniosis but all remaining infected animals showed no clinical signs of infection. CONCLUSIONS Asymptomatic Yersinia carriage was common in goats in New Zealand, with a clear seasonal and age group pattern of infection with potentially pathogenic types. There was evidence that immunity developed to potentially pathogenic types. This is the first time that Y rohdei has been isolated from goats.
Collapse
|
47
|
Abstract
AIMS To investigate the epidemiology of Yersinia species in healthy goats in New Zealand, in particular to determine the prevalence of farms with infected goats, the prevalence of infected goats on those farms, the serotypes involved, and potential risk factors for carriage. METHODS A cross-sectional study of the prevalence of Yersinia infection in infected flocks in a study population of thirty commercial goat farms in the Manawatu region of New Zealand. RESULTS Infection was detected on 60% of farms in an initial study. In a prevalence study on 18 infected farms, the study population comprised 6770 animals (mean of 376, median of 175 and range of 36 to 1295 goats/farm). Of 902 goats (296 < 1 year, 178 1 to 2 years, and 428 > 2 years) sampled from the study population, 135 (73 < 1 year, 21 1 to 2 years, and 41 > 2 years) were excreting Yersinia spp., giving an overall prevalence of 14.97% (95% confidence interval [CI] 12.8 to 17.4), with individual farm prevalences ranging from 0.0% (95% CI 0.0 to 7.9) to 58.14% (95% CI 43.3 to 71.6). Goats < 1 year were more likely to be infected than 1 to 2 year and > 2 year old animals (relative risk [RR] = 2.1; 95% CI 1.3 to 3.3, and RR = 2.6; 95% CI 1.8 to 3.6, respectively), but there was no significant difference between risks for 1 to 2 year and > 2 year goats (RR = 1.2; 95% CI 0.7 to 2.0). Yersinia enterocolitica was the most common species isolated in the youngest age group, with prevalence declining with increasing age, while other species were more common in the older age groups. CONCLUSION Yersinia infections were common in goats in the study region, with younger animals apparently more susceptible to infection, and in particular to infection with Y enterocolitica. The prevalence on infected farms appeared to decrease as flock size increased, and to increase as stocking rates and the number of paddocks grazed increased.
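The headline figures can be re-derived from the counts given in the abstract; the sketch below recomputes the overall prevalence (with a Wilson 95% CI, which closely reproduces the reported interval) and the age-group relative risks.

```python
# Re-derivation of the headline figures above from the abstract's counts:
# overall prevalence with a Wilson 95% CI, plus relative risks comparing
# goats < 1 year with the two older age groups.
import math

def wilson_ci(x: int, n: int, z: float = 1.96) -> tuple[float, float]:
    p = x / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(135, 902)
print(f"prevalence {135/902:.2%} (95% CI {lo:.1%}-{hi:.1%})")  # ~12.8-17.4%

def rr(x1, n1, x0, n0):
    return (x1 / n1) / (x0 / n0)

print(f"RR <1yr vs 1-2yr: {rr(73, 296, 21, 178):.1f}")  # ~2.1
print(f"RR <1yr vs >2yr:  {rr(73, 296, 41, 428):.1f}")  # ~2.6
```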
Collapse
|
48
|
Abstract
A study was conducted to investigate the persistence of rabbit haemorrhagic disease virus (RHDV) in the environment. Virus was impregnated onto two carrier materials (cotton tape and bovine liver) and exposed to environmental conditions on pasture during autumn in New Zealand. Samples were collected after 1, 10, 44 and 91 days and the viability of the virus was determined by oral inoculation of susceptible 11- to 14-week-old New Zealand White rabbits. Evidence of RHDV infection was based on clinical and pathological signs and/or seroconversion to RHDV. Virus impregnated on cotton tape was viable at 10 days of exposure but not at 44 days, while in bovine liver it was still viable at 91 days. The results of this study suggest that RHDV in animal tissues such as rabbit carcasses can survive for at least 3 months in the field, while virus exposed directly to environmental conditions, such as dried excreted virus, is viable for a period of less than 1 month. Survival of RHDV in the tissues of dead animals could, therefore, provide a persistent reservoir of virus, which could initiate new outbreaks of disease after extended delays.
Collapse
|
49
|
Risk factors for injury to the superficial digital flexor tendon and suspensory apparatus in Thoroughbred racehorses in New Zealand. N Z Vet J 2005; 53:184-92. [PMID: 16012588 DOI: 10.1080/00480169.2005.36503] [Citation(s) in RCA: 59] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
AIM To investigate risk factors for injury to the superficial digital flexor tendon (SDFT) and suspensory apparatus (SA) of the forelimbs in Thoroughbred racehorses in New Zealand. METHODS Poisson and negative binomial regression, with exposure time represented by cumulative training days for each horse, were used to relate explanatory variables to the incidence rate (IR) of cases of inflammation of the SDFT (n=51), and injuries involving the SA (n=48), in a population of 1,571 commercially-trained racehorses over 554,745 study days. Only the first occurrence of an injury for any one horse was eligible for inclusion. Separate analyses were run for data from horses in training regardless of whether they had started in a trial or race, and using a subset of these data restricted to those preparations associated with at least one start in a trial or race. Results were reported as incidence rate ratios (IRR) and 95% confidence intervals (CI). RESULTS Male horses had a higher risk of injury to the SA (IRR 2.57; p=0.005) and tended to have a higher risk of injury to the SDFT (IRR 1.74; p=0.09) than female horses. Increasing age was associated with increased risk of injury. Horses aged 4 and ≥5 years were 6.76 (p<0.001) and 15.26 (p<0.001) times more likely to incur injury to the SDFT, and 2.91 (p=0.02) and 3.54 (p=0.005) times more likely to incur injury to the SA, respectively, than 2-year-olds. Horses were more likely to suffer an injury to the SDFT or SA in a training preparation that was not associated with any starts in official trials or races compared with those preparations that were associated with more than one start (p<0.001), and more likely to injure the SA compared with preparations containing one start (p=0.03). The IR of injury to the SDFT tended to be lower in November-January (IRR 0.78; p=0.08) and February-April (IRR 0.75; p=0.08) than in August-October. Incidence of injury to the SDFT or SA was not associated with the cumulative distance raced in the last 30 days of a training preparation. CONCLUSION This study identified risk factors for injury to the SDFT and SA in Thoroughbred racehorses in New Zealand. Injuries were more likely in males, older horses, and in horses in training preparations without any starts. There was no evidence of association between injury and cumulative high-speed exercise.
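A minimal sketch of the modelling approach: Poisson regression of injury counts with log(cumulative training days) as an exposure offset, so exponentiated coefficients are incidence rate ratios. The data are simulated with a true male-vs-female IRR near the reported 2.57; nothing here reproduces the study's dataset.

```python
# Minimal sketch of the approach above: Poisson regression of injury counts
# with log(training days) as the exposure offset, so exponentiated
# coefficients are incidence rate ratios (IRR). Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1571                                         # horses, as in the study
male = rng.binomial(1, 0.5, n)
days = rng.integers(100, 700, n).astype(float)   # training days per horse

# Simulate injuries with a true IRR of ~2.57 for males (the reported SA IRR).
rate = np.exp(np.log(2e-4) + np.log(2.57) * male)
y = rng.poisson(rate * days)

X = sm.add_constant(male.astype(float))
fit = sm.GLM(y, X, family=sm.families.Poisson(), offset=np.log(days)).fit()
print(f"IRR (male vs female): {np.exp(fit.params[1]):.2f}")
```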
Collapse
|
50
|
Effect of training location and time period on racehorse performance in New Zealand. 1. Descriptive analysis. N Z Vet J 2005; 52:236-42. [PMID: 15768118 DOI: 10.1080/00480169.2004.36434] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
AIM To describe characteristics of Thoroughbred training stables in Matamata and in all other locations in New Zealand combined, over two 19-month time periods in 1996-1997 and 1998-1999, representing equal-length periods immediately prior to and after the construction of a new training surface at the Matamata Racing Club. METHODS Retrospective records covering all horses training and racing in New Zealand during two 19-month time periods (1996-1997 and 1998-1999), covering 161 locations, were obtained from New Zealand Thoroughbred Racing (NZTR). Outcome variables included whether a horse was raced again in the 6 months following any start in the first 13 months of either time period, number of race starts for every horse, and finishing position. Summary measures with confidence intervals (CI) and unadjusted odds ratios (OR), measuring the strength of associations for various factors, were computed. RESULTS The datasets contained information on 45,446 horses, 11,336 races, 5,110 trials and a total of 110,643 race starts. Horses trained at Matamata represented 8% (3,715) of all horses in the dataset, and accounted for 11,977 race starts (10.8%). They were more likely to start in a race or trial in either time period and were 1.4 and 1.3 times as likely to finish first, second or third compared with horses trained at other locations in 1996-1997 and 1998-1999, respectively. A 6-month no-race period occurred for 9,306/12,584 (74%) horses that started at least once in the first 13 months of either time period. Horses trained at Matamata were less likely to have a 6-month no-race period than horses trained at other locations in both time periods. There was no effect of time period within each location on the probability of either a horse having a 6-month no-race period or of a race start being followed by a 6-month no-race period, but there was an overall effect of time, with more 6-month no-race periods observed in 1998-1999 relative to 1996-1997. CONCLUSION Summary statistics are presented for Thoroughbred racing in New Zealand over two 19-month time periods. Differences between the populations of horses trained in Matamata compared with those trained at other locations were attributed, in part, to the fact that many of the more successful racehorse trainers in the country have stables at Matamata. As a result, the population of horses in Matamata may not be representative of the racehorse population in New Zealand. Although horses trained at Matamata were more likely to win or place in both time periods, the magnitude of their advantage was reduced in 1998-1999 relative to 1996-1997, and this could be due, in part, to effects of the new track surface at Matamata. There was no evidence of a rise in risk of a 6-month no-race period following any race start in those horses trained in Matamata in 1998-1999 relative to either horses trained at other locations or to horses trained in Matamata during the earlier time period.
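For the unadjusted odds ratios reported here, the computation is the standard 2×2 cross-product with a Woolf confidence interval; the counts below are invented to land near the reported effect size of about 1.3.

```python
# Sketch of the unadjusted odds-ratio computation above: a 2x2 table of
# (top-three finish vs not) by (Matamata vs elsewhere), with a Woolf 95% CI.
# Counts are illustrative, not the study's tables.
import math

a, b = 3200, 8777    # Matamata starts: placed, not placed
c, d = 22000, 76666  # other locations: placed, not placed

or_ = (a * d) / (b * c)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo, hi = or_ * math.exp(-1.96 * se), or_ * math.exp(1.96 * se)
print(f"unadjusted OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```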
Collapse
|