1.
Establishing peat-forming plant communities: A comparison of wetland reclamation methods in Alberta's oil sands region. Ecol Appl 2024; 34:e2929. [PMID: 37942503] [DOI: 10.1002/eap.2929]
Abstract
The Sandhill Wetland (SW) and Nikanotee Fen (NF) are two wetland research projects designed to test the viability of peatland reclamation in the Alberta oil sands post-mining landscape. To identify effective approaches for establishing peat-forming vegetation in reclaimed wetlands, we evaluated how plant introduction approaches and water level gradients influence species distribution, plant community development, and the establishment of bryophyte and peatland species richness and cover. Plant introduction approaches included seeding with a Carex aquatilis-dominated seed mix, planting C. aquatilis and Juncus balticus seedlings, and spreading a harvested moss layer transfer. Establishment was assessed 6 years after the introduction at SW and 5 years after the introduction at NF. In total, 51 species were introduced to the reclaimed wetlands, and 122 species were observed after 5 and 6 years. The most abundant species in both reclaimed wetlands was C. aquatilis, which produced dense canopies and occupied the largest water level range of observed plants. Introducing C. aquatilis also helped to exclude marsh plants such as Typha latifolia, which have little to no peat-accumulation potential. Juncus balticus persisted where the water table was lower, encouraged the formation of a diverse peatland community, and facilitated bryophyte establishment. Various bryophytes colonized suitable areas, but the moss layer transfer increased the cover of desirable peat-forming mosses. Communities with the highest bryophyte and peatland species richness and cover (averaging 9 and 14 species and 50%-160% cover, respectively) occurred where the summer water level was between -10 and -40 cm. Outside this water level range, a marsh community of Typha latifolia dominated in standing water, and a wet meadow upland community of Calamagrostis canadensis and woody species established where the water table was deeper.
Overall, the two wetland reclamation projects demonstrated that establishing peat-forming vascular plants and bryophytes is possible, and community formation is dependent upon water level and plant introduction approaches. Future projects should aim to create microtopography with water tables within 40 cm of the surface and introduce vascular plants such as J. balticus that facilitate bryophyte establishment and support the development of a diverse peatland plant community.
2.
Characteristics of Premature Radiotherapy Terminations in Patients with Oral Cavity and Laryngeal Carcinomas. Int J Radiat Oncol Biol Phys 2023; 117:e573-e574. [PMID: 37785748] [DOI: 10.1016/j.ijrobp.2023.06.1907]
Abstract
PURPOSE/OBJECTIVE(S) Premature radiation therapy (RT) terminations in patients with head and neck cancer result in poor outcomes. However, the underlying factors that contribute to early RT termination are understudied, especially in the era of hypofractionated treatment. In this retrospective single-institution study, we examined causes and clinical characteristics of premature terminations in oral cavity (OC) and laryngeal carcinomas. MATERIALS/METHODS We reviewed charts of 188 patients treated with RT ± systemic therapy for OC and laryngeal cancer from 2017-2022. Patients were typically prescribed standard 1.8-2.0 Gy fractionation regimens, though patients deemed unlikely to complete conventional RT upon initial evaluation were given SBRT. Premature termination was defined as completion of less than 95% of the prescribed RT. We collected pertinent demographic and clinicopathological data on this termination cohort, which was compared to a matched cohort of patients who completed RT. We used logistic regression analysis to examine factors predictive of premature termination. RESULTS Of the patients included in this analysis, 72.7% were prescribed adjuvant RT [9.1% OC, 45.5% larynx] vs. 27.3% primary RT [90.9% OC, 45.5% larynx]. 84.6% received conventional IMRT, while 15.4% received SBRT. 17 patients (9.0%) had premature RT terminations (all IMRT): 9 OC and 8 laryngeal primaries. Mean age of those who had premature termination was 79.5 years (range: 70-98). 70.6% were male, 58.8% were white, and 23.5% were single/widowed. The majority received concurrent systemic therapy (58.8%), had AJCC (8th Ed.) Stage ≥ III (76.5%), Charlson Comorbidity Index ≥ 6 (64.7%), ECOG score ≥ 2 (70.6%), smoked > 10 pack-years (76.5%), and lived > 10 miles from the RT facility (58.8%). The most common documented reasons for premature termination were subjective intolerance (29.4%), death (23.5%), objective RT toxicity (23.5%), and inpatient admission (17.6%).
The mean time on treatment for IMRT was 27.8 days in the termination cohort vs. 47.7 days in the completion cohort. The percentage of patients reporting RT toxicity (e.g., CTCAE v5.0 mucositis, severe weight loss, oral infection) was 88.2% in the termination cohort vs. 29.6% in the completion cohort. On regression analysis, ECOG score at the time of RT initiation was independently associated with premature termination (OR: 2.438, 95% CI: 1.155-5.146, p = .019). CONCLUSION This retrospective analysis of patients undergoing RT for OC and laryngeal cancers at our tertiary care center demonstrated that nearly 1 in 10 patients is at risk of premature termination. Poor performance status was independently associated with premature termination. There was a 100% completion rate in hypofractionated treatment with SBRT. Taken together, poor performance status may identify patients at risk of premature termination and thus good candidates for SBRT protocols.
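The odds ratio reported above comes from logistic regression; for a single binary predictor, the same quantity can be computed directly from a 2x2 table with a Wald confidence interval. A minimal sketch (the counts below are hypothetical, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal cell counts
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: 10/30 terminations among poor-performance-status
# patients vs 5/35 among the rest.
print(odds_ratio_ci(10, 20, 5, 30))
```

A CI that excludes 1 (unlike this toy example's lower bound of ~0.89) would indicate a statistically significant association, as with the ECOG result above.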
3.
Impact of Dosimetric Parameters on Local Control and Toxicity of Head and Neck SBRT. Int J Radiat Oncol Biol Phys 2023; 117:e572-e573. [PMID: 37785745] [DOI: 10.1016/j.ijrobp.2023.06.1904]
Abstract
PURPOSE/OBJECTIVE(S) SBRT is potentially useful as a local therapy for head and neck cancer patients who require re-irradiation, as well as those who may not be candidates for surgical resection or a lengthy course of conventionally fractionated radiation therapy. The objective of this study was to assess rates of local control with SBRT, as well as the impact of mucosal dosimetric parameters on rates of grade 3 or higher toxicity in patients treated with head and neck SBRT. MATERIALS/METHODS We retrospectively reviewed patients within our institution who underwent SBRT for cancers of the oral cavity and oropharynx between 2013 and 2022. Primary endpoints were local control (LC) and grade 3 or higher toxicity potentially attributable to SBRT, based on CTCAE 5.0. We analyzed the following as potential predictors of toxicity: 1) ratio of total oral mucosal volume/oral mucosal volume outside of PTV, 2) mean dose to oral mucosa, and 3) maximum dose to oral mucosa. We conducted regression analysis to determine predictors of local failure and severe toxicity. We estimated local control using Kaplan-Meier analysis. RESULTS We treated 66 tumors in 60 patients with a median age of 71 years. 41 patients (68.3%) had oral cavity cancer and 19 (31.6%) had oropharynx cancer. 64 tumors (97.0%) were squamous cell carcinomas. 32 tumors (48.5%) were previously irradiated. Mean PTV volume was 55 cc (range: 39.4-74.1 cc). Median prescribed radiation dose was 40 Gy given in 5 biweekly fractions. A total of 51 patients received systemic therapy (platinum-based chemotherapy in 52%, cetuximab in 38%). 10 patients (16.7%) additionally received immunotherapy, constituting 56% of the 18 patients treated from 2019-2022. Median pain score at presentation was 3/10. Oral pain non-significantly increased between 3 weeks and 3 months after starting treatment and subsequently returned to baseline after SBRT (P = 0.227). Local control was 68.5% at a median follow-up of 9.8 months.
In oral cavity tumors, 1- and 2-year local control rates were 78.7% and 43.1%, respectively. 1- and 2-year rates of LC were 78.9% and 50.2% in oropharynx tumors. Grade 3 or higher toxicities were present in 18 patients (30.0%), including osteonecrosis in 6 (10.0%) and ulceration or extensive tissue necrosis in 12 (20.0%). No significant relationship was present between mucosal surface radiation doses and acute oral mucosa toxicity. On regression analysis for both local control and grade 3 or higher toxicity, we did not find any significant association with prior radiation, disease site, age, or PTV volume. CONCLUSION SBRT provided comparable local control for tumors of the oropharynx and oral cavity, with slightly higher 2-year local control in tumors of the oropharynx, and comparable rates of toxicity. We observed increased use of immunotherapy in our study population from 2019 onwards. We did not find any relationships between dosimetric parameters and rates of grade 3 or higher toxicity, or local control, though our analysis is limited by a small sample size.
4.
The Association of Chemoradiation Induced Lymphopenia with Racial Disparity and Its Prognostic Impact on Survival for Anal Cancer. Int J Radiat Oncol Biol Phys 2023; 117:e299-e300. [PMID: 37785093] [DOI: 10.1016/j.ijrobp.2023.06.2313]
Abstract
PURPOSE/OBJECTIVE(S) While the association between chemoradiation-induced lymphopenia (CIL) and poor overall survival (OS) is established in multiple solid malignancies, it has not been studied in anal cancer. Racial and socioeconomic disparities as potential predictors of lymphopenia have not been reported. We hypothesize that race and socioeconomic status are associated with increased incidence of severe CIL, which can predict worse overall survival for patients with anal cancer. MATERIALS/METHODS A cohort of 75 patients treated with definitive chemoradiation (CRT) for squamous cell anal cancer from January 2014 to December 2020 was reviewed. Total lymphocyte counts (TLC) at baseline and TLC nadir at 1 month post-CRT were analyzed. Logistic regression was used to identify associations between race, gender, ethnicity, median household income by zip code, marital status, baseline hematopoietic cell counts, and post-CRT Grade 3+ lymphopenia (TLC < 0.5k/μL). The Kaplan-Meier method and Cox regression models were used for survival analysis. RESULTS Of the 75 patients identified, mean age was 66.9 years and median follow-up time was 37.1 months. There were 63 females, 53 non-Hispanic whites, and 22 minorities (12 Black, 9 Hispanic, 1 Asian). Radiation dose ranged from 41.4 Gy to 56 Gy. At 1 month post-CRT, 85.3% developed lymphopenia (G1 9.3%, G2 26.7%, G3 37.3%, G4 12.0%). On multivariate logistic regression, non-white race showed a trend toward more Grade 3+ lymphopenia (OR = 3.5, p = 0.07). On univariate Cox regression, poorer overall survival was associated with race (HR 3.7, p = 0.04), baseline white blood cell count (HR 1.3, p = 0.04), baseline hemoglobin (HR 0.6, p = 0.04), and post-CRT Grade 3+ lymphopenia (HR 5.8, p = 0.03). On multivariate Cox regression, only post-CRT Grade 3+ lymphopenia was associated with worse OS (HR 7.5, p = 0.049).
5-year OS significantly differed between patients with and without post-CRT Grade 3+ lymphopenia (62.3% vs 94.7%, P = 0.01). CONCLUSION Lymphopenia is commonly observed after chemoradiation for anal cancer. Racial disparity is associated with severe lymphopenia induced by chemoradiation, which is a robust predictor of poor survival in anal cancer. More attention to lymphopenia induced by chemoradiation for anal cancer is needed, particularly in racial minorities.
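The survival comparison above rests on Kaplan-Meier estimates; the product-limit estimator itself is simple to compute. A minimal sketch with made-up follow-up times (not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates at each distinct event time.
    times: follow-up durations; events: 1 = death observed, 0 = censored."""
    data = sorted(zip(times, events))
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        n_at_risk = sum(1 for tt, e in data if tt >= t)
        if deaths:
            # Multiply in the conditional survival for this event time
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        # Advance past all subjects with this follow-up time
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve

# Five toy subjects: deaths at t=1, 2, 3; censored at t=2 and t=4
print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
```

The log-rank test or a Cox model (as used above) would then compare such curves between the Grade 3+ and lower-grade lymphopenia groups.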
5.
The roles and impacts of worldviews in the context of meditation-related challenges. Transcult Psychiatry 2023; 60:637-650. [PMID: 36476189] [DOI: 10.1177/13634615221128679]
Abstract
Previous research has shown that worldviews can serve as a coping response to periods of difficulty or struggle, and worldviews can also change on account of difficulty. This paper investigates the impacts worldviews have on the nature and trajectory of meditation-related challenges, as well as how worldviews change or are impacted by such challenges. The context of meditation-related challenges provided by data from the Varieties of Contemplative Experience research project offers a unique insight into the dynamics between worldviews and meditation. Buddhist meditation practitioners and meditation experts interviewed for the study report how, for some, worldviews can serve as a risk factor impacting the onset and trajectory of meditation-related challenges, while, for others, worldviews (e.g., being given a worldview, applying a worldview, or changing a worldview) were reported as a remedy for mitigating challenging experiences and/or their associated distress. Buddhist meditation practitioners and teachers in the contemporary West are also situated in a cultural context in which religious and scientific worldviews and explanatory frameworks are dually available. Furthermore, the context of "Buddhist modernism" has also promoted a unique configuration in which the theory and practice of Buddhism is presented as being closely compatible with science. We identify and discuss the various impacts that religious and scientific worldviews have on meditation practitioners and meditation teachers who navigate periods of challenge associated with the practice.
6.
The effect of recombinant erythropoietin on long-term outcome after moderate-to-severe traumatic brain injury. Intensive Care Med 2023; 49:831-839. [PMID: 37405413] [PMCID: PMC10353955] [DOI: 10.1007/s00134-023-07141-5]
Abstract
PURPOSE Recombinant erythropoietin (EPO) administered for traumatic brain injury (TBI) may increase short-term survival, but the long-term effect is unknown. METHODS We conducted a pre-planned long-term follow-up of patients in the multicentre erythropoietin in TBI trial (2010-2015). We invited survivors to follow-up and evaluated survival and functional outcome with the Glasgow Outcome Scale-Extended (GOSE) (categories 5-8 = good outcome), and secondly, with good outcome determined relative to baseline function (sliding scale). We used survival analysis to assess time to death and absolute risk differences (ARD) to assess favorable outcomes. We categorized TBI severity with the International Mission for Prognosis and Analysis of Clinical Trials in TBI model. Heterogeneity of treatment effects was assessed with interaction p-values for the following a priori defined subgroups: severity of TBI, presence of an intracranial mass lesion, and multi-trauma in addition to TBI. RESULTS Of 603 patients in the original trial, 487 patients had survival data; 356 were included in the follow-up at a median of 6 years from injury. There was no difference between treatment groups for patient survival [EPO vs placebo hazard ratio (HR) 0.73, 95% confidence interval (CI) 0.47-1.14, p = 0.17]. Good outcome rates were 110/175 (63%) in the EPO group vs 100/181 (55%) in the placebo group (ARD 8%, 95% CI -3% to 18%, p = 0.14). When good outcome was determined relative to baseline risk, the EPO group had better GOSE (sliding scale ARD 12%, 95% CI 2-22%, p = 0.02). When considering long-term patient survival, there was no evidence for heterogeneity of treatment effect (HTE) according to severity of TBI (p = 0.85), presence of an intracranial mass lesion (p = 0.48), or whether the patient had multi-trauma in addition to TBI (p = 0.08). Similarly, no evidence of treatment heterogeneity was seen for the effect of EPO on functional outcome.
CONCLUSION EPO neither decreased overall long-term mortality nor improved functional outcome in moderate or severe TBI patients treated in the intensive care unit (ICU). The limited sample size makes it difficult to draw firm conclusions about the use of EPO in TBI.
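The reported ARD and its interval can be reproduced (to rounding) from the good-outcome counts given above with a standard Wald interval for a difference in proportions; that the trial used exactly this method is an assumption. A sketch:

```python
import math

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Absolute risk difference x1/n1 - x2/n2 with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    d = p1 - p2
    # Pooled standard error of the difference in two independent proportions
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, d - z * se, d + z * se

# Good-outcome counts from the abstract: 110/175 EPO vs 100/181 placebo.
d, lo, hi = risk_difference_ci(110, 175, 100, 181)
print(f"ARD {d:.0%}, 95% CI {lo:.0%} to {hi:.0%}")  # → ARD 8%, 95% CI -3% to 18%
```

A CI that spans zero, as here, is consistent with the non-significant p = 0.14.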
7.
The High-Elevation Peatlands of the Northern Andes, Colombia. Plants (Basel) 2023; 12:955. [PMID: 36840306] [PMCID: PMC9967791] [DOI: 10.3390/plants12040955]
Abstract
Andean peatlands are important carbon reservoirs for countries in the northern Andes and have a unique diversity. Peatland plant diversity is generally related to hydrology and water chemistry, and the response of the vegetation in tropical high-elevation peatlands to changes in elevation, climate, and disturbance is poorly understood. Here, we address the questions of what the main types of peat-forming vegetation in the northern Andes are, and how the different vegetation types are related to water chemistry and pH. We measured plant diversity in 121 peatlands. We identified a total of 264 species, including 124 bryophytes and 140 vascular plants. We differentiated five main vegetation types: cushion plants, Sphagnum, true mosses, sedges, and grasses. Cushion-dominated peatlands are restricted to elevations above 4000 m. Variation in peatland vegetation is mostly driven by elevation and water chemistry. Encroachment of sedges and Sphagnum sancto-josephense in disturbed sites was associated with a reduction in soil carbon. We conclude that peatland variation is driven first by elevation and climate, followed by water chemistry and human disturbance. Sites with greater human disturbance had lower carbon content. Peat-forming vegetation in the northern Andes was unique to each site, posing challenges for conserving these peatlands and the ecosystem services they offer.
8.
Distributions of baseline categorical variables were different from the expected distributions in randomized trials with integrity concerns. J Clin Epidemiol 2023; 154:117-124. [PMID: 36584733] [DOI: 10.1016/j.jclinepi.2022.12.018]
Abstract
BACKGROUND AND OBJECTIVES Comparing observed and expected distributions of baseline continuous variables in randomized controlled trials (RCTs) can be used to assess publication integrity. We explored whether baseline categorical variables could also be used. METHODS The observed and expected (binomial) distributions of all baseline categorical variables were compared in four sets of RCTs: two controls, and two with publication integrity concerns. We also compared calculated and reported baseline P-values. RESULTS The observed and expected distributions of baseline categorical variables were similar in the control datasets, both for frequency counts (and percentages) and for between-group differences in frequency counts. However, in both sets of RCTs with publication integrity concerns, about twice as many variables as expected had between-group differences in frequency counts of 1 or 2, and far fewer variables than expected had between-group differences of >4 (P < 0.001 for both datasets). Furthermore, about one in six reported P-values for baseline categorical variables differed by >0.1 from the calculated P-value in trials with publication integrity concerns. CONCLUSION Comparing the observed and expected distributions, and the reported and calculated P-values, of baseline categorical variables may help in the assessment of publication integrity of a body of RCTs.
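The expected distribution used above follows from randomization: if k subjects carry a baseline characteristic and are allocated across two arms, the count in one arm is approximately Binomial(k, 0.5), so the between-group difference is |2·n1 − k|. A sketch of the expected difference distribution (simple 1:1 allocation is assumed here; the paper's exact procedure may differ):

```python
from math import comb

def expected_diff_distribution(k, p=0.5):
    """P(|between-group difference in counts| = d) when k subjects with a
    characteristic are split between two arms, n1 ~ Binomial(k, p)."""
    dist = {}
    for n1 in range(k + 1):
        prob = comb(k, n1) * p**n1 * (1 - p)**(k - n1)
        d = abs(2 * n1 - k)
        dist[d] = dist.get(d, 0.0) + prob
    return dist

# For k = 10 the expected mass sits on small differences:
# P(diff = 0) ≈ 0.246, P(diff <= 2) ≈ 0.656. An excess of 1-2-count
# differences across many variables is only "too many" relative to
# this baseline, which is the comparison the study performs.
print(expected_diff_distribution(10))
```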
9.
CoreView: fresh tissue biopsy assessment at the bedside using a millifluidic imaging chip. Lab Chip 2022; 22:1354-1364. [PMID: 35212692] [PMCID: PMC8967779] [DOI: 10.1039/d1lc01142a]
Abstract
Minimally invasive core needle biopsies for medical diagnoses have become increasingly common for many diseases. Although tissue cores can yield more diagnostic information than fine needle biopsies and cytologic evaluations, there is no rapid assessment at the point-of-care for intact tissue cores that is low-cost and non-destructive to the biopsy. We have developed a proof-of-concept 3D printed millifluidic histopathology lab-on-a-chip device to automatically handle, process, and image fresh core needle biopsies. This device, named CoreView, includes modules for biopsy removal from the acquisition tool, transport, staining and rinsing, imaging, segmentation, and multiplexed storage. Reliable removal from side-cutting needles and bidirectional fluid transport of core needle biopsies of five tissue types has been demonstrated with 0.5 mm positioning accuracy. Automation is aided by a MATLAB-based biopsy tracking algorithm that can detect the location of tissue and air bubbles in the channels of the millifluidic chip. With current and emerging optical imaging technologies, CoreView can be used for a rapid adequacy test at the point-of-care for tissue identification as well as glomeruli counting in renal core needle biopsies.
10.
Predictors of death and new disability after critical illness: a multicentre prospective cohort study. Intensive Care Med 2021; 47:772-781. [PMID: 34089063] [DOI: 10.1007/s00134-021-06438-7]
Abstract
PURPOSE This study aimed to determine the prevalence and predictors of death or new disability following critical illness. METHODS Prospective, multicentre cohort study conducted in six metropolitan intensive care units (ICU). Participants were adults admitted to the ICU who received more than 24 h of mechanical ventilation. The primary outcome was death or new disability at 6 months, with new disability defined by a 10% increase in the WHODAS 2.0. RESULTS Of the 628 patients with the primary outcome available (median age 62 [49-71] years; 379 [61.0%] medical admissions), 370 (58.9%) died or developed new disability by 6 months. Independent predictors of death or new disability included age [OR 1.02 (1.01-1.03), P = 0.001], higher severity of illness (APACHE III) [OR 1.02 (1.01-1.03), P < 0.001], and admission diagnosis. Compared to patients with a surgical admission diagnosis, patients with a cardiac arrest [OR (95% CI) 4.06 (1.89-8.68), P < 0.001], sepsis [OR (95% CI) 2.43 (1.32-4.47), P = 0.004], or trauma [OR (95% CI) 6.24 (3.07-12.71), P < 0.001] diagnosis had higher odds of death or new disability, while patients with a lung transplant [OR (95% CI) 0.21 (0.07-0.58), P = 0.003] diagnosis had lower odds. A model including these three variables had good calibration (Brier score 0.20) and acceptable discriminative power, with an area under the receiver operating characteristic curve of 0.76 (95% CI 0.72-0.80). CONCLUSION Less than half of all patients mechanically ventilated for more than 24 h were alive and free of new disability at 6 months after admission to the ICU. A model including age, illness severity, and admission diagnosis has acceptable discriminative ability to predict death or new disability at 6 months.
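Model calibration and discrimination above are summarized with the Brier score and the area under the ROC curve; both have simple direct forms. A sketch on toy predictions (not the study's model or data):

```python
def brier_score(probs, outcomes):
    """Mean squared difference between predicted probability and observed
    0/1 outcome; lower is better-calibrated (0.25 = uninformative p=0.5)."""
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

def auc(probs, outcomes):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the probability a random positive case outranks a random negative one,
    counting ties as half."""
    pos = [p for p, y in zip(probs, outcomes) if y == 1]
    neg = [p for p, y in zip(probs, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

probs = [0.9, 0.8, 0.3, 0.2]   # toy predicted risks
outcomes = [1, 1, 0, 0]        # toy observed outcomes
print(brier_score(probs, outcomes), auc(probs, outcomes))
```

With perfectly ranked toy predictions the AUC is 1.0; the study's 0.76 indicates moderate, clinically usable discrimination.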
11.
Abstract
BACKGROUND Research on the adverse effects of mindfulness-based programs (MBPs) has been sparse and hindered by methodological imprecision. METHODS The 44-item Meditation Experiences Interview (MedEx-I) was used by an independent assessor to measure meditation-related side effects (MRSE) following three variants of an 8-week program of mindfulness-based cognitive therapy (n = 96). Each item was queried for occurrence, causal link to mindfulness meditation practice, duration, valence, and impact on functioning. RESULTS Eighty-three percent of the MBP sample reported at least one MRSE. Meditation-related adverse effects (MRAEs) with negative valences or negative impacts on functioning occurred in 58% and 37% of the sample, respectively. Lasting bad effects occurred in 6-14% of the sample and were associated with signs of dysregulated arousal (hyperarousal and dissociation). CONCLUSION Meditation practice in MBPs is associated with transient distress and negative impacts at similar rates to other psychological treatments.
12.
Descriptive record of the activity of military critical care transfer teams deployed to London in April 2020 to undertake transfer of patients with COVID-19. BMJ Mil Health 2020; 169:e74-e77. [PMID: 33372109] [DOI: 10.1136/bmjmilitary-2020-001619]
Abstract
In the face of the COVID-19 outbreak, military healthcare teams were deployed to London to assist the London Ambulance Service in transferring ventilated patients between medical facilities. This paper describes the preparation and activity of these military teams, records the lessons identified (LI) and reviews the complications encountered. Each team had two members: a consultant or registrar in emergency medicine (EM), pre-hospital emergency medicine (PHEM) or anaesthesia, and an emergency nurse or paramedic. Following a period of training, the teams undertook 52 transfers over a 14-day period. LI centred around minimising both interruption to ventilation and the risk of aerosolisation of infectious particles, and thus the risk of transmission of COVID-19 to the treating clinicians. Three patient-related complications (6% of all transfers) were identified. This was the first occasion on which the Defence Medical Services (DMS) were the main focus of large-scale clinical military aid to the civil authorities. It demonstrated that DMS personnel have the flexibility to deliver a novel effect and the ability to seamlessly and rapidly integrate with a civilian organisation. It highlighted some clinical lessons that may be useful for future prehospital emergency care taskings where patients may have a transmissible respiratory pathogen. It also showed that clinicians from different backgrounds are able to safely undertake secondary transfer of ventilated patients. This approach may enhance flexibility in future operational patient care pathways.
13.
Erratum to "Six-month mortality and functional outcomes in aneurysmal sub-arachnoid haemorrhage patients admitted to intensive care units in Australia and New Zealand: A prospective cohort study" [J Clin Neurosci 80 (2020) 92-99]. J Clin Neurosci 2020; 82:192. [DOI: 10.1016/j.jocn.2020.11.018]
14.
Can long-lived species keep pace with climate change? Evidence of local persistence potential in a widespread conifer. Divers Distrib 2020. [DOI: 10.1111/ddi.13191]
15.
Short- and long-term responses of riparian cottonwoods (Populus spp.) to flow diversion: Analysis of tree-ring radial growth and stable carbon isotopes. Sci Total Environ 2020; 735:139523. [PMID: 32502819] [DOI: 10.1016/j.scitotenv.2020.139523]
Abstract
Long-duration tree-ring records with annual precision allow for the reconstruction of past growing conditions. Investigations limited to the most common tree-ring proxy of ring width can be difficult to interpret, however, because radial growth is affected by multiple environmental processes. Furthermore, studies of living trees may miss important effects of drought on tree survival and forest changes. Stable carbon isotopes can help distinguish drought from other environmental factors that influence tree-ring width and forest stand condition. We quantified tree-ring radial expansion and stable carbon isotope ratios (δ13C) in riparian cottonwoods (Populus angustifolia and P. angustifolia × P. trichocarpa) along Snake Creek in Nevada, USA. We investigated how hydrological drought affected tree growth and death at annual to half-century scales in a partially dewatered reach (DW) compared to reference reaches immediately upstream and downstream. A gradual decline in tree-ring basal area increment (BAI) began at DW concurrent with streamflow diversion in 1961. BAI at DW diverged from one reference reach immediately but not from the other until nearly 50 years later. In contrast, tree-ring δ13C had a rapid and sustained increase following diversion at DW only, providing the stronger and clearer drought signal. BAI and δ13C were not significantly correlated prior to diversion; after diversion they both reflected drought and were correlated for DW trees only. Cluster analyses distinguished all trees in DW from those in reference reaches based on δ13C, but BAI patterns left trees intermixed across reaches. Branch and tree mortality were also highest and canopy vigor was lowest in DW. Results indicate that water scarcity strongly limited cottonwood photosynthesis following flow diversion, thus reducing carbon assimilation, basal growth and survival.
The dieback was not sudden, but occurred over decades as carbon deficits mounted and depleted streamflow left trees increasingly vulnerable to local meteorological drought.
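Basal area increment is a standard transformation of raw ring widths: BAI_t = π(r_t² − r_{t−1}²), where r_t is the cumulative radius through year t. A minimal sketch (the study's exact computation, e.g. corrections for bark thickness or off-center pith, is not given in the abstract):

```python
import math

def basal_area_increment(ring_widths_mm):
    """Annual basal area increment (mm^2) from a sequence of ring widths,
    assuming a circular stem cross-section measured pith to bark."""
    bai = []
    r_prev = 0.0
    for w in ring_widths_mm:
        r = r_prev + w  # cumulative radius through this year's ring
        bai.append(math.pi * (r * r - r_prev * r_prev))
        r_prev = r
    return bai

# Two equal 1 mm rings: the second ring adds three times the area of the
# first, because it sits farther from the pith.
print(basal_area_increment([1.0, 1.0]))
```

This geometry is why BAI, unlike raw ring width, can reveal a growth decline even when ring widths shrink only gradually.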
16.
Abstract LB-285: Rapid needle biopsy assessment at point of care to advance personalized cancer therapy. Cancer Res 2020. [DOI: 10.1158/1538-7445.am2020-lb-285]
Abstract
Rationale: Although core needle biopsies (CNBs) are the preferred minimally invasive procedure for breast cancer diagnostics, the standard of care H&E histopathologic analysis of core biopsies is both tissue destructive and labor intensive. Our novel millifluidic instrument, CoreView, is designed to evaluate fresh CNBs to make a real time cancer diagnosis or adequacy determination, conserve tissue for downstream diagnostics and assist in timely treatment planning. Technology and Methods: CoreView is designed to handle, stain, and transport CNBs with the use of millifluidics. The first stage of the CoreView removes biopsy procured tissue from side-cut and end-cut needle-type biopsy guns (14-16 gauge). The reliability of the novel low pressure fluidic process in removing intact CNBs was tested using freshly-excised (ex-vivo) porcine breast (N = 28), kidney (N = 21), liver (N = 42), lung (N = 25), and lymph node (N = 30) tissue. The shear rate, provided by the driving pressure of saline solution, was increased in increments until the CNB was removed and the resulting intactness was qualitatively analyzed by preserved overall shape and size. For optical imaging, a second stage streams fluorescence dye over the CNB for H&E equivalent staining. High resolution imaging with 285 nm excitation is currently performed using microscopy with UV-surface excitation [MUSE, Levenson Group, UC Davis] for anatomical imaging. Previous studies have shown the optical sectioning thickness of MUSE is about 3x thicker than microtome-sectioned specimens. To improve the quality of images produced by MUSE within CoreView, we are limiting UV tissue penetration by steric hindrance of dye molecules using conjugated nanoparticles, such as 525-705 nm CdSe Qdots [ThermoFisher], as well as optical blocking dyes as counterstains. Results: The CNB removal stage was 3D-printed in a patent pending disposable design which is optically clear for monitoring the process. 
CNBs were fully released and structurally intact for analysis in 93% of breast, 100% of kidney, 100% of liver, 84% of lung, and 90% of lymph node CNB samples. The average volume per time (µL/ms), an analog to shear rate, was 2.19 (σ = 1.04), 1.97 (σ = 0.71), 1.74 (σ = 0.87), 2.05 (σ = 0.87), and 2.01 (σ = 0.87) for breast, kidney, liver, lung, and lymph node tissue, respectively. The CNB tissue was removed in under 15 seconds, some as quickly as 2 seconds. Furthermore, the device's round channel, which holds a CNB with a diameter 10-20% less than that of the channel, results in inherent structural preservation. Conclusions: The novel millifluidic biopsy removal device is an essential first stage of CoreView's automated biopsy handling and future MUSE imaging and adequacy analysis. The customizable CoreView system design allows optimization for specific CNB needles, tissues, and imaging procedures. We predict an adequacy check with nearly 100% reliability in CNB release and integrity within a few minutes at the point of care.
Citation Format: David J. Cooper, Mark E. Fauver, Suzanne M. Dintzis, Eric J. Seibel. Rapid needle biopsy assessment at point of care to advance personalized cancer therapy [abstract]. In: Proceedings of the Annual Meeting of the American Association for Cancer Research 2020; 2020 Apr 27-28 and Jun 22-24. Philadelphia (PA): AACR; Cancer Res 2020;80(16 Suppl):Abstract nr LB-285.
|
17
|
Progress or Pathology? Differential Diagnosis and Intervention Criteria for Meditation-Related Challenges: Perspectives From Buddhist Meditation Teachers and Practitioners. Front Psychol 2020; 11:1905. [PMID: 32849115 PMCID: PMC7403193 DOI: 10.3389/fpsyg.2020.01905] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/08/2020] [Accepted: 07/10/2020] [Indexed: 11/25/2022] Open
Abstract
Studies in the psychology and phenomenology of religious experience have long acknowledged similarities with various forms of psychopathology. Consequently, it has been important for religious practitioners and mental health professionals to establish criteria by which religious, spiritual, or mystical experiences can be differentiated from psychopathological experiences. Many previous attempts at differential diagnosis have been based on limited textual accounts of mystical experience or on outdated theoretical studies of mysticism. In contrast, this study presents qualitative data from contemporary Buddhist meditation practitioners and teachers to identify salient features that can be used to guide differential diagnosis. The use of certain existing criteria is complicated by Buddhist worldviews in which some difficult or distressing experiences may be expected as part of progress on the contemplative path. This paper argues that it is important to expand the framework for assessment in both scholarly and clinical contexts to include not only criteria for determining normative fit with religious experience or with psychopathology, but also criteria for determining the need for intervention, whether religious or clinical. Qualitative data from Buddhist communities show that there is a wider range of experiences evaluated as potentially warranting intervention than has previously been discussed. Decision-making around these experiences often takes contextual factors into account when determining appraisals or the need for intervention. This is in line with person-centered approaches in mental health care that emphasize the importance of considering the interpersonal and cultural dynamics that inevitably constitute the context in which experiences are evaluated and rendered meaningful.
|
18
|
The perceived barriers and facilitators to implementation of ECMO services in acute hospitals. Intensive Care Med 2020; 46:2115-2117. [DOI: 10.1007/s00134-020-06187-z] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 07/16/2020] [Indexed: 10/23/2022]
|
19
|
A survey of extracorporeal membrane oxygenation practice in 23 Australian adult intensive care units. CRIT CARE RESUSC 2020; 22:166-170. [PMID: 32389109 PMCID: PMC10692478] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
|
20
|
A survey of extracorporeal membrane oxygenation practice in 23 Australian adult intensive care units. CRIT CARE RESUSC 2020. [DOI: 10.51893/2020.2.sur7] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
In Australia, extracorporeal membrane oxygenation (ECMO) is one of the most expensive diagnosis-related groups, costing $305,463 per complex admission to the intensive care unit (ICU). Mortality in this group of patients is high: about 43% for respiratory failure and 68% for cardiac failure. ECMO is associated with significant risk to the patient and requires specialist training and expertise. Variation in clinical practice for patients supported with ECMO may compromise patient care and outcomes.
|
21
|
EPO treatment does not alter acute serum profiles of GFAP and S100B after TBI: A brief report on the Australian EPO-TBI clinical trial. J Clin Neurosci 2020; 76:5-8. [PMID: 32331937 DOI: 10.1016/j.jocn.2020.04.081] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/22/2020] [Accepted: 04/13/2020] [Indexed: 10/24/2022]
Abstract
PURPOSE To determine the diagnostic and prognostic value of glial fibrillary acidic protein (GFAP) and S100B after traumatic brain injury (TBI) in an erythropoietin (EPO) clinical trial and to examine whether EPO therapy reduces biomarker concentrations. MATERIALS AND METHODS Forty-four patients with moderate-to-severe TBI were enrolled in a sub-study of the EPO-TBI trial. Patients were randomized to either Epoetin alfa 40,000 IU or 1 mL of 0.9% sodium chloride as a subcutaneous injection within 24 h of TBI. GFAP and S100B were measured in serum by ELISA from D0 (within 24 h of injury, prior to EPO/vehicle administration) to D5. Biomarker concentrations were compared between injury severities, diffuse vs. focal TBI, 6-month outcome scores (GOS-E), and EPO or placebo treatments. RESULTS At D0, GFAP was significantly higher than S100B (951 pg/mL vs. 476 pg/mL, p = 0.018). ROC analysis of S100B at day 1 post-injury distinguished favorable vs. unfavorable outcomes (area under the curve = 0.73; p = 0.01). EPO did not reduce the concentration of either biomarker. CONCLUSIONS Elevated serum concentrations of GFAP and S100B after TBI reflect a robust, acute glial response to injury. Consistent with the lack of improved outcome in TBI patients treated with EPO and with prior findings on neuronal and axonal markers, glial biomarker concentrations and acute profiles were not affected by EPO.
|
22
|
Corrigendum to "Effect of age of red cells for transfusion on patient outcomes: a systematic review and meta-analysis" [Transfus Med Rev 32/2 (2018) 77-88]. Transfus Med Rev 2020; 34:138-139. [PMID: 32295729 DOI: 10.1016/j.tmrv.2020.03.002] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
|
23
|
Correction to: Serum sodium and intracranial pressure changes after desmopressin therapy in severe traumatic brain injury patients: a multi-centre cohort study. Ann Intensive Care 2019; 9:136. [PMID: 31802308 PMCID: PMC6892991 DOI: 10.1186/s13613-019-0610-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022] Open
|
24
|
Serum sodium and intracranial pressure changes after desmopressin therapy in severe traumatic brain injury patients: a multi-centre cohort study. Ann Intensive Care 2019; 9:99. [PMID: 31486921 PMCID: PMC6728106 DOI: 10.1186/s13613-019-0574-z] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/27/2019] [Accepted: 08/26/2019] [Indexed: 12/24/2022] Open
Abstract
Background In traumatic brain injury (TBI) patients, desmopressin administration may induce rapid decreases in serum sodium and increase intracranial pressure (ICP). Aim In an international multi-centre study, we aimed to report changes in serum sodium and ICP after desmopressin administration in TBI patients. Methods We obtained data from 14 neurotrauma ICUs in Europe, Australia and the UK for severe TBI patients (GCS ≤ 8) requiring ICP monitoring. We identified patients who received any desmopressin and recorded daily dose, 6-hourly serum sodium, and 6-hourly ICP. Results We studied 262 severe TBI patients. Of these, 39 patients (14.9%) received desmopressin. Median length of treatment with desmopressin was 1 [1–3] day, and the daily intravenous dose varied between centres from 0.125 to 10 mcg. The median hourly rate of decrease in serum sodium was low (− 0.1 [− 0.2 to 0.0] mmol/L/h), with a median period of decrease of 36 h. The proportion of 6-h periods in which the rate of natremia correction exceeded 0.5 mmol/L/h or 1 mmol/L/h was low, at 8% and 3%, respectively, and ICPs remained stable. After adjusting for IMPACT score and injury severity score, desmopressin administration was independently associated with increased 60-day mortality [HR 1.83 (1.05–3.24), p = 0.03]. Conclusions In severe TBI, desmopressin administration, potentially representing instances of diabetes insipidus, is common and is independently associated with increased mortality. Desmopressin doses vary markedly among ICUs; however, the associated decrease in natremia rarely exceeds recommended rates and median ICP values remain unchanged. These findings support the notion that desmopressin therapy is safe.
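The study's key safety check, whether the rate of sodium correction in any 6-hour period exceeded 0.5 or 1 mmol/L/h, can be sketched in a few lines. This is an illustration only: the function names and the sodium values below are hypothetical, not taken from the study data.

```python
# Sketch: hourly rate of serum sodium change from 6-hourly measurements,
# and a count of periods exceeding a correction-rate threshold.
def natremia_rates(sodium_mmol_l, interval_h=6.0):
    """Hourly rate of change (mmol/L/h) for each consecutive pair
    of 6-hourly serum sodium measurements."""
    return [(b - a) / interval_h for a, b in zip(sodium_mmol_l, sodium_mmol_l[1:])]

def periods_exceeding(rates, threshold=0.5):
    """Count 6-h periods whose absolute rate exceeds the threshold."""
    return sum(1 for r in rates if abs(r) > threshold)

sodium = [148.0, 147.5, 146.8, 143.0, 142.5]  # hypothetical 6-hourly values
rates = natremia_rates(sodium)
flagged = periods_exceeding(rates)             # periods above 0.5 mmol/L/h
```

With these hypothetical values, only the 146.8 → 143.0 drop (about 0.63 mmol/L/h) exceeds the 0.5 mmol/L/h threshold.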
|
25
|
Economic assessment of aerated constructed treatment wetlands using whole life costing. WATER SCIENCE AND TECHNOLOGY : A JOURNAL OF THE INTERNATIONAL ASSOCIATION ON WATER POLLUTION RESEARCH 2019; 80:75-85. [PMID: 31461424 DOI: 10.2166/wst.2019.246] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
There is increasing pressure on water treatment practitioners to demonstrate and deliver best value and sustainability for the end user. The aim of this paper is to evaluate the sustainability and economics, using whole life costing, of wastewater treatment technologies used in small community wastewater treatment works (WwTW) of <2,000 population equivalent (PE). Three comparable wastewater treatment technologies - a saturated vertical flow (SVF) aerated wetland, a submerged aerated filter (SAF) and a rotating biological contactor (RBC) - were compared using whole life cost (WLC) assessment. The study demonstrates that the CAPEX of a technology or asset is only a small proportion of the WLC over its operational life. For example, the CAPEX of the SVF aerated wetland scenario presented here is up to 74% (mean = 66 ± 6%) less than the cumulative WLC over a 40-year operational time scale, which demonstrates that, when comparing technology economics, the most cost-effective solution is one that considers both CAPEX and OPEX. The WLC assessment results indicate that over 40 years, the SVF aerated wetland and RBC technologies have comparable net present value (NPV) WLCs, which are significantly below those identified for SAF systems for treatment of wastewater from communities of <1,000 PE. For systems designed to treat wastewater from communities of >1,000 PE, the SVF aerated wetland was more economical over 40 years, followed by the RBC and then the SAF. The aerated wetland technology can therefore potentially deliver long-term cost benefits and reduced payback periods compared to alternative treatment technologies for treating wastewater from small communities.
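The whole-life-cost comparison rests on a standard net-present-value sum: initial CAPEX plus each year's OPEX discounted back to the present. A minimal sketch follows; all cost figures and the discount rate are hypothetical (the abstract does not state the values or the rate used).

```python
# Sketch: NPV whole-life cost = CAPEX + sum of discounted annual OPEX.
def npv_whole_life_cost(capex, annual_opex, years=40, discount_rate=0.035):
    """Discounted whole-life cost over the operating period (same units as inputs)."""
    return capex + sum(annual_opex / (1 + discount_rate) ** t
                       for t in range(1, years + 1))

# Hypothetical scenario: higher-CAPEX, lower-OPEX wetland vs the reverse.
wetland = npv_whole_life_cost(capex=250_000, annual_opex=8_000)
saf = npv_whole_life_cost(capex=180_000, annual_opex=22_000)
capex_share = 250_000 / wetland  # CAPEX as a fraction of the 40-year WLC
```

Even with these made-up numbers, the pattern the paper describes emerges: the low-OPEX option wins over 40 years despite higher CAPEX, and CAPEX is well under the cumulative WLC.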
|
26
|
Hydrologic similarity to reference wetlands does not lead to similar plant communities in restored wetlands. Restor Ecol 2019. [DOI: 10.1111/rec.12964] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
|
27
|
Wood chip soil amendments in restored wetlands affect plant growth by reducing compaction and increasing dissolved phenolics. Restor Ecol 2019. [DOI: 10.1111/rec.12942] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
|
28
|
Erythropoietin in traumatic brain injury associated acute kidney injury: A randomized controlled trial. Acta Anaesthesiol Scand 2019; 63:200-207. [PMID: 30132785 DOI: 10.1111/aas.13244] [Citation(s) in RCA: 21] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2018] [Accepted: 07/29/2018] [Indexed: 12/17/2022]
Abstract
BACKGROUND Acute kidney injury (AKI) in traumatic brain injury (TBI) is poorly understood and it is unknown if it can be attenuated using erythropoietin (EPO). METHODS Pre-planned analysis of patients included in the EPO-TBI (ClinicalTrials.gov NCT00987454) trial who were randomized to weekly EPO (40 000 units) or placebo (0.9% sodium chloride) subcutaneously up to three doses or until intensive care unit (ICU) discharge. Creatinine levels and urinary output (up to 7 days) were categorized according to the Kidney Disease Improving Global Outcome (KDIGO) classification. Severity of TBI was categorized with the International Mission for Prognosis and Analysis of Clinical Trials in TBI. RESULTS Of 3348 screened patients, 606 were randomized and 603 were analyzed. Of these, 82 (14%) patients developed AKI according to KDIGO (60 [10%] with KDIGO 1, 11 [2%] patients with KDIGO 2, and 11 [2%] patients with KDIGO 3). Male gender (hazard ratio [HR] 4.0 95% confidence interval [CI] 1.4-11.2, P = 0.008) and severity of TBI (HR 1.3 95% CI 1.1-1.4, P < 0.001 for each 10% increase in risk of poor 6 month outcome) predicted time to AKI. KDIGO stage 1 (HR 8.8 95% CI 4.5-17, P < 0.001), KDIGO stage 2 (HR 13.2 95% CI 3.9-45.2, P < 0.001) and KDIGO stage 3 (HR 11.7 95% CI 3.5-39.7, P < 0.005) predicted time to mortality. EPO did not influence time to AKI (HR 1.08 95% CI 0.7-1.67, P = 0.73) or creatinine levels during ICU stay (P = 0.09). CONCLUSIONS Acute kidney injury is more common in male patients and those with severe compared to moderate TBI and appears associated with worse outcome. EPO does not prevent AKI after TBI.
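The KDIGO stages used above are defined partly by the ratio of serum creatinine to baseline. As a rough illustration only, a simplified version of the creatinine-ratio rule can be coded as below; the full KDIGO criteria also use absolute creatinine increments, urine output, and renal replacement therapy, which this sketch omits.

```python
# Simplified sketch of KDIGO AKI staging from the creatinine-to-baseline
# ratio alone (stage 1: >=1.5x, stage 2: >=2.0x, stage 3: >=3.0x).
def kdigo_stage(creatinine, baseline):
    """Return KDIGO stage 0-3 from serum creatinine relative to baseline."""
    ratio = creatinine / baseline
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1
    return 0
```

For example, a creatinine of 90 µmol/L against a baseline of 60 µmol/L (a 1.5x rise) would be stage 1 under this rule.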
|
29
|
A new approach for hydrologic performance standards in wetland mitigation. JOURNAL OF ENVIRONMENTAL MANAGEMENT 2019; 231:1154-1163. [PMID: 30602240 DOI: 10.1016/j.jenvman.2018.11.001] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/18/2018] [Revised: 10/30/2018] [Accepted: 11/01/2018] [Indexed: 06/09/2023]
Abstract
Wetland restoration performed as a requirement of compensatory mitigation does not always replace lost acreage or functions. Most new projects are required to identify performance standards to evaluate restoration outcomes. Current performance standards are primarily related to vegetation, with little to no evaluation of wetland hydrologic regimes. Because of the agreement in the scientific literature about the role of hydrology in creating and maintaining wetland structure and function, hydrologic performance standards may be an ecologically meaningful way to evaluate restoration outcomes. This research tests the use of water level data from project-specific reference sites to evaluate restored water levels for three distinct wetland types across the United States. We analyzed existing datasets from past and ongoing wetland mitigation projects to identify the number of years it took water levels in restored wetlands to match reference sites, and to test whether similar water levels between restored and reference sites lead to increased vegetation success. Wetland types differed in the number of years it took for water levels to match reference sites. Vernal pools in California took nine years to match reference sites, fens and wet meadows in Colorado took four years, and forested wetlands in the southeastern US were hydrologically similar to reference sites the first year following restoration. Plant species cover in all three restored wetland types was related to water level similarity to reference sites. Native cover was higher where water levels were more similar to reference sites and lower in areas where water levels differed. Exotic species cover showed the opposite relationship in fens and wet meadows, where hydrologic similarity led to low cover of exotic species.
Along with the general agreement on the importance of hydrology for wetland form and function, this research shows that hydrologic performance standards may also lead to increased vegetation success in some wetland types.
|
30
|
Irrigation canals are newly created streams of semi-arid agricultural regions. THE SCIENCE OF THE TOTAL ENVIRONMENT 2019; 646:770-781. [PMID: 30064103 DOI: 10.1016/j.scitotenv.2018.07.246] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/30/2018] [Revised: 07/16/2018] [Accepted: 07/17/2018] [Indexed: 06/08/2023]
Abstract
The natural hydrologic processes that create and maintain the diversity of aquatic and riparian habitats along the world's streams and rivers have been profoundly altered by humans. Diversion of surface water to support production agriculture in arid and semi-arid regions has degraded ecosystems but has also created potential habitat along and in canals specifically designed to transport water. The prevalence of canals and the immense amount of water used for agriculture have created these new artificial stream systems. This study demonstrates the potential for irrigation canals to support riparian and aquatic communities similar to natural streams in urban/residential and agricultural landscapes. We examined the hydrological and ecological characteristics of streams and irrigation canals in urban and agricultural landscapes in northeastern Colorado, typical of regions dominated by irrigation-supported agriculture. Flow patterns in canals depended on canal size and exhibited a range of behaviors with potential ecological consequences, such as rapidly rising and falling water stage, intermittent dry periods, and delayed peak and base flows compared to natural streams. Despite these hydrologic differences, the taxonomic and functional composition of riparian plant and aquatic macroinvertebrate communities indicated that ecological similarities exist between streams and canals, but depend, in part, on their landscape setting, with stronger similarities in agricultural areas. We also tested the influence of characterizing taxa by functional groups based on physiology, ecology, and life-history traits to explore habitat attributes including woody canopy structure and water quality. We used a Habitat Quality Index (HQI) that combined physical and biological measures into a single index. Streams scored higher on average within agricultural and urban/residential settings compared to canals; however, one third of urban canals scored above the average of agricultural streams.
This multidisciplinary study shows that irrigation canals can be valuable riparian and aquatic habitat, especially in regions with severely degraded streams.
|
31
|
Changes in bone mineral density in women before critical illness: a matched control nested cohort study. Arch Osteoporos 2018; 13:119. [PMID: 30397732 DOI: 10.1007/s11657-018-0533-6] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 06/24/2018] [Accepted: 10/19/2018] [Indexed: 02/03/2023]
Abstract
UNLABELLED The contribution of premorbid bone health to accelerated bone loss following critical illness is unknown. This study compared bone density in women before critical illness to that in women who did not become critically ill. Overall bone density was similar, although femoral neck bone mass increased immediately prior to critical illness. PURPOSE The relative contribution of acute and chronic factors to accelerated loss of bone mineral density (BMD) following critical illness is unknown. This study compared the BMD trajectory of women before critical illness to the BMD trajectory of women who did not become critically ill. METHODS This prospective, nested, age- and medication-matched, case-control study compared the trajectory of BMD in women in the Geelong Osteoporosis Study (GOS) requiring admission to an Australian intensive care unit (ICU) between June 1998 and March 2016 to that of women not admitted to ICU. The main outcome was the age- and medication-use-adjusted change in BMD. RESULTS A total of 52 women, with a mean age of 77 ± 9 years, were admitted to ICU, predominantly post-surgery (75%), during the study period. A greater age-adjusted annual rate of decline was observed for pre-ICU women compared to no-ICU women for AP spine BMD (-0.010 ± 0.002 g/cm2 vs -0.005 ± 0.002 g/cm2, p = 0.01) over the 15-year study period. In participants with multiple BMD measurements in the 2 years before critical illness, a significantly greater increase in femoral neck BMD compared to age- and medication-matched controls was observed (difference in BMD, ICU vs no-ICU = 0.037 ± 0.013 g/cm2, p = 0.006). CONCLUSION In a cohort of women with predominantly surgical ICU admissions, bone health prior to critical illness was comparable to that of age- and medication-matched controls, with a relative increase in femoral neck bone mass immediately prior to critical illness. These findings suggest critical illness-related bone loss cannot be entirely explained as a continuation of the pre-morbid bone trajectory.
|
32
|
A Dose Ranging Study to Evaluate Dermatan Sulphate in Preventing Deep Vein Thrombosis following Total Hip Arthroplasty. Thromb Haemost 2018. [DOI: 10.1055/s-0038-1648963] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/28/2022]
Abstract
Dermatan sulphate catalyses thrombin inhibition by heparin cofactor II; it has a lower haemorrhagic-to-antithrombotic ratio than heparin in animal models. Consecutive patients aged forty years or more, electively undergoing total hip replacement under general anaesthesia, were randomly allocated to one of three dosage regimens of dermatan sulphate (MF701, Mediolanum Farmaceutici) given intramuscularly. These were 200 mg once daily (n = 50), 200 mg twice daily (n = 52) and 300 mg twice daily (n = 51), administered from twenty-four hours pre-operatively until the tenth postoperative day. The overall incidence of DVT assessed by bilateral venography was 53%, 51% and 34% respectively (Chi-square test for trend p = 0.06). The incidence of major proximal DVT was 10.6%, 8.5% and 2.1% respectively. Pulmonary embolism (PE) and bleeding were assessed in all 153 patients. There was one case of PE in each dose group. The incidence of bleeding episodes, volume of blood lost and blood transfusion requirements were low and showed no increase with increasing dose. The patients were followed up 4-8 weeks after discharge. We conclude that the two lower doses were subtherapeutic in this population; however, dermatan sulphate given at 300 mg twice daily proved to be efficacious, with an incidence of proximal major DVT of 2.1% and a low incidence of bleeding complications. A trial of dermatan sulphate 300 mg twice daily compared with standard prophylactic agents is needed.
|
33
|
Evaluating Success of Alternative Restoration Methods for Riparian Willows: Seeding and Ungulate Exclosures. ECOL RESTOR 2018. [DOI: 10.3368/er.36.2.127] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/03/2022]
|
34
|
|
35
|
Effect of age of red cells for transfusion on patient outcomes: a systematic review and meta-analysis. Transfus Med Rev 2018. [DOI: 10.1016/j.tmrv.2018.02.002] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
|
36
|
Day or overnight transfusion in critically ill patients: does it matter? Vox Sang 2018; 113:275-282. [PMID: 29392786 DOI: 10.1111/vox.12635] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/19/2017] [Revised: 12/02/2017] [Accepted: 01/04/2018] [Indexed: 11/27/2022]
Abstract
BACKGROUND AND OBJECTIVES The timing of blood administration in critically ill patients is first driven by patients' needs. This study aimed to define the epidemiology and significance of overnight transfusion in critically ill patients. MATERIALS AND METHODS This is a post hoc analysis of a prospective multicentre observational study including 874 critically ill patients receiving red blood cells, platelets, fresh frozen plasma (FFP) or cryoprecipitate. Characteristics of patients receiving blood only during the day (8 am to 8 pm) were compared to those receiving blood only overnight (8 pm to 8 am). Characteristics of transfusion were compared, and factors independently associated with major bleeding were analysed. RESULTS The 287 patients transfused during the day only had similar severity and mortality to the 258 receiving blood products overnight only. Although bleeding-related admission diagnoses were similar, major bleeding was the indication for transfusion in 12% of patients transfused in daytime only versus 30% of patients transfused at night only (P < 0·001). Similar total numbers of blood products were transfused during the day and overnight (2856 versus 2927); however, patients were more likely to receive FFP and cryoprecipitate at night than during the day. Overnight transfusion was independently associated with increased odds of major bleeding (odds ratio, 3·16; 95% confidence interval, 2·00-5·01). CONCLUSION Transfusion occurs evenly across day and night in the ICU; nonetheless, there are differences in the types of blood products administered that reflect differences in indication. Critically ill patients were more likely to receive blood for major bleeding at night irrespective of admission diagnosis.
|
37
|
Vegetation response to invasive Tamarix control in southwestern U.S. rivers: a collaborative study including 416 sites. ECOLOGICAL APPLICATIONS : A PUBLICATION OF THE ECOLOGICAL SOCIETY OF AMERICA 2017; 27:1789-1804. [PMID: 28445000 DOI: 10.1002/eap.1566] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/07/2016] [Revised: 02/21/2017] [Accepted: 04/12/2017] [Indexed: 06/07/2023]
Abstract
Most studies assessing vegetation response following control of invasive Tamarix trees along southwestern U.S. rivers have been small in scale (e.g., river reach), or at a regional scale but with poor spatial-temporal replication, and most have not tested the effects of a now widely used biological control. We monitored plant composition following Tamarix control along hydrologic, soil, and climatic gradients in 244 treated and 172 reference sites across six U.S. states. This represents the largest comprehensive assessment to date of the vegetation response to the four most common Tamarix control treatments. Biocontrol by a defoliating beetle (treatment 1) reduced the abundance of Tamarix less than active removal using hand and chain saws (2), heavy machinery (3) or burning (4). Tamarix abundance also decreased with lower temperatures, higher precipitation, and follow-up treatments for Tamarix resprouting. Native cover generally increased over time in active Tamarix removal sites; however, the increases observed were small and were not consistently enhanced by active revegetation. Overall, native cover was correlated with permanent stream flow, lower grazing pressure, lower soil salinity and temperatures, and higher precipitation. Species diversity also increased where Tamarix was removed. However, Tamarix treatments, especially those generating the highest disturbance (burning and heavy machinery), also often promoted secondary invasions of exotic forbs. The abundance of hydrophytic species was much lower in treated than in reference sites, suggesting that management of southwestern U.S. rivers has focused too much on weed control, overlooking restoration of the fluvial processes that provide habitat for hydrophytic and floodplain vegetation. These results can help inform future management of Tamarix-infested rivers to restore hydrogeomorphic processes, increase native biodiversity and reduce the abundance of noxious species.
|
38
|
The varieties of contemplative experience: A mixed-methods study of meditation-related challenges in Western Buddhists. PLoS One 2017; 12:e0176239. [PMID: 28542181 PMCID: PMC5443484 DOI: 10.1371/journal.pone.0176239] [Citation(s) in RCA: 125] [Impact Index Per Article: 17.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/02/2016] [Accepted: 04/02/2017] [Indexed: 11/18/2022] Open
Abstract
Buddhist-derived meditation practices are currently being employed as a popular form of health promotion. While meditation programs draw inspiration from Buddhist textual sources for the benefits of meditation, these sources also acknowledge a wide range of other effects beyond health-related outcomes. The Varieties of Contemplative Experience study investigates meditation-related experiences that are typically underreported, particularly experiences that are described as challenging, difficult, distressing, functionally impairing, and/or requiring additional support. A mixed-methods approach featured qualitative interviews with Western Buddhist meditation practitioners and experts in Theravāda, Zen, and Tibetan traditions. Interview questions probed meditation experiences and influencing factors, including interpretations and management strategies. A follow-up survey provided quantitative assessments of causality, impairment and other demographic and practice-related variables. The content-driven thematic analysis of interviews yielded a taxonomy of 59 meditation-related experiences across 7 domains: cognitive, perceptual, affective, somatic, conative, sense of self, and social. Even in cases where the phenomenology was similar across participants, interpretations of and responses to the experiences differed considerably. The associated valence ranged from very positive to very negative, and the associated level of distress and functional impairment ranged from minimal and transient to severe and enduring. In order to determine what factors may influence the valence, impact, and response to any given experience, the study also identified 26 categories of influencing factors across 4 domains: practitioner-level factors, practice-level factors, relationships, and health behaviors. 
By identifying a broader range of experiences associated with meditation, along with the factors that contribute to the presence and management of experiences reported as challenging, difficult, distressing or functionally impairing, this study aims to increase our understanding of the effects of contemplative practices and to provide resources for meditators, clinicians, meditation researchers, and meditation teachers.
|
39
|
Fibrinogen is an independent predictor of mortality in major trauma patients: A five-year statewide cohort study. Injury 2017; 48:1074-1081. [PMID: 28190583 DOI: 10.1016/j.injury.2016.11.021] [Citation(s) in RCA: 90] [Impact Index Per Article: 12.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/24/2016] [Revised: 10/11/2016] [Accepted: 11/19/2016] [Indexed: 02/02/2023]
Abstract
INTRODUCTION Fibrinogen may be reduced following traumatic injury due to loss from haemorrhage, increased consumption and reduced synthesis. In the absence of clinical trials, guidelines for fibrinogen replacement are based on expert opinion and vary internationally. We aimed to determine the prevalence and predictors of low fibrinogen on admission in major trauma patients and to investigate the association of fibrinogen levels with patient outcomes. PATIENTS AND METHODS Data on all major trauma patients (January 2007-July 2011) identified through a prospective statewide trauma registry in Victoria, Australia were linked with laboratory and transfusion data. Major trauma included any of the following: death after injury, injury severity score (ISS) >15, admission to an intensive care unit requiring mechanical ventilation, or urgent surgery for intrathoracic, intracranial or intra-abdominal procedures or fixation of pelvic or spinal fractures. Associations between initial fibrinogen level and in-hospital mortality were analysed using multiple logistic regression. RESULTS Of 4773 patients identified, 114 (2.4%) had fibrinogen less than 1 g/L, 283 (5.9%) 1.0-1.5 g/L, 617 (12.9%) 1.6-1.9 g/L, 3024 (63.4%) 2-4 g/L and 735 (15%) >4 g/L. Median fibrinogen was 2.6 g/L (interquartile range 2.1-3.4). After adjusting for age, gender, ISS, injury type, pH, temperature, Glasgow Coma Score (GCS), initial international normalised ratio and platelet count, the lowest fibrinogen categories, compared with the normal range, were associated with increased in-hospital mortality (adjusted odds ratio [OR] for less than 1 g/L 3.28 [95% CI 1.71-6.28, p<0.01], for 1-1.5 g/L 2.08 [95% CI 1.36-3.16, p<0.01] and for 1.6-1.9 g/L 1.39 [95% CI 0.97-2.00, p=0.08]). Predictors of initial fibrinogen <1.5 g/L were younger age, lower GCS, systolic blood pressure <90 mmHg, chest decompression, penetrating injury, ISS >25, and lower pH and temperature.
CONCLUSIONS Initial fibrinogen levels less than the normal range are independently associated with higher in-hospital mortality in major trauma patients. Future studies are warranted to investigate whether earlier and/or greater fibrinogen replacement improves clinical outcomes.
|
40
|
Cardiac magnetic resonance imaging in suspected blunt cardiac injury: A prospective, pilot, cohort study. Injury 2017; 48:1013-1019. [PMID: 28318537 DOI: 10.1016/j.injury.2017.02.025] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 09/09/2016] [Revised: 02/16/2017] [Accepted: 02/23/2017] [Indexed: 02/02/2023]
Abstract
INTRODUCTION The aim of this study was to evaluate the incidence and severity of blunt cardiac injury (BCI) as determined by cardiac magnetic resonance imaging (CMR), and to compare this to currently used diagnostic methods in severely injured patients. MATERIALS AND METHODS We conducted a prospective, pilot cohort study of 42 major trauma patients from July 2013 to Jan 2015. The cohort underwent CMR within 7 days, enrolling 21 patients with evidence of chest injury and an elevated Troponin I compared to 21 patients without chest injury who acted as controls. Major adverse cardiac events (MACE) including ventricular arrhythmia, unexplained hypotension requiring inotropes, or a requirement for cardiac surgery were recorded. RESULTS 6/21 (28%) patients with chest injuries had abnormal CMR scans, while all 21 control patients had normal scans. CMR abnormalities included myocardial oedema, regional wall motion abnormalities, and myocardial haemorrhage. The left ventricle was the commonest site of injury (5/6), followed by the right ventricle (2/6) and tricuspid valve (1/6). MACE occurred in 5 patients. Sensitivity and specificity values for CMR at predicting MACE were 60% (15-95) and 81% (54-96), which compared favourably with other tests. CONCLUSION In this pilot trial, CMR was found to give detailed anatomic information of myocardial injury in patients with suspected BCI, and may have a role in the diagnosis and management of patients with suspected BCI.
|
41
|
The association of time and medications with changes in bone mineral density in the 2 years after critical illness. CRITICAL CARE : THE OFFICIAL JOURNAL OF THE CRITICAL CARE FORUM 2017; 21:69. [PMID: 28327171 PMCID: PMC5361814 DOI: 10.1186/s13054-017-1657-6] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/05/2016] [Accepted: 02/28/2017] [Indexed: 11/30/2022]
Abstract
Background Critical illness is associated with an increased risk of fragility fracture and loss of bone mineral density (BMD), although the impacts of medication exposures (bone anti-fracture therapy or glucocorticoids) and of time remain unexplored. The objective of this study was to describe the association of time after ICU admission, and of post-ICU administration of bone anti-fracture therapy or glucocorticoids after critical illness, with change in BMD. Methods In this prospective observational study, conducted in a tertiary hospital ICU, we studied adult patients requiring mechanical ventilation for at least 24 hours and measured BMD annually for 2 years after ICU discharge. We performed mixed linear modelling to describe the association of time, and of post-ICU administration of anti-fracture therapy or glucocorticoids, with annualised change in BMD. Results Ninety-two participants with a mean age of 63 (±15) years had at least one BMD assessment after ICU discharge. In women, a greater loss of spine BMD occurred in the first year after critical illness (year 1: -1.1 ± 2.0% vs year 2: 3.0 ± 1.7%, p = 0.02), and anti-fracture therapy use was associated with reduced loss of BMD (femur 3.1 ± 2.4% vs -2.8 ± 1.7%, p = 0.04; spine 5.1 ± 2.5% vs -3.2 ± 1.8%, p = 0.01). In men, anti-fracture therapy and glucocorticoid use were not associated with change in BMD, and a greater decrease in BMD occurred in the second year after critical illness (year 1: -0.9 ± 2.1% vs year 2: -2.5 ± 2.1%, p = 0.03). Conclusions In women, a greater loss of spine BMD was observed in the first year after critical illness, and anti-fracture therapy use was associated with an increase in BMD. In men, BMD loss increased in the second year after critical illness. Anti-fracture therapy may be an effective intervention to prevent bone loss in women after critical illness.
Electronic supplementary material The online version of this article (doi:10.1186/s13054-017-1657-6) contains supplementary material, which is available to authorized users.
|
42
|
Alteration of hydrogeomorphic processes by invasive beavers in southern South America. THE SCIENCE OF THE TOTAL ENVIRONMENT 2017; 574:183-190. [PMID: 27636003 DOI: 10.1016/j.scitotenv.2016.09.045] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/12/2016] [Revised: 09/06/2016] [Accepted: 09/07/2016] [Indexed: 06/06/2023]
Abstract
The North American beaver (Castor canadensis) is an invasive species in southern Patagonia, introduced in 1946 as part of a program by the Argentine government to augment furbearers. Research focus has turned from inventorying the beaver's population and ecosystem impacts toward eradicating it from the region and restoring degraded areas. Successful restoration, however, requires a fuller determination of how beavers have altered physical landscape characteristics, and of which landscape features and biota need to be restored. Our goal was to identify changes to the physical landscape caused by invasive beavers. We analyzed channel and valley morphology in detail at one site in each of the three major forest zones occurring on the Argentine side of Tierra del Fuego's main island. We also assessed 48 additional sites across the three forest biomes on the island to identify the broader range of aquatic habitat occupied and modified by beavers. Beavers built dams from Nothofagus tree branches on streams, which triggered mineral sediment accretion processes in the riparian zone, but only at a few sites and not in ways consistent with beaver meadow theory. At the majority of sites, beavers actively excavated peat and mineral sediment, moved thousands of cubic meters of sediment within their occupied landscapes, and used it to build dams. Beavers were also common in fen ecosystems, where pond formation inundated and drowned peat-forming mosses and sedges and triggered a massive invasion of exotic plant species. These results highlight that restoration of fen ecosystems is a previously unrecognized but pressing and challenging restoration need, in addition to reforestation of Nothofagus riparian forests. We recommend that decision-makers include the full ecosystem diversity of the Fuegian landscape in their beaver eradication and ecosystem restoration plans.
|
43
|
Improving outcomes for hospital patients with critical bleeding requiring massive transfusion: the Australian and New Zealand Massive Transfusion Registry study methodology. BMC Res Notes 2016; 9:457. [PMID: 27716381 PMCID: PMC5052932 DOI: 10.1186/s13104-016-2261-6] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2016] [Accepted: 09/27/2016] [Indexed: 12/28/2022] Open
Abstract
Background The Australian and New Zealand (ANZ) Massive Transfusion (MT) Registry (MTR) has been established to improve the quality of care of patients with critical bleeding (CB) requiring MT (≥5 units of red blood cells (RBC) over 4 h). The MTR is providing data to: (1) improve the evidence base for transfusion practice by systematically collecting data on transfusion practice and clinical outcomes; (2) monitor variations in practice and provide an opportunity for benchmarking and feedback on practice and blood product use; (3) inform blood supply planning, inventory management and the development of future clinical trials; and (4) measure and enhance the translation of evidence into policy and patient blood management guidelines. The MTR commenced in 2011. At each participating site, all eligible patients aged ≥18 years with CB from any clinical context receiving MT are included using a waived consent model. Patient information and clinical coding, transfusion history, and laboratory test results are extracted for each patient’s hospital admission at the episode level. Results Thirty-two hospitals have enrolled and 3566 MT patients have been identified across Australia and New Zealand between 2011 and 2015. The majority of CB contexts are surgical, followed by trauma and gastrointestinal haemorrhage. Validation studies have verified that the definition of MT used in the registry correctly identifies 94% of CB events, and that the median time of transfusion for the majority of fresh products is the ‘product event issue time’ from the hospital blood bank plus 20 min. Data linkage between the MTR and mortality databases in Australia and New Zealand will allow comparisons of risk-adjusted mortality estimates across different bleeding contexts and between countries. Data extracts will be examined to determine whether there are differences in patient outcomes according to transfusion practice. The ratios of blood components (e.g. FFP:RBC) used in different types of critical bleeding will also be investigated. Conclusions The MTR is generating data with the potential to have an impact on management and policy decision-making in CB and MT, and to provide benchmarking and monitoring tools for immediate application.
|
44
|
|
45
|
Abstract
RATIONALE Critical illness may be associated with increased bone turnover and loss of bone mineral density (BMD). Prospective evidence describing long-term changes in BMD after critical illness is needed to further define this relationship. OBJECTIVES To measure the change in BMD and bone turnover markers (BTMs) in subjects 1 year after critical illness compared with population-based control subjects. METHODS We studied adult patients admitted to a tertiary intensive care unit (ICU) who required mechanical ventilation for at least 24 hours. We measured clinical characteristics, BTMs, and BMD during admission and 1 year after ICU discharge. We compared change in BMD to age- and sex-matched control subjects from the Geelong Osteoporosis Study. MEASUREMENTS AND MAIN RESULTS Sixty-six patients completed BMD testing. BMD decreased significantly in the year after critical illness at both femoral neck and anterior-posterior spine sites. The annual decrease was significantly greater in the ICU cohort compared with matched control subjects (anterior-posterior spine, -1.59%; 95% confidence interval, -2.18 to -1.01; P < 0.001; femoral neck, -1.20%; 95% confidence interval, -1.69 to -0.70; P < 0.001). There was a significant increase in 10-year fracture risk for major fractures (4.85 ± 5.25 vs. 5.50 ± 5.52; P < 0.001) and hip fractures (1.57 ± 2.40 vs. 1.79 ± 2.69; P = 0.001). The pattern of bone resorption markers was consistent with accelerated bone turnover. CONCLUSIONS Critically ill individuals experience a significantly greater decrease in BMD in the year after admission compared with population-based control subjects. Their bone turnover biomarker pattern is consistent with an increased rate of bone loss.
|
46
|
Abstract
This article focuses on rationalization, its dimensions, the possibilities of reasoned justification in the public sphere, and the technologies that would operationalize this. It does so through an analysis of the introduction of performance measurement in the Provincial Government of Alberta, Canada. We argue that performance measurement represents twin dimensions of rationalization: the pursuit of reason in human affairs, that is, the process of bringing to light the justifications by which actions and policies are pursued; and rationalization as the increasing dominance of a means-end instrumental rationality. The article illustrates how an initial enthusiasm by managers for the performance management initiatives was replaced with scepticism and cynicism. We show how the potential for reasoned justification was frustrated in practice, through a growing disparity between a discourse of reasoned justification and the practical operationalization of mechanisms of business planning and performance measurement. The search for reasoned justification and instrumental mastery are part of the same rationalization process, and these two contradictory, but inherently connected forces are an important explanation of the dynamics of managers' responses to organizational change.
|
47
|
Nitrification cessation and recovery in an aerated saturated vertical subsurface flow treatment wetland: Field studies and microscale biofilm modeling. BIORESOURCE TECHNOLOGY 2016; 209:125-132. [PMID: 26967335 DOI: 10.1016/j.biortech.2016.02.065] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/31/2015] [Revised: 02/16/2016] [Accepted: 02/17/2016] [Indexed: 06/05/2023]
Abstract
In aerated treatment wetlands, oxygen availability is not a limiting factor in sustaining a high level of nitrification during wastewater treatment. In the case of an air blower failure, nitrification would cease, potentially causing adverse effects on the nitrifying bacteria. A field trial was completed investigating nitrification loss when aeration is switched off, and the rate at which the system recovers after aeration is switched back on. Loss of dissolved oxygen was observed to be more rapid than loss of nitrification. Nitrate was observed in the effluent long after the aeration was switched off (>48 h). A complementary modeling study predicted nitrate diffusion out of the biofilm over a 48-h period. After two weeks without aeration in the established system, nitrification recovered within two days, whereas nitrification establishment in a new system was previously observed to require 20-45 days. These results suggest that, once established, resident nitrifying microbial communities are quite robust.
|
48
|
Short-term effect of nitrogen addition on nitric oxide emissions from an alpine meadow in the Tibetan Plateau. ENVIRONMENTAL SCIENCE AND POLLUTION RESEARCH INTERNATIONAL 2016; 23:12474-12479. [PMID: 27146528 DOI: 10.1007/s11356-016-6763-5] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/13/2015] [Accepted: 04/25/2016] [Indexed: 06/05/2023]
Abstract
Little information is available on nitric oxide (NO) fluxes from alpine ecosystems. We measured NO fluxes in control and nitrogen (N) addition (NH₄NO₃, 6 g N m⁻² year⁻¹) plots from early June through October 2013 in an alpine meadow on the Tibetan Plateau, China. During the sampling period, NO fluxes varied from −0.71 to 3.12 μg m⁻² h⁻¹ in the control plots and from −0.46 to 7.54 μg m⁻² h⁻¹ in the N-treatment plots. The mean NO emission in the N addition plots (1.68 μg m⁻² h⁻¹) was 2.15 times that of the control plots (0.78 μg m⁻² h⁻¹), indicating that alpine meadows may be a source of atmospheric NO and that N additions stimulated NO flux. A positive correlation was found between NO flux and soil temperature, water-filled pore space (WFPS) and nitrate (NO₃⁻-N) content, but no correlation with soil ammonium (NH₄⁺-N). These results suggest that denitrification is a principal process producing NO flux from alpine meadows.
|
49
|
Strategic Alliances within a Big-Six Accounting Firm. INTERNATIONAL STUDIES OF MANAGEMENT & ORGANIZATION 2016. [DOI: 10.1080/00208825.1996.11656681] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
|
50
|
Supplemental parenteral nutrition in critically ill patients: a study protocol for a phase II randomised controlled trial. Trials 2015; 16:587. [PMID: 26703919 PMCID: PMC4690293 DOI: 10.1186/s13063-015-1118-y] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2015] [Accepted: 12/14/2015] [Indexed: 01/23/2023] Open
Abstract
Background Nutrition is one of the fundamentals of care provided to critically ill adults. The volume of enteral nutrition received, however, is often much less than prescribed due to multiple functional and process issues. To deliver the prescribed volume and correct the energy deficit associated with enteral nutrition alone, parenteral nutrition can be used in combination (termed “supplemental parenteral nutrition”), but the benefits of this method have not been firmly established. A multi-centre, randomised, clinical trial is currently underway to determine whether prescribed energy requirements can be delivered to critically ill patients using a supplemental parenteral nutrition strategy. Methods/design This prospective, multi-centre, randomised, stratified, parallel-group, controlled, phase II trial aims to determine whether a supplemental parenteral nutrition strategy will reliably and safely increase energy intake when compared to usual care. The study will enrol 100 critically ill adults with at least one organ system failure and evidence of insufficient enteral intake from six intensive care units in Australia and New Zealand. Enrolled patients will be allocated either to a supplemental parenteral nutrition strategy for 7 days post randomisation or to usual care with enteral nutrition. The primary outcome will be the average amount of energy delivered from nutrition therapy over the first 7 days of the study period. Secondary outcomes include protein delivery for 7 days post randomisation; total energy and protein delivery, antibiotic use and organ failure rates (up to 28 days); and duration of ventilation and length of intensive care unit and hospital stay. At both intensive care unit and hospital discharge, strength and health-related quality-of-life assessments will be undertaken.
Study participants will be followed up for health-related quality of life, resource utilisation and survival at 90 and 180 days post randomisation (unless death occurs first). Discussion This trial aims to determine if provision of a supplemental parenteral nutrition strategy to critically ill adults will increase energy intake compared to usual care in Australia and New Zealand. Trial outcomes will guide development of a subsequent larger randomised controlled trial. Trial registration NCT01847534 (First registered 5 February 2013, last updated 14 October 2015) Electronic supplementary material The online version of this article (doi:10.1186/s13063-015-1118-y) contains supplementary material, which is available to authorized users.
|