1
Reducing Grip Uncertainty During Initial Prosthetic Hand Use Improves Eye-Hand Coordination and Lowers Mental Workload. J Mot Behav 2024:1-11. [PMID: 38522858 DOI: 10.1080/00222895.2024.2328297] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 03/26/2024]
Abstract
The reliance on vision to control a myoelectric prosthesis is cognitively burdensome and contributes to device abandonment. The feeling of uncertainty when gripping an object is thought to be the cause of this overreliance on vision in hand-related actions. We explored whether experimentally reducing grip uncertainty alters the visuomotor control and mental workload experienced during initial prosthesis use. In a repeated-measures design, twenty-one able-bodied participants took part in a pouring task across three conditions: (a) using their anatomical hand, (b) using a myoelectric prosthetic hand simulator, and (c) using a myoelectric prosthetic hand simulator with Velcro attached to reduce grip uncertainty. Performance, gaze behaviour (using mobile eye-tracking) and self-reported mental workload were measured. Results showed that using a prosthesis (with or without Velcro) slowed task performance, impaired typical eye-hand coordination and increased mental workload compared to anatomical hand control. However, when using the prosthesis with Velcro, participants displayed better prosthesis control, more effective eye-hand coordination and reduced mental workload compared to when using the prosthesis without Velcro. These positive results indicate that reducing grip uncertainty could be a useful tool for encouraging more effective prosthesis control strategies in the early stages of prosthetic hand learning.
2
A service evaluation of Zio XT: the Liverpool experience. Eur Heart J 2022. [DOI: 10.1093/eurheartj/ehac544.2812] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/13/2022]
Abstract
Introduction
The Zio XT is an adhesive, ambulatory heart-rhythm monitoring device that can be worn for up to 14 days. It can be fitted by patients and utilises an artificial intelligence-based algorithm for rhythm analysis, offering potential convenience, accuracy and efficiency advantages over Holter monitors. However, there is a lack of data regarding its efficacy and long-term impact. Thus, until further evidence emerges, NICE guidelines recommend Zio XT as a potential option for those requiring prolonged rhythm monitoring.
Purpose
We evaluated the efficacy of Zio XT for heart rhythm monitoring compared to Holter monitors.
Methods
200 sequential patients who had Holter monitors and 204 who had Zio XT were included. Zio cases were randomly selected over 6 months to avoid the learning-curve effect. Primary outcomes included time to results and the arrhythmia detection rate. Secondary outcomes included the proportion of patients who had undergone heart-rhythm monitoring in the 12 months preceding their investigation, those who required further tests, and rates of outpatient appointments (OPAs) for device fitting and follow-up and of procedures such as device implantation and ablation.
Results
Data from 22 (10.8%) Zio patches were unavailable because devices were lost, not returned or unwearable, so post-investigation outcomes were analysed for 182 Zio and 200 Holter cases. Zio XT was associated with a significantly shorter time to results compared to Holter monitors (median 21 days (interquartile range (IQR) 18–25) vs. 46 days (IQR 37.3–87.8), p<0.001) and a higher detection rate of significant arrhythmias (55.4% vs. 17.5%, p<0.001). 26.5% of Zio patients had had heart-rhythm monitoring in the preceding 12 months, compared with 14.5% in the Holter group (p=0.003); of these Zio-group patients, 55.8% had previously had Holters and 28.8% Zios. A higher proportion of Zio recipients also required repeat tests (19.4% vs. 8.5%, p=0.002). Reasons for this included post-intervention monitoring (44.1%), lack of results due to devices being lost/faulty/not returned (41.2%) and a lack of diagnosis (14.7%). Zio monitoring was associated with a significant reduction in the need for OPAs for fitting (0.5% vs. 96%, p<0.001) and follow-up (70.1% vs. 87.0%, p<0.001), and with a significant increase in ablations (5.9% vs. 1.0%, p=0.005) but not device implantations (5.9% vs. 3.9%, p=0.209).
Conclusion
Our findings indicate that Zio XT is associated with a statistically significant reduction in time to results, higher arrhythmia detection rate and a reduced need for OPAs. We demonstrated a higher rate of both Holter and Zio testing before and Zio testing after these investigations. We postulate that this has partly been due to a learning curve effect with the introduction of a new technology compared to the Holter which has been in use for many decades. Further large-scale evaluation is recommended to yield vital information on management pathways and cost efficacy.
Funding Acknowledgement
Type of funding sources: None.
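The headline detection-rate difference can be reproduced with a standard two-proportion (chi-square) test. The sketch below is illustrative rather than the authors' analysis: the cell counts are reconstructed from the reported percentages (55.4% of the 182 analysed Zio cases ≈ 101 detections; 17.5% of 200 Holter cases = 35) and are therefore approximate.

```python
from scipy.stats import chi2_contingency

# Counts reconstructed (approximately) from the reported percentages:
# significant arrhythmia detected vs. not detected, per device type.
table = [[101, 182 - 101],   # Zio XT: ~55.4% of 182 analysed cases
         [35, 200 - 35]]     # Holter: 17.5% of 200 cases

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.1e}")  # p far below 0.001, consistent with the abstract
```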
3
Evaluation of a training programme for Pharmacist Independent Prescribers in a care home medicine management intervention. BMC Medical Education 2022; 22:551. [PMID: 35840960 PMCID: PMC9287970 DOI: 10.1186/s12909-022-03575-5] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Received: 10/27/2021] [Accepted: 06/16/2022] [Indexed: 06/15/2023]
Abstract
BACKGROUND The provision of independent prescribing rights for United Kingdom (UK) pharmacists has enabled them to prescribe within their area of competence. The aim of this study was to evaluate an evidence-based training programme designed to prepare Pharmacist Independent Prescribers (PIPs) to safely and effectively assume responsibility for pharmaceutical care of older people in care homes in the UK, within a randomised controlled trial. METHODS The training and competency assessment process included two training days, professional development planning against a bespoke competency framework, mentor support, and a viva with an independent General Practitioner (GP). Data on the PIPs' perceptions of the training were collected through evaluation forms immediately after the training days and through online questionnaires and interviews after delivery of the 6-month intervention. Using a mixed-methods approach, each data set was analysed separately and then triangulated, providing a detailed evaluation of the process. Kaufman's Model of Learning Evaluation guided interpretations. RESULTS All 25 PIPs who received the training completed an evaluation form. Post-intervention questionnaires were completed by 16 PIPs, and 14 PIPs took part in interviews. PIPs reported that the training days and mentorship enabled them to develop a personalised portfolio of competence in preparation for discussion during a viva with an independent GP. Contact with the mentor reduced as PIPs gained confidence in their role. PIPs applied their new learning throughout the delivery of the intervention, leading to perceived improvements in residents' quality of life and medicines management. A few PIPs reported that developing a portfolio of competence was time intensive, and that further training on leadership skills would have been beneficial. CONCLUSIONS The bespoke training programme was fit for purpose. Mentorship and competency assessment were resource intensive but appropriate. An additional benefit was that many PIPs reported professional growth beyond the requirements of the study. TRIAL REGISTRATION The definitive RCT was registered with the ISRCTN registry (registration number ISRCTN17847169).
4
Use of CIED Generated Heart Failure Risk Score (HFRS) Alerts in an Integrated, Multi-Disciplinary Approach to HF Management-A Prospective Cohort Study. Sensors 2022; 22:1825. [PMID: 35270971 PMCID: PMC8914972 DOI: 10.3390/s22051825] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 01/28/2022] [Revised: 02/22/2022] [Accepted: 02/23/2022] [Indexed: 12/24/2022]
Abstract
Aim: To evaluate use of CIED-generated Heart Failure Risk Score (HFRS) alerts in an integrated, multi-disciplinary approach to HF management. Methods: We undertook a prospective, single centre outcome study of patients implanted with an HFRS-enabled Medtronic CIED who generated a “high risk” alert between November 2018 and November 2020. All patients generating a “high risk” HFRS alert were managed within an integrated HF pathway. Alerts were shared with local HF teams, prompting patient contact and appropriate intervention. Outcome data on health care utilisation (HCU) and mortality were collected. A validated questionnaire was completed by the HF teams to obtain feedback. Results: 367 “high risk” alerts were noted in 188 patients. The mean patient age was 70 years and 49% had a Charlson Comorbidity Score of >6. The mean number of alerts per patient was 1.95, and 44 (23%) patients had >3 “high risk” alerts in the follow-up period. Overall, 75 (39%) patients were hospitalised within 4–6 weeks of the alert; 53 (28%) admissions were unplanned, of which 24 (13%) were for decompensated HF. A total of 33 (18%) patients died in the study period. Having three or more alerts significantly increased the risk of hospitalisation for heart failure (HR 2.5, CI 1.1–5.6, p = 0.03). The feedback on the pathway was positive. Conclusions: Patients with “high risk” alerts are co-morbid and have significant HCU. An integrated approach can facilitate timely risk stratification and intervention. Intervention in these patients is not limited to HF alone and provides the opportunity for holistic management of this complex cohort.
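The hazard ratio reported above can be sanity-checked from its confidence interval alone: on the log scale a 95% CI spans 2 × 1.96 standard errors, which lets one recover the standard error and the Wald p-value. A minimal sketch using only the quoted figures (HR 2.5, CI 1.1–5.6):

```python
import math

hr, lo, hi = 2.5, 1.1, 5.6                        # values quoted in the abstract
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE of log(HR) implied by the CI
z = math.log(hr) / se                             # Wald z statistic

# Two-sided p-value from the normal CDF: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(f"SE(log HR) = {se:.3f}, z = {z:.2f}, p = {p:.3f}")  # p ~ 0.03, matching the abstract
```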
5
Should I stay or should I go? An exploration of the decision-making behavior of acute cardiac patients during the COVID-19 pandemic. Heart Lung 2021; 52:16-21. [PMID: 34823051 PMCID: PMC8606948 DOI: 10.1016/j.hrtlng.2021.07.018] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 06/24/2021] [Revised: 07/23/2021] [Accepted: 07/25/2021] [Indexed: 11/10/2022]
Abstract
Background: During the SARS-CoV-2 (COVID-19) pandemic, efforts to reduce virus transmission resulted in non-emergency patients being deterred from seeking help, and the number of patients presenting with acute cardiac conditions fell significantly. Objectives: To explore the decision-making process of patients and their families during an acute cardiac event, and the factors that influence that process. Methods: A qualitative research design was employed, using purposive sampling of patients who experienced an acute cardiac event during the social containment mandates. Semi-structured interviews were conducted, with thematic analysis of interview transcripts. Results: Twenty-five participants were recruited from three UK hospitals. Themes identified were reliance on an informal support network, lack of awareness of cardiac symptoms leading to delayed help-seeking, and an indirect COVID-19 effect (e.g. avoiding treatment). Conclusions: These results highlight the need for informed public health messages, targeting patients and their support networks, that allow those in need of treatment to access care.
6
Correction to: Validation of a novel patient reported tool to assess the impact of treatment in erythropoietic protoporphyria: the EPP-QoL. J Patient Rep Outcomes 2021; 5:73. [PMID: 34396462 PMCID: PMC8364892 DOI: 10.1186/s41687-021-00348-4] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Indexed: 11/10/2022]
7
Validation of a novel patient reported tool to assess the impact of treatment in erythropoietic protoporphyria: the EPP-QoL. J Patient Rep Outcomes 2021; 5:65. [PMID: 34342778 PMCID: PMC8333176 DOI: 10.1186/s41687-021-00345-7] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Received: 11/09/2020] [Accepted: 07/15/2021] [Indexed: 01/01/2023]
Abstract
BACKGROUND A novel treatment has been developed for erythropoietic protoporphyria (EPP) (a rare condition that leaves patients highly sensitive to light). To fully understand the burden of EPP and the benefit of treatment, a novel patient reported outcome (PRO) measure was developed called the EPP-QoL. This report describes work to support the validation of this measure. METHODS Secondary analysis of trial data was undertaken. These analyses explored the underlying factor structure of the measure. This supported the deletion of some items. Further work then explored the reliability of these factors, their construct validity and estimates of meaningful change. RESULTS The factor analyses indicated that the items could be summarised in terms of two factors. One of these was labelled EPP Symptoms and the other EPP Wellbeing, based on the items included in the domain. EPP Symptoms had evidence to support its reliability and validity. EPP Wellbeing had poor psychometric properties. CONCLUSIONS Based on the analysis it was recommended to drop the EPP Wellbeing domain (and associated items). EPP Symptoms, despite limitations in the development of items, showed evidence of validity. This work is consistent with the recommendations of a task force that provided recommendations regarding the development, modification and use of PROs in rare diseases.
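The exploratory factor analysis described above can be illustrated in a few lines. The sketch below is generic and uses simulated questionnaire data (the EPP-QoL item-level data are not public); the two simulated traits loosely mirror the "Symptoms" and "Wellbeing" factors, and all counts and names are hypothetical.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_patients, n_items = 200, 8

# Two latent traits (e.g. "symptoms", "wellbeing") drive blocks of 4 items each.
latent = rng.normal(size=(n_patients, 2))
loadings = np.zeros((2, n_items))
loadings[0, :4] = 0.9          # items 1-4 load on the first trait
loadings[1, 4:] = 0.9          # items 5-8 load on the second trait
items = latent @ loadings + 0.3 * rng.normal(size=(n_patients, n_items))

fa = FactorAnalysis(n_components=2).fit(items)
print(np.round(fa.components_, 2))   # estimated loadings (identified only up to rotation)
```

In practice the retained factors would then be assessed for reliability and construct validity, as the authors describe.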
8
Submuscular reburial as an alternative to lead extraction in high risk patients. Europace 2021. [DOI: 10.1093/europace/euab116.485] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/13/2022]
Abstract
Funding Acknowledgements
Type of funding sources: None.
Background
The Heart Rhythm Society (HRS) and European Heart Rhythm Association (EHRA) consensus states that complete extraction is recommended for all patients with definite cardiac implantable electronic device (CIED) infection. Although complete removal of hardware is the best way to manage infection, lead extraction is a complex procedure with significant risk. As patient age and complexity increase, so too does extraction risk. In very high-risk cases conservative management is cited as an option, though little is known about its outcomes.
Purpose
We are a high-volume tertiary centre serving a population of 2 million, where two experienced operators have performed around 65 extraction procedures per year over the past 10 years. We report our experience of device reburial as initial management of CIED pre-erosion and erosion in cases deemed too high risk for extraction.
Method
We retrospectively reviewed all reburial procedures undertaken over 9 years. Patient and lead factors influencing decisions were assessed. Information on number of leads, dwell time, prior procedure, infective status and comorbidity was collated. The outcomes included morbidity, defined by repeat procedure (revision and/or extraction) and mortality.
Results
86 patients underwent 96 procedures between March 2013 and August 2020. All patients undergoing device reburial were included; 55.8% were male and the mean age was 73 years. 21 patients died, 7 of them within 12 months of the index reburial procedure. The mean follow-up period was 39 months (range 5–90). 65.1% of patients had had a procedure (de novo implant, upgrade or replacement) within the 12 months prior to revision.
We reviewed patients in 2 subgroups based on revision indication – erosion and pre-erosion. Erosion was defined as externalised lead/device. Pre-erosion was defined as superficial device with skin tethering but no exposure. The former is a definite indication for lead extraction, the latter a relative indication. All in the pre-erosion group were systemically well with no infection evident. One patient with erosion had a positive blood culture.
The mean age in the erosion group was 85 years, with a mean Clinical Frailty Score (CFS) of 4.98 and a lead dwell time of 17.87 years, compared with 68 years, a CFS of 3.98 and a dwell time of 8.14 years in the pre-erosion group. Patients in this cohort with an eroded device were deemed too high risk to undergo transvenous lead extraction.
A higher proportion of patients presenting with erosion died within 12 months of the index reburial procedure (16.67% vs 4.84%). Of patients undergoing reburial as first-line management, 21% with an eroded device and 11% with a pre-eroding device required subsequent extraction.
Conclusion
Our 9-year data suggest that less invasive intervention is a valid option in high-risk groups (older age, frailty, long lead dwell time), with an acceptable incidence of reintervention and/or extraction. These data can help guide informed consent in the future.
9
CIED guided HF management: a prospective cohort study. Europace 2021. [DOI: 10.1093/europace/euab116.473] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/15/2022]
Abstract
Funding Acknowledgements
Type of funding sources: Public hospital(s). Main funding source(s): Liverpool Clinical Commissioning Group
Background
Heart failure (HF) is associated with significant morbidity and mortality. (1) Cardiac Implantable Electronic Devices (CIED) generated Heart Failure Risk Score (HFRS) alerts may guide management in this complex cohort and help direct resources to appropriate patients. (2)
Aim
To develop and evaluate an integrated, multidisciplinary approach to HF management for patients with CIED by sharing HFRS alerts directly with the HF teams.
Methods
We undertook a prospective, single centre cohort study of patients who generated high-risk HFRS alerts. These alerts were shared with the community HF teams responsible for the routine care of patients, prompting patient contact and appropriate intervention by the team. The impact of the pathway was evaluated by reviewing outcomes, including hospitalisation and clinical intervention within 4–6 weeks of the alert and mortality during the follow-up period. Ongoing education was provided to help teams deal with alerts. A validated user questionnaire was completed by the stakeholders to obtain user feedback.
Results
365 "high risk" alerts were noted in 188 patients over a 2-year period (November 2018 – November 2020). The mean number of alerts per patient was 1.9, and 44 (23%) patients had >3 "high risk" alerts in the follow-up period. Having three or more alerts significantly increased the risk of hospitalisation for heart failure (HR 2.5, CI 1.1–5.6, p = 0.03) but not mortality (HR 2.1, CI 0.6–7.2, p = 0.23). Overall, 75 (39%) patients were hospitalised within 4–6 weeks of the alert; 53 (28%) of these admissions were unplanned, of which 24 (13%) were for decompensated HF. A further 24 (13%) had planned admissions for care to improve therapy (AV node ablation, device and lead replacement) or reduce morbidity (LA appendage occluder, IV iron therapy). 33 (18%) patients died in the follow-up period, 15 (8%) received therapy from the device, and 18 (10%) underwent deactivation of ICD therapy.
Contact was established with 176 (94%) patients, and alerts were actioned appropriately. 55 patients reported being asymptomatic, and in 45 the trends were improving, so no further clinical action was taken. 76 patients were referred onward for further management: 32 to a cardiologist, 20 to primary care, 13 to community HF teams and 11 to palliative care. 23 patients had medication changes instituted. The feedback on the pathway was positive.
Conclusions
An integrated approach to HF for patients with CIEDs in situ can facilitate timely risk stratification and intervention in this cohort of patients and potentially reduce unplanned health care utilisation. Intervention in these patients is not limited to HF alone and provides the opportunity for holistic management of this complex cohort.
10
Impact of Coronavirus Disease 2019 (COVID-19) Outbreak on Acute Admissions at the Emergency and Cardiology Departments Across Europe. Am J Med 2021; 134:482-489. [PMID: 33010226 PMCID: PMC7526639 DOI: 10.1016/j.amjmed.2020.08.043] [Citation(s) in RCA: 46] [Impact Index Per Article: 15.3] [Received: 07/30/2020] [Revised: 08/17/2020] [Accepted: 08/18/2020] [Indexed: 02/06/2023]
Abstract
PURPOSE We evaluated whether the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic was associated with changes in the pattern of acute cardiovascular admissions across European centers. METHODS We set up a multicenter, multinational, pan-European observational registry in 15 centers from 12 countries. All consecutive acute admissions to emergency departments and cardiology departments throughout a 1-month period during the COVID-19 outbreak were compared with an equivalent 1-month period in 2019. The acute admissions to cardiology departments were classified into 5 major categories: acute coronary syndrome, acute heart failure, arrhythmia, pulmonary embolism, and other. RESULTS Data from 54,331 patients were collected and analyzed. Nine centers provided data on acute admissions to emergency departments comprising 50,384 patients: 20,226 in 2020 compared with 30,158 in 2019 (incidence rate ratio [IRR] with 95% confidence interval [95% CI]: 0.66 [0.58-0.76]). The risk of death at the emergency departments was higher in 2020 compared to 2019 (odds ratio [OR] with 95% CI: 4.1 [3.0-5.8], P < 0.0001). All 15 centers provided data on acute cardiology department admissions: 3007 patients in 2020 and 4452 in 2019; IRR (95% CI): 0.68 (0.64-0.71). In 2020, there were fewer admissions, with IRR (95% CI): acute coronary syndrome: 0.68 (0.63-0.73); acute heart failure: 0.65 (0.58-0.74); arrhythmia: 0.66 (0.60-0.72); and other: 0.68 (0.62-0.76). We found a relatively higher percentage of pulmonary embolism admissions in 2020: odds ratio (95% CI): 1.5 (1.1-2.1), P = 0.02. Among patients with acute coronary syndrome, there were fewer admissions with unstable angina: 0.79 (0.66-0.94); non-ST-segment elevation myocardial infarction: 0.56 (0.50-0.64); and ST-segment elevation myocardial infarction: 0.78 (0.68-0.89). CONCLUSION In the European centers during the COVID-19 outbreak, there were fewer acute cardiovascular admissions. Fewer patients were also admitted to the emergency departments, with a 4-times-higher risk of death at the emergency departments.
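The incidence rate ratios above can be approximated from the raw counts with a crude Poisson comparison, where the standard error of log(IRR) is sqrt(1/a + 1/b). This sketch ignores any adjustment for center effects, but for the cardiology-department totals quoted above it reproduces the published figure:

```python
import math

a, b = 3007, 4452                     # cardiology admissions: 2020 vs. 2019 (from the abstract)
irr = a / b
se = math.sqrt(1 / a + 1 / b)         # crude Poisson SE of log(IRR)
lo = math.exp(math.log(irr) - 1.96 * se)
hi = math.exp(math.log(irr) + 1.96 * se)
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # 0.68 (0.64-0.71), as reported
```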
11
1261 Geographical variations in the incidence of CIED infection and infection prevention strategies: Update from the global WRAP-IT study. Europace 2020. [DOI: 10.1093/europace/euaa162.364] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/14/2022]
Abstract
Funding Acknowledgements
Medtronic, Inc.
Introduction
Cardiac Implantable Electronic Device (CIED) infections lead to significant morbidity, mortality, and use of health care resources. There is variation in infection prevention strategies among centers, and it is not clear whether there is also variation in infection rates across different geographies. Recently, WRAP-IT, the largest global randomized trial to evaluate an infection-reduction strategy, randomized 6,983 patients to receive an antibacterial envelope (treatment) or no envelope (control). The results demonstrated a significant reduction in major CIED infection with the TYRX antibiotic envelope (12-month infection rate for envelope vs. control: 0.7% and 1.2%, respectively; HR 0.60; 95% CI 0.36 to 0.98; P = 0.04). The purpose of this analysis is to assess geographical variations in patient characteristics, procedural routines, and infection rates.
Methods
The WRAP-IT study enrolled patients undergoing a CIED pocket revision, generator replacement, or system upgrade or an initial implantation of a cardiac resynchronization therapy defibrillator and randomized them to receive the envelope or not, in addition to mandated pre-procedure intravenous antibiotic prophylaxis. To assess geographical variations in infection rates, the control group (per protocol) baseline demographics and procedural characteristics were identified. Major infection was defined as CIED infections resulting in system extraction or revision, long-term antibiotic therapy with infection recurrence, or death.
Results
A total of 3429 control patients were evaluated and followed for a mean of 20.9 ± 8.3 months: 2530 patients from 123 centers in North America, 777 patients from 46 centers in Europe, and 122 patients from 11 centers in Asia/South America. The 24-month Kaplan-Meier major infection rates were 1.2% in North America (30 pts), 2.5% in Europe (16 pts), and 4.3% in Asia/South America (5 pts) (see Figure). These geographical variations in the incidence of major CIED infections were significant (overall P = 0.008, univariate). There were differences in baseline patient characteristics, including age, sex, medication use, NYHA Class, and number of previous devices across geographies. Differences also included procedural characteristics, such as device type, use of pocket wash, skin preparation, pre-operative antibiotic drug use, and procedure time.
Conclusion
Major CIED infection rates vary significantly across geographies. The effect of patient demographics and procedural characteristics on these findings will be assessed and presented at EHRA. Insights into the geographical variability of CIED infections are important to mitigate infection risk and reduce morbidity and cost.
Abstract Figure. Major CIED Infection Rate by Geography
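The geographical comparison can be roughed out from the quoted counts with a chi-square test on the 3×2 table of infections vs. no infections. This is only an approximation of the trial's univariate P = 0.008, which comes from time-to-event (Kaplan-Meier) methods that account for differing follow-up:

```python
from scipy.stats import chi2_contingency

# Major infections vs. no infection per region, raw counts from the abstract.
table = [[30, 2530 - 30],   # North America
         [16, 777 - 16],    # Europe
         [5, 122 - 5]]      # Asia/South America

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f} (df = {dof}), p = {p:.3f}")  # p < 0.05: same signal as the trial
```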
12
Combined action observation and motor imagery facilitates visuomotor adaptation in children with developmental coordination disorder. Research in Developmental Disabilities 2020; 98:103570. [PMID: 31918039 DOI: 10.1016/j.ridd.2019.103570] [Citation(s) in RCA: 16] [Impact Index Per Article: 4.0] [Received: 08/29/2019] [Revised: 11/28/2019] [Accepted: 12/30/2019] [Indexed: 06/10/2023]
Abstract
The internal modelling deficit (IMD) hypothesis suggests that motor control issues associated with Developmental Coordination Disorder (DCD) are the result of impaired predictive motor control. In this study, we examined the benefits of a combined action observation and motor imagery (AO + MI) intervention designed to alleviate deficits in internal modelling and improve eye-hand coordination during a visuomotor rotation task. Twenty children with DCD were randomly assigned to either an AO + MI group (who watched a video of a performer completing the task whilst simultaneously imagining the kinaesthetic sensations associated with action execution) or a control group (who watched unrelated videos involving no motor content). Each group then attempted to learn a 90° visuomotor rotation while measurements of completion time, eye-movement behaviour and movement kinematics were recorded. As predicted, after training, the AO + MI group exhibited quicker completion times, more target-focused eye-movement behaviour and smoother movement kinematics compared to the control group. No significant after-effects were present. These results offer further support for the IMD hypothesis and suggest that AO + MI interventions may help to alleviate such deficits and improve motor performance in children with DCD.
13
2405 When do cardiac rhythm management device complications occur? Eur Heart J 2019. [DOI: 10.1093/eurheartj/ehz748.0158] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/14/2022]
Abstract
Background
The rate of implantation of cardiac rhythm management devices continues to increase, but it is difficult to compare the complication rates of routine cardiac implantable electronic device procedures because the follow-up durations published in the literature vary widely (peri-procedure to 3 years).
Purpose
We demonstrate the appropriate follow-up duration for a complication to be attributed to an index procedure by analysing the complications within the first year of routine cardiac implantable electronic device procedures (primary implant, generator replacement, lead revision only, generator and lead revision, system upgrade and reburial) and the timing of complication interventions.
Methods
A retrospective database was constructed of all de novo CIED operations performed at a tertiary cardio-thoracic centre between April 2008 and March 2016. Procedures were identified from theatre logbooks, with demographic and procedural data extracted from contemporaneously maintained health records (paper and electronic). Objective complications were defined as follows: 1. Any pneumothorax identified on chest x-ray; 2. Any pericardial effusion identified on transthoracic echocardiogram performed post procedure due to intra-operative clinical concern; 3. Haematoma requiring surgical evacuation; 4. Device pocket revision or system reburial; 5. Lead intervention requiring repositioning or placement of a new lead; and 6. System explant/extraction for any indication. Post-operative complications were identified through theatre logbooks and cross-referenced with 3 contemporaneously maintained records (clinical health records, TOMCAT electronic database, audit department archive). All data collection was performed by a single investigator, independent of the procedures performed and operating physicians.
Results
10,125 procedures were reviewed: 6,583 primary implant procedures, 2,170 generator replacements, 382 lead revisions, 253 combined generator and lead interventions, 575 system upgrades and 162 reburial procedures. The procedures involved 3,403 female and 6,722 male patients, with a median age of 73 years (±13.7). 2,303 procedures were acute and 7,822 elective. The complication rates were 4.3% for primary implants, 4.0% for generator replacements, 12.6% for lead revisions, 13.0% for combined generator and lead interventions, 6.9% for system upgrades and 22.8% for reburial procedures. The timing of complication intervention varied significantly (Figure 1).
Timing of Complication Intervention
Conclusion
Complications within the first year of any cardiac implantable electronic device intervention are common, particularly after system reburials, lead revisions and combined generator and lead procedures. 10% of complications after generator replacements, system upgrades or reburial procedures occurred between 36 and 52 weeks. For lead and combined generator and lead procedures this figure rose to approximately 15%.
14
32 A PATIENTS CHARTER TO IMPROVE MEDICATION ADMINISTRATION FOR PATIENTS IN CARE HOMES: IMPLEMENTATION AND PILOT. Age Ageing 2019. [DOI: 10.1093/ageing/afz055.32] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Indexed: 11/14/2022]
|
15
|
Insertion of miniaturized cardiac monitors outside the catheter operating room: experience and practical advice. Europace 2018; 19:1624-1629. [PMID: 28340242 PMCID: PMC5834127 DOI: 10.1093/europace/euw304] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2016] [Accepted: 08/30/2016] [Indexed: 12/12/2022] Open
Abstract
Minor surgical procedures are increasingly being performed as outpatient procedures in settings outside hospital operating rooms (ORs). In electrophysiology, the recent miniaturization of insertable cardiac monitors (ICMs) has enabled routine insertion of the device as a minimally invasive procedure without the need for a catheter OR. However, a shift to office-based environments for minor surgical procedures raises some concerns, particularly with respect to patient- and procedure-related safety in the new setting. In the present document, the authors provide practical advice on the facilities, practices, and adaptations necessary when performing ICM insertions in office settings, based on available recommendations as well as their own experience with the novel Reveal LINQ ICM. The main differences from in-hospital implant settings are simplified requirements for the room, equipment, and insertion procedure, while ensuring and maintaining an adequate, sterile environment. Patient selection is important: certain groups of patients (e.g. those at increased risk of bleeding or very frail elderly individuals) should still be treated in the catheter OR. Insertion in alternative positions, as is sometimes performed for cosmetic reasons, should be referred to dedicated hospitals. Quality assurance and internal quality control are critical in the new procedural landscape, and it is important not to trivialize minor surgical procedures. Operators' sharing of experiences and lessons learned, e.g. in the form of registries, should be encouraged.
Collapse
|
16
|
Abstract
Aims Remote management of heart failure using implantable electronic devices (REM-HF) aimed to assess the clinical and cost-effectiveness of remote monitoring (RM) of heart failure in patients with cardiac implantable electronic devices (CIEDs). Methods and results Between 29 September 2011 and 31 March 2014, we randomly assigned 1650 patients with heart failure and a CIED to active RM or usual care (UC). The active RM pathway included formalized remote follow-up protocols, and UC was standard practice in nine recruiting centres in England. The primary endpoint in the time-to-event analysis was the first event of death from any cause or unplanned hospitalization for cardiovascular reasons. Secondary endpoints included death from any cause, death from cardiovascular reasons, death from cardiovascular reasons and unplanned cardiovascular hospitalization, unplanned cardiovascular hospitalization, and unplanned hospitalization. REM-HF is registered with ISRCTN (96536028). The mean age of the population was 70 years (range 23–98); 86% were male. Patients were followed for a median of 2.8 years (range 0–4.3 years), with follow-up completing on 31 January 2016. Patient adherence was high, with a drop-out rate of 4.3% over the course of the study. The incidence of the primary endpoint did not differ significantly between the active RM and UC groups, occurring in 42.4% and 40.8% of patients, respectively [hazard ratio 1.01; 95% confidence interval (CI) 0.87–1.18; P = 0.87]. There were no significant differences between the two groups with respect to any of the secondary endpoints or the time to the primary endpoint components. Conclusion Among patients with heart failure and a CIED, RM using weekly downloads and a formalized follow-up approach does not improve outcomes.
Collapse
|
17
|
Abstract
A variety of neuropharmacological agents were tested to elucidate how chlorpromazine influenced an endotoxin-induced reaction. The results obtained, particularly with beta-adrenergic blocking agents, reserpine and fusaric acid, suggested that the primary locus of chlorpromazine's action was peripheral beta-adrenergic receptor blockade. Such a locus is compatible with the low doses of propranolol that suppress the reaction, and with the successful treatment of shock with dopamine.
Collapse
|
18
|
Cost-effectiveness of TYRX absorbable antibacterial envelope for prevention of cardiovascular implantable electronic device infection. J Med Econ 2018; 21:294-300. [PMID: 29171319 DOI: 10.1080/13696998.2017.1409227] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
AIMS Infection is a major complication of cardiovascular implantable electronic device (CIED) therapy that usually requires device extraction and is associated with increased morbidity and mortality. The TYRX Antibacterial Envelope is a polypropylene mesh that stabilizes the CIED and elutes minocycline and rifampin to reduce the risk of post-operative infection. METHODS A decision tree was developed to assess the cost-effectiveness of TYRX vs standard of care (SOC) following implantation of four CIED device types. The model was parameterized from a UK National Health Service perspective. Probabilities were derived from the literature. Resource use included drug acquisition and administration, hospitalization, adverse events, device extraction, and replacement. Incremental cost-effectiveness ratios (ICERs) were calculated from costs and quality-adjusted life-years (QALYs). RESULTS Over a 12-month time horizon, TYRX was less costly and more effective than SOC when utilized in patients with an ICD or CRT-D. TYRX was associated with ICERs of £46,548 and £21,768 per QALY gained in patients with an IPG or CRT-P, respectively. TYRX was cost-effective at a £30,000 threshold at baseline probabilities of infection exceeding 1.65% (CRT-D), 1.95% (CRT-P), 1.87% (IPG), and 1.38% (ICD). LIMITATIONS AND CONCLUSIONS Device-specific infection rates for high-risk patients were not available in the literature and were therefore not used in this analysis, potentially underestimating the impact of TYRX with certain devices. Nevertheless, TYRX is associated with a reduction in post-operative infection risk relative to SOC, resulting in reduced healthcare resource utilization at an initial cost. The ICERs are below the accepted willingness-to-pay thresholds used by UK decision-makers. TYRX, therefore, represents a cost-effective prevention option for CIED patients at high risk of post-operative infection.
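The cost-effectiveness logic in this abstract reduces to a simple ratio: the incremental cost-effectiveness ratio (ICER) is the extra cost divided by the extra QALYs gained, compared against a willingness-to-pay threshold (£30,000 per QALY here). A minimal sketch, using invented costs and QALYs rather than the study's model inputs:

```python
WTP_THRESHOLD_GBP = 30_000  # willingness-to-pay threshold used in the abstract

def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    """Incremental cost-effectiveness ratio: delta-cost per delta-QALY."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

def is_cost_effective(cost_new, cost_ref, qaly_new, qaly_ref,
                      threshold=WTP_THRESHOLD_GBP):
    """Cost-effective if dominant (cheaper and better) or ICER <= threshold."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_cost <= 0 and d_qaly > 0:
        return True   # dominant: less costly and more effective
    if d_qaly <= 0:
        return False  # no QALY gain to pay for
    return d_cost / d_qaly <= threshold

# Hypothetical example: £500 extra cost for 0.02 extra QALYs -> £25,000/QALY
print(icer(10_500, 10_000, 1.02, 1.00))
```

The "less costly and more effective" branch corresponds to the dominant ICD/CRT-D result reported above, while the IPG and CRT-P ICERs fall in the positive-ratio branch and clear the threshold.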
Collapse
|
19
|
Reply to: Brough CEP, Haycox A. Resource Use In Rectifying Pacemaker Complications. Journal of Medical Economics 2018. doi: 10.1080/13696998.2017.1423075. J Med Econ 2018; 21:310-311. [PMID: 29295629 DOI: 10.1080/13696998.2017.1423076] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
|
20
|
Abstract
AIM To estimate health resource utilization (HRU) associated with the management of pacemaker complications in various healthcare systems. METHODS Electrophysiologists (EPs) from four geographical regions (Western Europe, Australia, Japan, and North America) were invited to participate. Survey questions focused on HRU in the management of three chronic pacemaker complications (i.e. pacemaker infections requiring extraction, lead fractures/insulation breaches requiring replacement, and upper extremity deep venous thrombosis [DVT]). Panelists completed a maximum of two web-based surveys (iterative rounds). Means, medians, and interquartile ranges were calculated and used to establish consensus. RESULTS Overall, 32 and 29 panelists participated in the first and second rounds of the Delphi panel, respectively. Consensus was reached on treatment and HRU associated with a typical pacemaker implantation and its complications. HRU was similar across regions, except for Japan, where panelists reported the longest duration of hospital stay in all scenarios. Infections were the most resource-intensive complications, characterized by 9.6–13.5 days of intravenous antibiotics for pocket infections and 21.3–29.2 days for lead infections, along with laboratory and diagnostic tests and system extraction and replacement procedures. DVT, on the other hand, was the least resource-intensive complication. LIMITATIONS The results of the panel represent the views of the respondents who participated and may not be generalizable outside of this panel. The surveys were limited in scope and, therefore, did not include questions on the management of acute complications (e.g. hematoma, pneumothorax). CONCLUSIONS The Delphi technique provided a reliable and efficient approach to estimating resource utilization associated with chronic pacemaker complications. Estimates from the Delphi panel can be used to generate costs of pacemaker complications in various regions.
Collapse
|
21
|
Novel genetic factors involved in resistance to Bacillus thuringiensis in Plutella xylostella. INSECT MOLECULAR BIOLOGY 2015; 24:589-600. [PMID: 26335439 DOI: 10.1111/imb.12186] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
The widespread and sustainable exploitation of the entomopathogen Bacillus thuringiensis (Bt) in pest control is threatened by the evolution of resistance. Although resistance is often associated with loss of binding of the Bt toxins to the insect midgut cells, other factors have been implicated. Here we used suppressive subtractive hybridization and gene expression suppression to identify additional molecular components involved in Bt resistance in Plutella xylostella. We isolated transcripts from genes that were differentially expressed in the midgut of larvae from a resistant population, following ingestion of a Bt kurstaki HD1 strain-based commercial formulation (DiPel), relative to a genetically similar susceptible population. Quantitative real-time polymerase chain reaction (RT-PCR) analysis confirmed the differential basal expression of a subset of these genes. Gene expression suppression of three of these genes (P. xylostella cyclin-dependent kinase 5 regulatory subunit associated protein 1-like 1, stromal cell-derived factor 2-like 1 and hatching enzyme-like 1) significantly increased the pathogenicity of HD1 to the resistant population. In an attempt to link the multitude of factors reportedly influencing resistance to Bt with the well-characterized loss of toxin binding, we also considered Bt-resistance models in P. xylostella and other insects.
Collapse
|
22
|
Abstract
OBJECTIVE To pilot and feasibility-test supervised final-year undergraduate pharmacy student-led medication reviews for patients with diabetes, to enable definitive trial design. METHOD Third-year pharmacy students were recruited from one UK School of Pharmacy and trained to review patients' medical records and provide face-to-face consultations under supervision while situated within the patients' medical practice. Patients with type 2 diabetes were recruited by postal invitation letter from their medical practice and randomised via an automated system to intervention or usual care. Diabetes-related clinical data, quality of life, and patient-reported beliefs, adherence and satisfaction with medicines information were collected with validated tools at baseline and 6 months post-intervention. The process for collecting resource utilisation data was tested. Stakeholder meetings were held before and after the intervention to develop the study design and learn from its implementation. Recruitment and attrition rates were determined, along with the quality of the outcome data. Power calculations for a definitive trial were performed on the different outcome measures to identify the most appropriate primary outcome measure. RESULTS 792 patients were identified as eligible from five medical practices. 133 (16.8%) were recruited and randomised to intervention (n=66) or usual care (n=67). 32 students provided the complete intervention to 58 patients. Initial data analysis showed potential for impact in the right direction for some of the outcomes measured, including glycated haemoglobin, quality of life and patient satisfaction with information about medicines. The intervention was found to be feasible and acceptable to patients. The pilot and feasibility study enabled the design of a future full randomised controlled trial. CONCLUSIONS Student and patient recruitment are possible. The intervention was well received and demonstrated some potential benefits. While the intervention was relatively inexpensive and provided an experiential learning opportunity for pharmacy students, its cost-effectiveness remains to be determined. TRIAL REGISTRATION NUMBER ISRCTN26445805; Results.
Collapse
|
23
|
The strategies to reduce iron deficiency in blood donors randomized trial: design, enrolment and early retention. Vox Sang 2014; 108:178-85. [PMID: 25469720 DOI: 10.1111/vox.12210] [Citation(s) in RCA: 22] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2014] [Revised: 09/08/2014] [Accepted: 09/16/2014] [Indexed: 12/13/2022]
Abstract
BACKGROUND AND OBJECTIVES Repeated blood donation produces iron deficiency. Changes in dietary iron intake do not prevent donation-induced iron deficiency, but prolonging the interdonation interval or using oral iron supplements can mitigate it. The most effective operational methods for reducing iron deficiency in donors are unknown. MATERIALS AND METHODS 'Strategies To Reduce Iron Deficiency' (STRIDE) was a two-year, randomized, placebo-controlled study in blood donors. 692 donors were randomized into one of two educational groups or one of three interventional groups. Donors randomized to the educational groups received letters either thanking them for donating or suggesting iron supplements or delayed donation if they had low ferritin. Donors randomized to the interventional groups received placebo, 19-mg or 38-mg iron pills. RESULTS Iron-deficient erythropoiesis was present in 52·7% of males and 74·6% of females at enrolment. Adverse events within 60 days of enrolment were primarily mild gastrointestinal symptoms (64%). De-enrolment within 60 days was more common in the interventional groups than in the educational groups (P = 0·002), but not more common in those receiving iron than placebo (P = 0·68). CONCLUSION The prevalence of iron-deficient erythropoiesis in donors enrolled in the STRIDE study is comparable to that in previously described cohorts of regular blood donors. De-enrolment within 60 days was higher for donors receiving tablets, although no more common in donors receiving iron than placebo.
Collapse
|
24
|
The influence of the HPG axis on stress response and depressive-like behaviour in a transgenic mouse model of Huntington's disease. Exp Neurol 2014; 263:63-71. [PMID: 25246229 DOI: 10.1016/j.expneurol.2014.09.009] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2014] [Revised: 07/19/2014] [Accepted: 09/09/2014] [Indexed: 12/13/2022]
Abstract
Huntington's disease (HD) is an autosomal dominant, neurodegenerative disease caused by a CAG tandem repeat mutation encoding a polyglutamine tract expansion in the huntingtin protein. Depression is among the most common affective symptoms in HD, but the pathophysiology is unclear. We have previously discovered sexually dimorphic depressive-like behaviours in the R6/1 transgenic mouse model of HD at a pre-motor symptomatic age; interestingly, only female R6/1 mice display this phenotype. Sexual dimorphism has not been explored in the human HD population, despite the well-established knowledge that the clinical depression rate in females is almost twice that of males. Female susceptibility suggests a role for sex hormones, which have been shown to modulate stress response. There is evidence suggesting that the gonads are adversely affected in HD patients, which could alter sex hormone levels. The present study examined the role sex hormones play in the stress response in the R6/1 mouse model of HD, in particular their modulatory effect on the hypothalamic-pituitary-adrenal (HPA) axis and depressive-like behaviour. We found that the gonads of female R6/1 mice show atrophy at an early age. Expression levels of gonadotropin-releasing hormone (GnRH) were decreased in the hypothalamus of female HD mice, relative to wild-type female littermates, as were serum testosterone levels. Female serum estradiol levels were not significantly changed. Gonadectomy surgery reduced HPA-axis activity in female mice but had no effect on behavioural phenotypes. Furthermore, expression of the oestrogen receptor (ER) α gene was found to be higher in the adrenal cells of female HD mice. Finally, administration of the ERβ agonist diarylpropionitrile (DPN) rescued depressive-like behaviour in the female HD mice. Our findings provide new insight into the pathogenesis of sexually dimorphic neuroendocrine, physiological and behavioural endophenotypes in HD, and suggest a new avenue for therapeutic intervention.
Collapse
|
25
|
17 De Novo Cardiac Rhythm Management Device Complications at 1-Year – Introducing Mortality Censoring. BRITISH HEART JOURNAL 2014. [DOI: 10.1136/heartjnl-2014-306118.17] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
|
26
|
Knowledge of HIV testing and attitudes towards blood donation at three blood centres in Brazil. Vox Sang 2013; 106:344-53. [PMID: 24313562 DOI: 10.1111/vox.12114] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2013] [Revised: 10/08/2013] [Accepted: 10/17/2013] [Indexed: 11/28/2022]
Abstract
BACKGROUND Reducing the risk of HIV window-period transmission requires understanding of donor knowledge and attitudes related to HIV and its risk factors. STUDY DESIGN AND METHODS We conducted a survey of 7635 presenting blood donors at three Brazilian blood centres from 15 October through 20 November 2009. Participants completed a questionnaire on HIV knowledge and attitudes about blood donation. Six questions about blood testing and HIV were evaluated using maximum likelihood chi-square tests and logistic regression. Test seeking was classified into non-overlapping categories according to answers to one direct and two indirect questions. RESULTS Overall, respondents were male (64%), repeat donors (67%) and between 18 and 49 years old (91%). Nearly 60% believed blood centres use better HIV tests than other places; however, 42% were unaware of the HIV window period. Approximately 50% believed it was appropriate to donate in order to be tested for HIV, but 67% said it was not acceptable to donate with risk factors even if the blood is tested. Logistic regression found that less education, donation at the Hemope-Recife blood centre, replacement donation, and potential and self-disclosed test-seeking were associated with less HIV knowledge. CONCLUSION HIV knowledge related to blood safety remains low among Brazilian blood donors. A subset finds it appropriate to be tested at blood centres and may be unaware of the HIV window period. These donations may impose a significant risk to the safety of the blood supply. Decreasing test-seeking and changing beliefs about the appropriateness of donation by individuals with behavioural risk factors could reduce the risk of transfusing an infectious unit.
Collapse
|
27
|
081 A NOVEL STANDARD OF QUALITY EVALUATION IN BRADYARRHYTHMIA PACING. BRITISH HEART JOURNAL 2013. [DOI: 10.1136/heartjnl-2013-304019.81] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
|
28
|
Risk factors for human immunodeficiency virus infection among Brazilian blood donors: a multicentre case-control study using audio computer-assisted structured interviews. Vox Sang 2013; 105:91-9. [PMID: 23517235 DOI: 10.1111/vox.12028] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/09/2012] [Revised: 01/09/2013] [Accepted: 01/13/2013] [Indexed: 11/30/2022]
Abstract
BACKGROUND Although risk factors for HIV infection are known, it is important for blood centres to understand local epidemiology and disease transmission patterns. Current risk factors for HIV infection in blood donors in Brazil were assessed. METHODS A case-control study was conducted at large public blood centres located in four major cities between April 2009 and March 2011. Cases were persons whose donations were confirmed positive by enzyme immunoassays followed by Western blot confirmation. Audio computer-assisted structured interviews (ACASI) were completed by all cases and controls. Multivariable logistic regression was used to estimate adjusted odds ratios (AORs) and associated 95% confidence intervals (CIs). RESULTS There were 341 cases, including 47 with recently acquired infection, and 791 controls. Disclosed risk factors for both females and males were sex with an HIV-positive person [AOR 11.3, 95% CI (4.1, 31.7)] and being an IVDU or the sexual partner of an IVDU [AOR 4.65 (1.8, 11.7)]. For female blood donors, additional risk factors were having male sex partners who are also MSM [AOR 13.5 (3.1, 59.8)] and having unprotected sex with multiple sexual partners [AOR 5.19 (2.1, 12.9)]. The primary risk factor for male blood donors was MSM activity [AOR 21.6 (8.8, 52.9)]. Behaviours associated with recently acquired HIV were being an MSM or the sex partner of an MSM [AOR 13.82 (4.7, 40.3)] and IVDU [AOR 11.47 (3.0, 43.2)]. CONCLUSION Risk factors in blood donors parallel those in the general population in Brazil. The identified risk factors suggest that donor compliance with selection procedures at the participating blood centres is inadequate.
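The adjusted odds ratios above come from multivariable logistic regression. As an illustration of the underlying quantity only, a crude (unadjusted) odds ratio and its Woolf log-scale confidence interval can be computed from a 2×2 exposure table; the counts below are invented for illustration and are not data from this study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio from a 2x2 table with a Woolf (log-scale) 95% CI.

    a = exposed cases,    b = unexposed cases
    c = exposed controls, d = unexposed controls
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Invented counts: 20/10 exposed/unexposed cases, 100/200 exposed/unexposed controls
or_, lo, hi = odds_ratio_ci(20, 10, 100, 200)
print(or_, lo, hi)
```

An adjusted OR additionally conditions on covariates in the regression model, which is why the crude value generally differs from the AORs reported above.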
Collapse
|
29
|
GRP-061 Evaluation of Dose Recovery from Tablet Manipulation For Enteral Tube Administration: Abstract GRP-061 Table 1. Eur J Hosp Pharm 2013. [DOI: 10.1136/ejhpharm-2013-000276.061] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022] Open
|
30
|
Abstract
BACKGROUND On 12 May 2008, a severe earthquake struck Sichuan in China. Many people donated blood for the first time, leading us to question whether these donors might become repeat donors in the future. The return pattern of post-earthquake first-time donors (PEFTD) was compared with that of first-time donors (FTD) in a comparable period. METHODS Demographic characteristics, transfusion-transmissible infection rates and 1-year return rates were compared between 5147 PEFTD (5/13-5/19, 2008) and 3176 FTD (5/13-5/19, 2009) from five Chinese blood centres using chi-squared tests. Adjusted logistic regression was used to detect the earthquake's effect on donor return. RESULTS Post-earthquake first-time donors were more frequently aged between 26 and 45 years, male, and better educated compared with the control group. Slightly higher, but not statistically significant, rates of hepatitis B virus surface antigen (HBsAg) (0·87% vs. 0·50%, P=0·054), hepatitis C virus (HCV) (0·70% vs. 0·63%, P=0·414) and syphilis (0·9% vs. 0·7%, P=0·489) reactivity, and lower rates of human immunodeficiency virus (HIV) reactivity (0·31% vs. 0·60%, P=0·078), were detected for PEFTD. The 1-year return rate for PEFTD was significantly lower than that of the controls (8·0% vs. 13·0%, P<0·001). After adjusting for demographic factors, donation volume and sites, the PEFTD were less likely to return within 1 year than the controls (OR: 0·520; 95% CI: 0·442, 0·611). CONCLUSION Post-earthquake first-time donors may be less likely to donate again without continuing motivation strategies. Further studies on PEFTD's lack of motivation to return are needed to design recruitment strategies that convert PEFTD into repeat donors and continuously replenish the blood supply.
Collapse
|
31
|
Implementation of statin prescribing guidelines: a cost effectiveness analysis. INTERNATIONAL JOURNAL OF PHARMACY PRACTICE 2011. [DOI: 10.1111/j.2042-7174.2001.tb01080.x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
|
32
|
A Change in a Single Midgut Receptor in the Diamondback Moth (Plutella xylostella) Is Only in Part Responsible for Field Resistance to Bacillus thuringiensis subsp. kurstaki and B. thuringiensis subsp. aizawai. Appl Environ Microbiol 2010; 63:1814-9. [PMID: 16535597 PMCID: PMC1389152 DOI: 10.1128/aem.63.5.1814-1819.1997] [Citation(s) in RCA: 63] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022] Open
Abstract
A population (SERD3) of the diamondback moth (Plutella xylostella L.) with field-evolved resistance to Bacillus thuringiensis subsp. kurstaki HD-1 (Dipel) and B. thuringiensis subsp. aizawai (Florbac) was collected. Laboratory-based selection of two subpopulations of SERD3 with B. thuringiensis subsp. kurstaki (Btk-Sel) or B. thuringiensis subsp. aizawai (Bta-Sel) increased resistance to the selecting agent with little apparent cross-resistance. This result suggested the presence of independent resistance mechanisms. Reversal of resistance to B. thuringiensis subsp. kurstaki and B. thuringiensis subsp. aizawai was observed in the unselected SERD3 subpopulation. Binding to midgut brush border membrane vesicles was examined for insecticidal crystal proteins specific to B. thuringiensis subsp. kurstaki (Cry1Ac), B. thuringiensis subsp. aizawai (Cry1Ca), or both (Cry1Aa and Cry1Ab). In the unselected SERD3 subpopulation (ca. 50- and 30-fold resistance to B. thuringiensis subsp. kurstaki and B. thuringiensis subsp. aizawai), specific binding of Cry1Aa, Cry1Ac, and Cry1Ca was similar to that for a susceptible population (ROTH), but binding of Cry1Ab was minimal. The Btk-Sel (ca. 600- and 60-fold resistance to B. thuringiensis subsp. kurstaki and B. thuringiensis subsp. aizawai) and Bta-Sel (ca. 80- and 300-fold resistance to B. thuringiensis subsp. kurstaki and B. thuringiensis subsp. aizawai) subpopulations also showed reduced binding of Cry1Ab. Binding of Cry1Ca was not affected in the Bta-Sel subpopulation. The results suggest that reduced binding of Cry1Ab can partly explain resistance to B. thuringiensis subsp. kurstaki and B. thuringiensis subsp. aizawai. However, the binding of Cry1Aa, Cry1Ac, and Cry1Ca and the lack of cross-resistance between the Btk-Sel and Bta-Sel subpopulations also suggest that additional resistance mechanisms are present.
Collapse
|
33
|
Effects of host plant and genetic background on the fitness costs of resistance to Bacillus thuringiensis. Heredity (Edinb) 2010; 106:281-8. [PMID: 20517345 DOI: 10.1038/hdy.2010.65] [Citation(s) in RCA: 40] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022] Open
Abstract
Novel resistance to pathogens and pesticides is commonly associated with a fitness cost. However, measurements of the fitness costs of insecticide resistance have used diverse methods to control for genetic background and rarely assess the effects of environmental variation. Here, we explored how genetic background interacts with resource quality to affect the expression of the fitness costs associated with resistance. We used a serially backcrossed line of the diamondback moth, Plutella xylostella, resistant to the biopesticide Bacillus thuringiensis, to estimate the costs of resistance for insects feeding on two Brassica species. We found that fitness costs increased on the better-defended Brassica oleracea cultivars. These data were included in two meta-analyses of fitness cost experiments that used standardized protocols (and a common resistant insect stock) but which varied in the methodology used to control for the effects of genetic background. The meta-analysis confirmed that fitness costs were higher on the low-quality host (B. oleracea); and experimental methodology did not influence estimates of fitness costs on that plant species. In contrast, fitness costs were heterogeneous in the Brassica pekinensis studies: fitness costs in genetically homogenized lines were significantly higher than in studies using revertant insects. We hypothesize that fitness modifiers can moderate fitness costs on high-quality plants but may not affect fitness when resource quality is low.
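The meta-analyses described above pool effect sizes across fitness-cost experiments. The standard fixed-effect (inverse-variance) pooling step can be sketched as follows; the effect sizes and standard errors here are made up for illustration and are not values from the studies analysed:

```python
import math

def fixed_effect_pool(effects, std_errors):
    """Inverse-variance weighted mean effect and its standard error.

    Each study is weighted by 1/SE^2, so precise studies dominate the pool.
    """
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

# Made-up per-study effect sizes (e.g. a standardized fitness-cost measure) and SEs
effects = [0.30, 0.50, 0.40]
ses = [0.10, 0.20, 0.10]
pooled, pooled_se = fixed_effect_pool(effects, ses)
print(pooled, pooled_se)
```

The heterogeneity reported for the Brassica pekinensis studies is exactly the situation where a single pooled estimate like this is misleading and subgroup (methodology-stratified) analysis is needed.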
Collapse
|
34
|
Implantable cardioverter defibrillator: what a hospital practitioner needs to know. Eur J Intern Med 2009; 20:591-7. [PMID: 19782919 DOI: 10.1016/j.ejim.2009.06.006] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/15/2009] [Revised: 06/17/2009] [Accepted: 06/22/2009] [Indexed: 10/20/2022]
Abstract
The implantable cardioverter defibrillator (ICD) has undergone a remarkable transformation in the last three decades, both in generator size and functionality. This, coupled with improvements in lead design, allows the simplicity of defibrillator implantation to approach that of pacemakers, with outpatient placement now feasible. Nowadays, the majority of new ICD implants are performed on primary prevention grounds with device longevity of more than 7 years. In this article, we will concisely explain the evolution of this treatment, the implantation technique, arrhythmia detection and patient follow-up. In addition, we will review the relevant clinical trials as well as prescription guidelines.
Collapse
|
35
|
Abstract
The right ventricular apex (RVA) has been the elective site for placing endocardial pacing leads since 1959 when Furman described the use of the transvenous route for pacemaker implantation. This site was used because it is easily accessible, readily identified and associated with a stable position and reliable chronic pacing parameters. It was recognised, however, that pacing from the RVA did not reproduce normal ventricular conduction or contraction. With the advent of reliable active fixation leads, alternative right ventricular sites became accessible and began to be explored. In this review, the detrimental effects of RVA pacing are outlined, the right ventricular outflow tract is defined and the evidence for selective site pacing is discussed.
Collapse
|
36
|
The bunged bung--one-way valve malfunction during elective surgery. Anaesth Intensive Care 2008; 36:459-460. [PMID: 18564814] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 05/26/2023]
|
37
|
Cardiac resynchronisation therapy: evidence based benefits and patient selection. Eur J Intern Med 2008; 19:165-72. [PMID: 18395159 DOI: 10.1016/j.ejim.2007.09.012] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/20/2007] [Revised: 04/27/2007] [Accepted: 09/26/2007] [Indexed: 10/22/2022]
Abstract
Despite the improvement in pharmacologic treatment of heart failure, many patients continue to have severe persistent symptoms, and their prognosis remains poor. One of the most recent advances in heart failure management is the concept of cardiac resynchronization therapy (CRT) with right and left ventricular pacing. Large clinical trials have demonstrated morbidity and mortality benefits of CRT in patients with moderate to severe drug refractory heart failure (New York Heart Association (NYHA) functional class III or IV), and ejection fraction < or = 35% with QRS duration > or = 120 ms. Despite the documented benefits, 20-30% of patients selected to have CRT do not respond to this treatment. Echocardiography will probably play a more important role in better selecting patients with mechanical dyssynchrony who are more likely to respond to CRT. This article reviews the available evidence for CRT as well as the way to select responders to this rather invasive therapy.
|
38
|
Abstract
Cardiopulmonary exercise testing (CPET) has become an important clinical tool to evaluate exercise capacity and predict outcome in patients with heart failure and other cardiac conditions. It provides assessment of the integrative exercise responses involving the pulmonary, cardiovascular and skeletal muscle systems, which are not adequately reflected through the measurement of individual organ system function. CPET is being used increasingly in a wide spectrum of clinical applications for evaluation of undiagnosed exercise intolerance and for objective determination of functional capacity and impairment. This review focuses on the exercise physiology and physiological basis for functional exercise testing and discusses the methodology, indications, contraindications and interpretation of CPET in normal people and in patients with heart failure.
|
39
|
Dermcidin expression in hepatic cells improves survival without N-glycosylation, but requires asparagine residues. Br J Cancer 2006; 94:1663-71. [PMID: 16685272 PMCID: PMC2361319 DOI: 10.1038/sj.bjc.6603148] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022] Open
Abstract
Proteolysis-inducing factor, a cachexia-inducing tumour product, is an N-glycosylated peptide with homology to the unglycosylated neuronal survival peptide Y-P30 and a predicted product of the dermcidin gene, a pro-survival oncogene in breast cancer. We aimed to investigate whether dermcidin is pro-survival in liver cells, in which proteolysis-inducing factor induces catabolism, and to determine the role of potentially glycosylated asparagine residues in this function. Reverse cloning of proteolysis-inducing factor demonstrated ∼100% homology with the dermcidin cDNA. This cDNA was cloned into pcDNA3.1+ and both asparagine residues removed using site-directed mutagenesis. In vitro translation demonstrated signal peptide production, but no difference in molecular weight between the products of native and mutant vectors. Immunocytochemistry of HuH7 cells transiently transfected with V5-His-tagged dermcidin confirmed targeting to the secretory pathway. Stable transfection conferred protection against oxidative stress. This was abrogated by mutation of both asparagines in combination, but not by mutation of either asparagine alone. These findings suggest that dermcidin may function as an oncogene in hepatic as well as breast cells. Glycosylation does not appear to be required, but the importance of asparagine residues suggests a role for the proteolysis-inducing factor core peptide domain.
|
40
|
How do different indicators of cardiac pump function impact upon the long-term prognosis of patients with chronic heart failure? Am Heart J 2005; 150:983. [PMID: 16290976 DOI: 10.1016/j.ahj.2005.08.018] [Citation(s) in RCA: 29] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/21/2005] [Accepted: 08/30/2005] [Indexed: 11/15/2022]
Abstract
BACKGROUND The prognosis of patients with mild-moderate chronic heart failure (CHF) over a long-term follow-up period is more difficult to predict than for patients with more severe CHF in the short term. This study assessed the prognostic value of various indicators of cardiac pump function to gain insight into how different aspects of organ function impact upon prognosis. METHODS Unselected, consecutive patients with CHF (n = 219, 166 men, mean [+/-SD] age 56 +/- 13 years) who underwent symptom limited cardiopulmonary treadmill exercise testing with noninvasive estimation of cardiac output using carbon dioxide rebreathing techniques were followed up for a median period of 8.6 +/- 1.0 years in survivors. Cardiac power output (CPO) was calculated from the product of cardiac output and mean arterial pressure and cardiac reserve was estimated by subtracting resting from peak exercise CPO or cardiac output (CO). RESULTS All-cause mortality was 36% (78 deaths). Survivors had a significantly greater cardiac pumping reserve with the greatest difference seen in CPO reserve (+57%) and CO reserve (+49%) (both P < .001). Although various direct and indirect indicators of cardiac function were predictive of outcome on univariate analyses, multivariate analysis using the Cox proportional hazards model identified CO reserve to be the independent variable predictive of all-cause mortality, with a hazard ratio (95% CI) of 0.682 (0.612-0.757, P < .001) for each L/min increase in cardiac output reserve. Survival at 10 years in patients with tertiles of good, moderate, or poor cardiac output reserve was 89%, 63%, and 36.1%, respectively (P < .001). CONCLUSION In this long-term follow-up study involving a cohort of unselected ambulatory patients with mild-moderate CHF, cardiac pumping reserve measured noninvasively by cardiopulmonary exercise testing was found to be the strongest independent predictor of prognosis.
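The abstract above defines cardiac power output (CPO) as the product of cardiac output and mean arterial pressure, and cardiac reserve as peak-exercise minus resting CPO. A minimal sketch of these calculations follows; the 2.22×10⁻³ conversion constant (to express CPO in watts from L/min and mmHg) is a standard convention in the cardiac power literature, and the example haemodynamic values are illustrative assumptions, not data from this study.

```python
def cardiac_power_output(cardiac_output_l_min: float, map_mmhg: float) -> float:
    """CPO (watts) = cardiac output (L/min) x mean arterial pressure (mmHg) x K.

    K converts (L/min x mmHg) to watts; 2.22e-3 is the conventional constant.
    """
    K = 2.22e-3
    return cardiac_output_l_min * map_mmhg * K


def cardiac_reserve(rest_co: float, rest_map: float,
                    peak_co: float, peak_map: float) -> float:
    """Cardiac reserve (watts) = peak-exercise CPO minus resting CPO."""
    return (cardiac_power_output(peak_co, peak_map)
            - cardiac_power_output(rest_co, rest_map))


# Illustrative (hypothetical) values: rest 5 L/min at MAP 90 mmHg,
# peak exercise 12 L/min at MAP 105 mmHg.
reserve = cardiac_reserve(5.0, 90.0, 12.0, 105.0)
```

The same subtraction applied to cardiac output alone (peak CO minus resting CO) gives the CO reserve that the study found to be the independent predictor of all-cause mortality.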
|
41
|
Abstract
As the population ages and survival from ischaemic heart disease improves, the incidence and prevalence of congestive cardiac failure has increased dramatically. Medical treatments including ACE inhibitors, beta blockers, and aldosterone antagonists have improved the outlook for most patients. However, despite optimal medical treatment there is a significant group of patients who continue to suffer poor morbidity and mortality. Device based treatments consisting of implantable cardioverter defibrillators (ICD) and cardiac resynchronisation therapy (CRT) devices offer new modes of treatment to patients with symptomatic heart failure despite optimal medical therapy. ICDs have been shown to reduce mortality in patients with severe heart failure, while CRT leads to an improvement in functional class, quality of life scores, and physiological measures such as peak Vo(2), and reduces hospitalisations. Combination devices, which provide both ICD and CRT functions, have now been seen to provide synergistic benefits in selected patients.
|
42
|
Abstract
BACKGROUND AND OBJECTIVES Converting first-time donors to become regular donors continues to be a challenge facing blood centres. We examined whether first-time donors with frequent return in the first 12 months were more likely to become regular donors. SUBJECTS AND METHODS The donation histories of 179 409 community whole-blood donors, whose first-time donation in 1991 was negative on donor screening tests, were evaluated. Donors were categorized by the number of donations made in the 12 months after (and including) their first donation. The donor return pattern in the subsequent 6 years, and its association with first-year donation frequency and demographics, was evaluated by using logistic regression analysis. A 'regular donor' was defined as one who returned to donate in at least 4 of the 6 years of follow-up. RESULTS First-year donation frequency was significantly correlated with long-term donor return (P < 0.0001). Among those giving 1, 2, 3, 4 and > or = 5 donations in the first year, 4%, 11%, 21%, 32% and 42%, respectively, became regular donors (P < 0.0001). Similar associations between donation pattern and donor return behaviour were observed after adjusting for demographic variables (P < 0.0001). CONCLUSIONS Strategies aimed at encouraging current donors to donate more frequently during the first year may help to establish a regular donation behaviour.
|
43
|
Complementary roles of simple variables, NYHA and N-BNP, in indicating aerobic capacity and severity of heart failure. Int J Cardiol 2005; 102:279-86. [PMID: 15982497 DOI: 10.1016/j.ijcard.2004.05.054] [Citation(s) in RCA: 31] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/07/2004] [Revised: 04/19/2004] [Accepted: 05/05/2004] [Indexed: 11/29/2022]
Abstract
AIMS The extent of exercise intolerance in patients with chronic heart failure (CHF) is dependent on and representative of the severity of heart failure. However, few primary care physicians have direct access to facilities for formal exercise testing. We have therefore explored whether information readily obtainable in the community can reliably predict the functional capacity of patients. METHODS AND RESULTS Ninety-six subjects with a wide range of cardiac function (10 healthy controls and 86 CHF patients with NYHA classes I-IV, LVEF 36.9+/-15.2%) were recruited into the study and had resting plasma N-BNP and cardiopulmonary exercise testing to measure peak oxygen consumption (VO2). Significantly higher N-BNP levels were found in the CHF group (299.3 [704.8] fmol/ml, median [IQR]) compared with the healthy control group (7.2 [51.2] fmol/ml), p<0.0001. There were significant correlations between peak VO2 and N-BNP levels (R=0.64, P<0.001), peak VO2 and NYHA class (R=0.76, P=0.001), but no significant correlation was seen between peak VO2 and LVEF (R=0.0788, P=0.33). Multivariate analysis identified plasma N-BNP (P<0.0001) and NYHA class (P<0.0001) as significant independent predictors of peak VO2. Logistic modelling with NYHA class and log N-BNP to predict peak VO2<20 ml/kg/min showed that the area under the curve of receiver-operating-characteristic (ROC) curve was 0.906 (95% CI 0.844-0.968). A nomogram based on the data has been constructed to allow clinicians to estimate the likelihood of peak VO2 to be <20 ml/kg/min for given values of plasma N-BNP and NYHA class. CONCLUSIONS By combining information from a simple objective blood test (N-BNP) and a simple scoring of functional status (NYHA), a clinician can deduce the aerobic exercise capacity and indirectly the extent of cardiac dysfunction of patients with CHF.
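The logistic modelling described above (NYHA class plus log N-BNP predicting peak VO2 < 20 ml/kg/min) can be sketched as follows. The abstract reports the model form but not its fitted parameters, so the coefficients below are hypothetical placeholders chosen only to illustrate the shape of such a model.

```python
import math

# Hypothetical coefficients for illustration only; the study's fitted
# intercept and slopes are not reported in the abstract.
B0, B_NYHA, B_LOGBNP = -8.0, 1.5, 0.9


def prob_low_peak_vo2(nyha_class: int, n_bnp_fmol_ml: float) -> float:
    """Logistic model: P(peak VO2 < 20 ml/kg/min) given NYHA class and N-BNP.

    P = 1 / (1 + exp(-(b0 + b1*NYHA + b2*ln(N-BNP))))
    """
    z = B0 + B_NYHA * nyha_class + B_LOGBNP * math.log(n_bnp_fmol_ml)
    return 1.0 / (1.0 + math.exp(-z))


# Example: NYHA class III patient with N-BNP of 300 fmol/ml.
p = prob_low_peak_vo2(3, 300.0)
```

A nomogram such as the one the authors constructed is essentially a graphical lookup of this probability surface over NYHA class and plasma N-BNP.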
|
44
|
Does prevalence of transfusion-transmissible viral infection reflect corresponding incidence in United States blood donors? Transfusion 2005; 45:1089-96. [PMID: 15987352 DOI: 10.1111/j.1537-2995.2005.00178.x] [Citation(s) in RCA: 30] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Abstract
BACKGROUND Calculation of viral residual risk is dependent on estimating incidence, which is not easily obtainable by most blood centers. Prevalence, however, is readily available. Understanding whether prevalence reflects corresponding incidence may help blood centers monitor disease risks. STUDY DESIGN AND METHODS With data on 12 million allogeneic donations, prevalence and incidence of transfusion-transmitted viral infections (TTVIs) were calculated. Relationships between prevalence (in total, first-time, and repeat donations) and incidence were analyzed for human immunodeficiency virus (HIV), hepatitis B virus (HBV), and hepatitis C virus (HCV) relative to temporal and donor demographic stratifications, respectively. RESULTS Overall prevalence of HIV, HBV, and HCV did not consistently reflect corresponding incidence. The relationship between prevalence and incidence varied with time and donors' age and was virus-specific. CONCLUSION Incidence of TTVIs cannot be easily predicted from overall prevalence. Accurate assessment of TTVI risk necessitates knowledge about donation histories and person-years at risk. Establishing comprehensive frameworks for monitoring blood donations and infectious disease markers remains a key to monitoring blood safety.
|
45
|
Comparison of demographic and donation profiles and transfusion-transmissible disease markers and risk rates in previously transfused and nontransfused blood donors. Transfusion 2004; 44:1243-51. [PMID: 15265131 DOI: 10.1111/j.1537-2995.2004.04034.x] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Abstract
BACKGROUND Increasing concern about transfusion transmission of variant Creutzfeldt-Jakob disease has resulted in indefinite deferral of transfused donors in France and the UK. Little is known, however, about the impact of indefinite deferral of transfused donors on blood safety and availability in the US. STUDY DESIGN AND METHODS Data were collected on allogeneic donations at five US blood centers during 1991 through 2000. Donation characteristics, prevalence, and incidence of human immunodeficiency virus (HIV), hepatitis B virus (HBV), and hepatitis C virus (HCV) were compared between transfused and nontransfused donors. Unreported deferrable risk (UDR) and reasons to donate were evaluated with data from a mail survey. RESULTS Transfusion history was reported by 4.2 percent of donors. Prevalence and incidence of HIV and HBV were comparable between transfused and nontransfused donors. Although HCV incidence was similar in both groups, HCV prevalence was nearly three times higher in transfused than in nontransfused first-time donors. UDR and reasons to donate were similar in the two groups, except transfused donors were less likely to donate for screening test results (odds ratio, 0.5; 95% confidence interval, 0.3-0.8). CONCLUSION Transfused and nontransfused donors had similar viral incidence and comparable UDR, suggesting that indefinite deferral of transfused donors would be unlikely to improve blood safety. Until more is known about the prevalence and transfusion transmissibility of emerging agents, indefinite deferral of previously transfused donors in the US does not appear warranted.
|
46
|
Analysis of 153 deaths after upper gastrointestinal endoscopy: room for improvement? Surg Endosc 2004; 18:22-5. [PMID: 14625742 DOI: 10.1007/s00464-003-9069-x] [Citation(s) in RCA: 54] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/14/2003] [Accepted: 06/24/2003] [Indexed: 01/14/2023]
Abstract
BACKGROUND Upper gastrointestinal (GI) endoscopy is a widely used procedure that is generally considered to be safe. METHODS Of a total of 33,854 patients who underwent upper gastrointestinal endoscopy during 1999 under the care of surgeons in Scotland, 153 (0.45%) died. We reviewed the case notes of these 153 patients. RESULTS Death was directly related to endoscopy in 20 of 153 cases (13%), most commonly due to gastrointestinal perforation or acute pancreatitis. Ninety-one percent (139) of the patients undergoing endoscopy were American Society of Anesthesiologists grades (ASA) 3-5, and 88% received intravenous sedation; an anesthetist was present in 31 cases (20%). Oxygen was administered to 45% of patients during the endoscopy. In 56% of the procedures, there was monitoring of electrocardiograms (ECG), pulse oximetry, or blood pressure readings. CONCLUSIONS Although deaths after endoscopy may be unavoidable, clinicians undertaking upper GI endoscopy or endoscopic retrograde cholangiopancreatography (ERCP) in ASA 3-5 patients should provide oxygen therapy and cardiovascular monitoring, and keep accurate records. The involvement of an anesthetist in airway management and the administration of intravenous sedation should be actively considered.
|
47
|
|
48
|
|
49
|
Does balloon mitral valvuloplasty improve cardiac function? A mechanistic investigation into impact on exercise capacity. Int J Cardiol 2003; 91:81-91. [PMID: 12957733 DOI: 10.1016/s0167-5273(02)00591-0] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
Abstract
Procedural technical success of balloon mitral valvuloplasty (BMV) is indicated by an increase in valve area and a reduction in transvalvar gradient, but there are conflicting results regarding whether these indicators correlate with subsequent improvements in exercise capacity. We conducted a study to explore the effects of valvuloplasty on cardiac function to gain insight into the mechanisms responsible for the impact on exercise ability. Sixteen patients with mitral stenosis participated in the study and the five who did not proceed to valvuloplasty served as the control group. All patients performed maximal cardiopulmonary exercise tests before and 6 weeks after valvuloplasty (without valvuloplasty in controls). Central haemodynamics including cardiac output were measured non-invasively at rest and peak exercise. At baseline, the cardiopulmonary exercise test results were similar in the two groups. Following valvuloplasty, cardiac output did not alter at rest, but increased significantly at peak exercise (8.7+/-1.7 to 10.5+/-2.1 l min(-1), P<0.01), as did peak cardiac power output (1.88+/-0.55 to 2.28+/-0.74, P<0.05) and cardiac reserve (1.07+/-0.33 to 1.45+/-0.55 watts, P<0.05). Aerobic exercise capacity improved (13.9+/-4.2 to 16.4+/-4.3 ml kg(-1) min(-1), P<0.01) as did exercise duration (354+/-270 to 500+/-266 s, P<0.01). There were no significant changes in the controls. There was a significant correlation between the changes in peak VO(2) and changes in cardiac reserve (r=0.62, P<0.01) but not with changes in resting haemodynamics. These changes did not correlate with changes in peri-procedural mitral valve haemodynamics, despite increases in mitral valve area from 1.05+/-0.16 to 1.74+/-0.4 cm(2) (P<0.0001), accompanied by falls in the transvalvar gradient and pulmonary artery pressure (12.4+/-4.7 to 4.5+/-3 mmHg, and 26.8+/-8.4 to 17.4+/-5.2 mmHg, respectively, all P<0.0001). 
In conclusion, we found that successful mitral valvuloplasty in our patient cohort led to improved cardiac and physical functional capacity but not resting haemodynamics. Neither indicators of technical success nor resting haemodynamics were very reliable in predicting functional improvement.
|
50
|
Multicenter comparison of serologic assays and estimation of human herpesvirus 8 seroprevalence among US blood donors. Transfusion 2003; 43:1260-8. [PMID: 12919429 DOI: 10.1046/j.1537-2995.2003.00490.x] [Citation(s) in RCA: 91] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
BACKGROUND As part of assessing the possibility of transfusion transmission of human herpesvirus 8 (HHV-8 or Kaposi's sarcoma-associated herpesvirus), HHV-8 seroprevalence was estimated among US blood donors, the performance of HHV-8 serologic tests was compared, and the presence of HHV-8 DNA was tested for in donated blood. STUDY DESIGN AND METHODS Replicate panels of 1040 plasma specimens prepared from 1000 US blood donors (collected in 1994 and 1995) and 21 Kaposi's sarcoma patients were tested for antibodies to HHV-8 in six laboratories. HHV-8 PCR was performed on blood samples from 138 donors, including all 33 who tested seropositive in at least two laboratories and 22 who tested positive in at least one. RESULTS The estimated HHV-8 seroprevalence among US blood donors was 3.5 percent (95% CI, 1.2%-9.8%) by a conditional dependence latent-class model, 3.0 percent (95% CI, 2.0%-4.6%) by a conditional independence latent-class model, and 3.3 percent (95% CI, 2.3%-4.6%) by use of a consensus-derived gold standard (specimens positive in two or more laboratories); the conditional dependence model best fit the data. In this model, laboratory specificities ranged from 96.6 to 100 percent. Sensitivities ranged widely, but with overlapping 95 percent CIs. HHV-8 DNA was detected in blood from none of 138 donors evaluated. CONCLUSIONS Medical and behavioral screening does not eliminate HHV-8-seropositive persons from the US blood donor pool, but no viral DNA was found in donor blood. Further studies of much larger numbers of seropositive individuals will be required to more completely assess the rate of viremia and possibility of HHV-8 transfusion transmission. Current data do not indicate a need to screen US blood donors for HHV-8.
|