1. Social network analysis reveals the failure of between-farm movement restrictions to reduce Salmonella transmission. J Dairy Sci 2024:S0022-0302(24)00816-6. PMID: 38788850. DOI: 10.3168/jds.2023-24554.
Abstract
An increasing number of countries are investigating options to stop the spread of the emerging zoonotic infection Salmonella (S.) Dublin, which mainly spreads among bovines and with cattle manure. Detailed surveillance and cattle movement data from an 11-year period in Denmark provided an opportunity to gain new knowledge for mitigation options through a combined social network and simulation modeling approach. The analysis revealed similar network trends for non-infected and infected cattle farms despite stringent cattle movement restrictions imposed on infected farms in the national control program. The strongest predictive factor for farms becoming infected was their cattle movement activities in the previous month, with twice the effect of local transmission. The simulation model indicated an endemic S. Dublin occurrence, with peaks in outbreak probabilities and sizes around observed cattle movement activities. Therefore, pre- and post-movement measures within a 1-mo time-window may help reduce S. Dublin spread.
2. Review of transmission routes of 24 infectious diseases preventable by biosecurity measures and comparison of the implementation of these measures in pig herds in six European countries. Transbound Emerg Dis 2017; 65:381-398. PMID: 29124908. DOI: 10.1111/tbed.12758.
Abstract
This study aimed to review the transmission routes of important infectious pig diseases and to translate these into biosecurity measures preventing or reducing the transmission between and within pig herds. Furthermore, it aimed to identify the level of implementation of these measures in different European countries and discuss the observed variations to identify potentials for improvement. First, a literature review was performed to show which direct and indirect transmission routes of 24 infectious pig diseases can be prevented through different biosecurity measures. Second, a quantitative analysis was performed using the Biocheck.UGent™, a risk-based scoring system to evaluate biosecurity in pig herds, to obtain an insight into the implementation of these biosecurity measures. The database contained farm-specific biosecurity data from 574 pig farms in Belgium, Denmark, France, Germany, the Netherlands and Sweden, entered between January 2014 and January 2016. Third, a qualitative analysis based on a review of literature and other relevant information resources was performed for every subcategory of internal and external biosecurity in the Biocheck.UGent™ questionnaire. The quantitative analysis indicated that at the level of internal, external and overall biosecurity, Denmark had a significantly distinct profile with higher external biosecurity scores and less variation than the rest of the countries. This is likely due to a widely used specific pathogen-free (SPF) system with extensive focus on biosecurity since 1971 in Denmark. However, the observed pattern may also be attributed to differences in data collection methods. The qualitative analysis identified differences in applied policies, legislation, disease status, pig farm density, farming culture and habits between countries that can be used for shaping country-specific biosecurity advice to attain improved prevention and control of important pig diseases in European pig farms.
3. Methods and processes of developing the strengthening the reporting of observational studies in epidemiology - veterinary (STROBE-Vet) statement. Prev Vet Med 2017; 134:188-196. PMID: 27836042. DOI: 10.1016/j.prevetmed.2016.09.005.
Abstract
BACKGROUND The reporting of observational studies in veterinary research presents many challenges that often are not adequately addressed in published reporting guidelines. OBJECTIVE To develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. DESIGN A consensus meeting of experts was organized to develop an extension of the STROBE statement to address observational studies in veterinary medicine with respect to animal health, animal production, animal welfare, and food safety outcomes. SETTING Consensus meeting May 11-13, 2014 in Mississauga, Ontario, Canada. PARTICIPANTS Seventeen experts from North America, Europe, and Australia attended the meeting. The experts were epidemiologists and biostatisticians, many of whom hold or have held editorial positions with relevant journals. METHODS Prior to the meeting, 19 experts completed a survey about whether they felt any of the 22 items of the STROBE statement should be modified and if items should be added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. At the meeting, the participants were provided with the survey responses and relevant literature concerning the reporting of veterinary observational studies. During the meeting, each STROBE item was discussed to determine whether or not re-wording was recommended, and whether additions were warranted. Anonymous voting was used to determine whether there was consensus for each item change or addition. RESULTS The consensus was that six items needed no modifications or additions. 
Modifications or additions were made to the STROBE items numbered: 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources/measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding). LIMITATION Published literature was not always available to support modification to, or inclusion of, an item. CONCLUSION The methods and processes used in the development of this statement were similar to those used for other extensions of the STROBE statement. The use of this extension to the STROBE statement should improve the reporting of observational studies in veterinary research related to animal health, production, welfare, or food safety outcomes by recognizing the unique features of observational studies involving food-producing and companion animals, products of animal origin, aquaculture, and wildlife.
4.
Abstract
1. The performance of the scoring in the Danish footpad dermatitis (FPD) surveillance system was evaluated by determining inter-rater agreement in visual inspection of FPD in broilers between two independent raters (R1 and R2) and the official scoring at a Danish slaughterhouse. 2. FPD scores were evaluated in 1599 chicken feet. The two raters and the slaughterhouse scored equal proportions of score 0. So did R1 and R2 when assessing score 1 and the more severe lesion score 2, whereas the slaughterhouse scored a markedly higher proportion of score 1 and a lower proportion of score 2. Aggregated FPD flock scores ranged from 5 to 163 (R1 and R2) and from 8 to 107 (slaughterhouse). 3. The level of agreement between the two raters was high for scores 0, 1 and 2 and for flock scores. Agreement between raters and the slaughterhouse was lower when R1 and R2 recorded score 2 than when they recorded scores 0 and 1. 4. This study indicates that the occurrence and severity of lesions are underestimated in the official Danish FPD scoring system.
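The inter-rater agreement assessed above is commonly summarised for categorical scores such as the FPD grades 0/1/2 with Cohen's kappa. The sketch below is illustrative only (the scores are invented, not the study's 1599 feet) and assumes an unweighted kappa:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b, categories=(0, 1, 2)):
    """Unweighted Cohen's kappa for two raters scoring the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items with identical scores.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the two raters scored independently.
    count_a = Counter(rater_a)
    count_b = Counter(rater_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Illustrative FPD scores for ten feet (not data from the study).
r1 = [0, 0, 1, 2, 1, 0, 2, 1, 0, 2]
r2 = [0, 0, 1, 1, 1, 0, 2, 1, 0, 2]
print(round(cohens_kappa(r1, r2), 3))  # → 0.848
```

Kappa corrects raw percent agreement for the agreement expected by chance, which matters when one score class (here score 0) dominates.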
5. Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary (STROBE-Vet) Statement. J Food Prot 2016; 79:2211-2219. PMID: 28221964. DOI: 10.4315/0362-028x.jfp-16-016.
Abstract
Reporting of observational studies in veterinary research presents challenges that often are not addressed in published reporting guidelines. Our objective was to develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. We conducted a consensus meeting with 17 experts in Mississauga, Canada. Experts completed a premeeting survey about whether items in the STROBE statement should be modified or added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. During the meeting, each STROBE item was discussed to determine whether or not rewording was recommended, and whether additions were warranted. Anonymous voting was used to determine consensus. Six items required no modifications or additions. Modifications or additions were made to the STROBE items 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources and measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding). The methods and processes used were similar to those used for other extensions of the STROBE statement. The use of this STROBE statement extension should improve reporting of observational studies in veterinary research by recognizing unique features of observational studies involving food-producing and companion animals, products of animal origin, aquaculture, and wildlife.
6. Explanation and Elaboration Document for the STROBE-Vet Statement: Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary Extension. J Vet Intern Med 2016; 30:1896-1928. PMID: 27859752. PMCID: PMC5115190. DOI: 10.1111/jvim.14592.
Abstract
The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement was first published in 2007 and again in 2014. The purpose of the original STROBE was to provide guidance for authors, reviewers, and editors to improve the comprehensiveness of reporting of observational studies. Although much of the guidance provided by the original STROBE document is directly applicable, it was deemed useful to map those statements to veterinary concepts, provide veterinary examples, and highlight unique aspects of reporting in veterinary observational studies. Here, we present the examples and explanations for the checklist items included in the STROBE-Vet statement. This is therefore a companion to the STROBE-Vet statement methods and processes document, which describes the checklist and how it was developed.
7. Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology - Veterinary (STROBE-Vet) Statement. J Vet Intern Med 2016; 30:1887-1895. PMID: 27859753. PMCID: PMC5115188. DOI: 10.1111/jvim.14574.
Abstract
Background Reporting of observational studies in veterinary research presents challenges that often are not addressed in published reporting guidelines. Objective To develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. Design Consensus meeting of experts. Setting Mississauga, Canada. Participants Seventeen experts from North America, Europe, and Australia. Methods Experts completed a pre‐meeting survey about whether items in the STROBE statement should be modified or added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. During the meeting, each STROBE item was discussed to determine whether or not rewording was recommended and whether additions were warranted. Anonymous voting was used to determine consensus. Results Six items required no modifications or additions. Modifications or additions were made to the STROBE items 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources/measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding). Conclusion The methods and processes used were similar to those used for other extensions of the STROBE statement. The use of this STROBE statement extension should improve reporting of observational studies in veterinary research by recognizing unique features of observational studies involving food‐producing and companion animals, products of animal origin, aquaculture, and wildlife.
8. PS-021 Use of an E-learning program to improve paediatric nurses’ dose calculation skills. Eur J Hosp Pharm 2014. DOI: 10.1136/ejhpharm-2013-000436.372.
9. Dynamic changes in antibody levels as an early warning of Salmonella Dublin in bovine dairy herds. J Dairy Sci 2013; 96:7558-64. PMID: 24140322. DOI: 10.3168/jds.2012-6478.
Abstract
Salmonella Dublin is a bacterium that causes disease and production losses in cattle herds. In Denmark, a surveillance and control program was initiated in 2002 to monitor and reduce the prevalence of Salmonella Dublin. In dairy herds, the surveillance includes herd classification based on bulk tank milk measurements of antibodies directed against Salmonella Dublin at 3-mo intervals. In this study, an "alarm herd" concept, based on the dynamic progression of these repeated measurements, was formulated such that it contains predictive power for Salmonella Dublin herd classification change from "likely free of infection" to "likely infected" in the following quarter of the year, thus warning the farmer 3 mo earlier than the present system. The alarm herd concept was defined through aberrations from a stable development over time of antibody levels. For suitable parameter choices, alarm herd status was a positive predictor for Salmonella Dublin status change in dairy herds, in that alarm herds had a higher risk of changing status in the following quarter compared with nonalarm herds. This was despite the fact that both alarm and nonalarm herds had antibody levels that did not indicate the herds being "likely infected" according to the existing classification system in the present quarter. The alarm herd concept can be used as a new early warning element in the existing surveillance program. Additionally, to improve accuracy of herd classification, the alarm herd concept could be incorporated into a model including other known risk factors for change in herd classification. Furthermore, the model could be extended to other diseases monitored in similar ways.
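The "alarm herd" idea, flagging an aberration from a stable antibody trajectory before the herd crosses the classification cut-off, can be sketched as a simple rule on the quarterly bulk-tank measurements. The function name and thresholds below are illustrative assumptions; the study's actual parameterisation is not given in the abstract:

```python
def alarm_herd(odc_history, k=3, jump_threshold=15.0):
    """Flag a herd whose latest antibody measurement breaks a stable trend.

    odc_history: chronological bulk-tank ODC% values (quarterly samples).
    k: number of previous measurements defining the stable baseline.
    jump_threshold: ODC% rise over the baseline mean that raises an alarm.
    All parameter values here are illustrative, not the published choices.
    """
    if len(odc_history) < k + 1:
        return False  # too little history to judge stability
    baseline = odc_history[-(k + 1):-1]
    mean_baseline = sum(baseline) / k
    return odc_history[-1] - mean_baseline > jump_threshold

# A herd stable around ODC% ~5 whose latest sample jumps to 28 is flagged;
# a modest rise to 9 is not.
print(alarm_herd([4.0, 6.0, 5.0, 28.0]))  # → True
print(alarm_herd([4.0, 6.0, 5.0, 9.0]))   # → False
```

The point of such a rule is exactly the abstract's claim: the deviation is detectable one quarter before the herd's classification would change.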
10. Evaluation of milk yield losses associated with Salmonella antibodies in bulk tank milk in bovine dairy herds. J Dairy Sci 2013; 95:4873-4885. PMID: 22916892. DOI: 10.3168/jds.2011-4332.
Abstract
The effect of Salmonella on milk production is not well established in cattle. The objective of this study was to investigate whether introduction of Salmonella into dairy cattle herds was associated with reduced milk yield and determine the duration of any such effect. Longitudinal data from 2005 through 2009 were used, with data from 12 mo before until 18 mo after the estimated date of infection. Twenty-eight case herds were selected based on an increase in the level of Salmonella-specific antibodies in bulk-tank milk from <10 corrected optical density percentage (ODC%) to ≥70 ODC% between 2 consecutive three-monthly measurements in the Danish Salmonella surveillance program. All selected case herds were conventional Danish Holstein herds. Control herds (n=40) were selected randomly from Danish Holstein herds with Salmonella antibody levels consistently <10 ODC%. A date of herd infection was randomly allocated to the control herds. Hierarchical mixed effect models with the outcome test-day yield of energy-corrected milk (ECM)/cow were used to investigate daily milk yield before and after the estimated herd infection date for cows in parities 1, 2, and 3+. Control herds were used to evaluate whether the effects in the case herds could be reproduced in herds without Salmonella infection. Herd size, days in milk, somatic cell count, season, and year were included in the models. Yield in first-parity cows was reduced by a mean of 1.4 kg (95% confidence interval: 0.5 to 2.3) of ECM/cow per day from 7 to 15 mo after the estimated herd infection date, compared with that of first-parity cows in the same herds in the 12 mo before the estimated herd infection date. Yield for parity 3+ cows was reduced by a mean of 3.0 kg (95% confidence interval: 1.3 to 4.8) of ECM/cow per day from 7 to 15 mo after herd infection compared with that of parity 3+ cows in the 12 mo before the estimated herd infection. 
We observed minor differences in yield in second-parity cows before and after herd infection and observed no difference between cows in control herds before and after the simulated infection date. Milk yield decreased significantly in affected herds and the reduction was detectable several months after the increase in bulk tank milk Salmonella antibodies. It took more than 1 yr for milk yield to return to preinfection levels.
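As a rough illustration of the before/after yield contrast described above, a plain two-sample comparison of daily ECM per cow can be computed as below. The actual analysis used hierarchical mixed models adjusting for herd size, days in milk, somatic cell count, season, and year; the numbers here are invented:

```python
import math
from statistics import mean, stdev

def yield_loss(pre, post):
    """Mean drop in daily ECM yield (kg/cow) with an approximate 95% CI.

    pre/post: per-cow average test-day ECM before and after the estimated
    herd infection date. A naive Welch-style comparison for illustration
    only; it ignores the repeated-measures and herd-level structure.
    """
    diff = mean(pre) - mean(post)
    se = math.sqrt(stdev(pre) ** 2 / len(pre) + stdev(post) ** 2 / len(post))
    return diff, (diff - 1.96 * se, diff + 1.96 * se)

# Invented per-cow averages (kg ECM/day), not data from the study.
pre = [30.0, 31.0, 29.0, 30.0]
post = [28.0, 29.0, 27.0, 28.0]
loss, ci = yield_loss(pre, post)
print(round(loss, 1), tuple(round(x, 2) for x in ci))
```

A confidence interval excluding zero, as in the study's 1.4 and 3.0 kg estimates, is what supports calling the reduction significant.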
11.
Abstract
Bovine cysticercosis (BC) is a zoonotic, parasitic infection in cattle. Under the current EU meat inspection regulation, every single carcass from all bovines above 6 weeks of age is examined for BC. This method is costly and makes more sense in countries with higher number of BC-infected animals than in countries with few lightly infected cases per year. The aim of the present case-control study was to quantify associations between potential herd-level risk factors and BC in Danish cattle herds. Risk factors can be used in the design of a risk-based meat inspection system targeted towards the animals with the highest risk of BC. Cases (n = 77) included herds that hosted at least one animal diagnosed with BC at meat inspection, from 2006 to 2010. Control herds (n = 231) consisted of randomly selected herds that had not hosted any animals diagnosed with BC between 2004 and 2010. The answers from a questionnaire and register data from the Danish Cattle Database were grouped into meaningful variables and used to investigate the risk factors for BC using a multivariable logistic regression model. Case herds were almost three times more likely than control herds to let all or most animals out grazing. Case herds were more than five times more likely than control herds to allow their animals access to risky water sources with sewage treatment plant effluent in proximity. Case herds were also more likely to share machinery or hire contractors than control herds. The risk decreased with increasing herd size probably because the larger herds generally tend to keep cattle indoors in Denmark. The results are useful to guide future data recording that can be supplied by the farmer as food chain information and then be used for differentiated meat inspection in low- and high-risk groups, enabling development of risk-based meat inspection systems.
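The herd-level associations reported above (grazing, risky water sources, shared machinery) come from a multivariable logistic model; a crude odds ratio from a 2x2 table conveys the core quantity. The counts below are invented for illustration and are not the study's data:

```python
def odds_ratio(cases_exposed, cases_unexposed,
               controls_exposed, controls_unexposed):
    """Crude (unadjusted) odds ratio for a herd-level exposure."""
    return (cases_exposed * controls_unexposed) / (
        cases_unexposed * controls_exposed)

# Invented example: grazing among 77 case and 231 control herds.
or_grazing = odds_ratio(60, 17, 120, 111)
print(round(or_grazing, 2))  # → 3.26
```

The published estimates are adjusted odds ratios from the multivariable model, so they are not reproducible from single 2x2 tables like this one.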
12. Kidney failure related to broad-spectrum antibiotics in critically ill patients: secondary end point results from a 1200 patient randomised trial. BMJ Open 2012; 2:e000635. PMID: 22411933. PMCID: PMC3307126. DOI: 10.1136/bmjopen-2011-000635.
Abstract
OBJECTIVES To explore whether a strategy of more intensive antibiotic therapy leads to emergence or prolongation of renal failure in intensive care patients. DESIGN Secondary analysis from a randomised antibiotic strategy trial (the Procalcitonin And Survival Study); the randomised arms were conserved from the primary trial for the main analysis. SETTING Nine mixed surgical/medical intensive care units across Denmark. PARTICIPANTS 1200 adult intensive care patients (≥18 years) with an expected stay of more than 24 h. EXCLUSION CRITERIA Bilirubin >40 mg/dl, triglycerides >1000 mg/dl, increased risk from blood sampling, pregnancy/breast feeding and psychiatric patients. INTERVENTIONS Patients were randomised to guideline-based therapy ('standard-exposure' arm) or to guideline-based therapy supplemented with antibiotic escalation whenever procalcitonin increased on daily measurements ('high-exposure' arm). MAIN OUTCOME MEASURES Primary end point: estimated glomerular filtration rate (eGFR) <60 ml/min/1.73 m(2). Secondary end points: (1) delta eGFR after starting/stopping a drug and (2) the RIFLE criteria Risk 'R', Injury 'I' and Failure 'F'. Analysis was by intention to treat. RESULTS 28-day mortality was 31.8% and comparable between arms (Jensen et al, Crit Care Med 2011). A total of 3672/7634 (48.1%) study days during follow-up in the 'high-exposure' arm versus 3016/6949 (43.4%) in the 'standard-exposure' arm were spent with eGFR <60 ml/min/1.73 m(2), p<0.001. In a multiple effects model, piperacillin/tazobactam was identified as causing the lowest rate of renal recovery of all antibiotics used: 1.0 ml/min/1.73 m(2)/24 h while exposed to this drug (95% CI 0.7 to 1.3) versus meropenem: 2.9 ml/min/1.73 m(2)/24 h (2.5 to 3.3); after discontinuing piperacillin/tazobactam, the renal recovery rate increased to 2.7 ml/min/1.73 m(2)/24 h (2.3 to 3.1). The proportion of patients with eGFR <60 ml/min/1.73 m(2) in the two groups was 57% versus 55% at entry and 41% versus 39% at the last day of follow-up. CONCLUSIONS Piperacillin/tazobactam was identified as a cause of delayed renal recovery in critically ill patients. This nephrotoxicity was not observed with other beta-lactam antibiotics. TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT00271752.
13. Presence of natural genetic resistance in Fraxinus excelsior (Oleraceae) to Chalara fraxinea (Ascomycota): an emerging infectious disease. Heredity (Edinb) 2011; 106:788-97. PMID: 20823903. PMCID: PMC3186218. DOI: 10.1038/hdy.2010.119.
Abstract
Fraxinus excelsior, the common ash native to Europe, is threatened by the recently identified pathogenic fungus Chalara fraxinea, which causes extensive damage to ash trees across Europe. In Denmark, most stands are severely affected, leaving many trees with dead crowns; however, single trees show notably fewer symptoms. In this study, the impact of the emerging infectious disease on native Danish ash trees was assessed by estimating the presence of inherent resistance in natural populations. Disease symptoms were assessed from 2007 to 2009 at two different sites with grafted ramets of 39 selected clones representing native F. excelsior trees. Strong genetic variation in susceptibility to C. fraxinea infection was observed. No genetic or geographic structure could explain the differences, but strong genetic correlations with leaf senescence were observed. The results suggest that a small fraction of trees in the Danish ash population possess substantial resistance to the damage. Though this fraction is probably too low to avoid population collapse in most natural or managed ash forests, the observed presence of putative resistance in natural stands is likely to be of evolutionary importance. This provides prospects for future maintenance of the species through natural or artificial selection in favour of the remaining healthy individuals.
14. Culling decisions of dairy farmers during a 3-year Salmonella control study. Prev Vet Med 2011; 100:29-37. PMID: 21481960. DOI: 10.1016/j.prevetmed.2011.03.001.
Abstract
Salmonella enterica subsp. enterica serotypes lead to periodically increased morbidity and mortality in cattle herds, and the bacteria can also cause serious infections in humans. Consequently, Denmark started a surveillance and control programme in 2002. The programme focuses on Salmonella Dublin, which is the most prevalent and most persistent serotype in the Danish cattle population. A field study in 10 dairy herds with persistent Salmonella infections was carried out over three years to gain experience with control procedures including risk assessment, targeted control actions and test-and-cull procedures. From autumn 2003 until the end of 2006, quarterly milk quality control samples from all lactating cows and biannual blood samples from all young stock above the age of three months were tested using an indirect antibody ELISA. The most recent and previous test results were used to categorise all animals into risk groups. These risk groups and all individual ELISA results were communicated to the farmers as colour-coded lists four to six times per year. Farmers were advised to manage the risk of Salmonella transmission from cattle with repeatedly high ELISA results (flagged as "red") or cows with at least one recent moderately high ELISA result (flagged as "yellow"). Risk management included, for example, culling or separating the cows at calving. We analysed culling decisions using two models. For heifers, a hierarchical multivariable logistic model with herd as a random effect evaluated whether animals with red and yellow flags had a higher probability of being slaughtered or sold before first calving than animals without any risk flags. For adult cows, a semi-parametric proportional hazards survival model was used to test the effect of the number of red and yellow flags on the hazard of culling at different time points, and interactions with within-herd prevalence, while accounting for parity, stage of lactation, milk yield, somatic cell count and the hierarchical structure of the data with animals clustered at herd level. This study illustrates how investigating the culling decisions herd managers make when they have access to the test status of individual animals and the overall apparent prevalence during control of an infection can yield useful new knowledge. Overall, herd managers were more likely to cull cattle with an increasing number of yellow and red flags than animals with no flags. However, cattle with yellow and red flags were more likely to be culled during times of low or medium within-herd seroprevalence than at times of high seroprevalence. These results are valuable for modelling and planning control strategies and for making recommendations to farmers about control options.
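The heifer model's structure, culling probability as a logistic function of flag counts, can be sketched as below. The intercept and coefficients are illustrative placeholders, not the published estimates, and the herd-level random effect is omitted:

```python
import math

def cull_probability(n_red, n_yellow, b0=-2.0, b_red=0.8, b_yellow=0.4):
    """Logistic-model probability that a flagged heifer is culled
    (slaughtered or sold) before first calving.

    b0, b_red, b_yellow are invented for illustration; the study's model
    also included a herd random effect, which this sketch leaves out.
    """
    logit = b0 + b_red * n_red + b_yellow * n_yellow
    return 1.0 / (1.0 + math.exp(-logit))

# More red/yellow flags raise the predicted culling probability.
print(round(cull_probability(0, 0), 3))  # no flags → 0.119
print(round(cull_probability(2, 1), 3))  # two red, one yellow → 0.5
```

With positive flag coefficients, each additional red or yellow flag multiplies the culling odds by a fixed factor (exp(b_red) or exp(b_yellow)), which is how the reported flag effects would be interpreted.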
15. The range of influence between cattle herds is of importance for the local spread of Salmonella Dublin in Denmark. Prev Vet Med 2008; 84:277-90. PMID: 18242741. DOI: 10.1016/j.prevetmed.2007.12.005.
Abstract
The objective of the study was to estimate the range of influence between cattle herds with positive Salmonella Dublin herd status. Herd status was a binary outcome of high/low antibody levels to Salmonella Dublin in bulk-tank milk and blood samples collected from all cattle herds in Denmark for surveillance purposes. Two methods were used. Initially, a spatial generalised linear mixed model was developed with an exponential correlation function to estimate the range of influence simultaneously with the effect of potential risk factors. An iteratively reweighted generalised least squares procedure was used as a second method for verifying the range of influence estimates. With this iterative procedure, deviance residuals were calculated based on a generalised linear model and the range of influence was estimated based on the residuals using an exponential semivariogram. The range of influence was estimated for six different regions in Denmark using both methods. The analyses were performed on data collected during 1 year after initiation of the Salmonella Dublin surveillance program providing herd classifications for the 4th year-quarter of 2003 and 2 years later for the 4th year-quarter of 2005. The prevalence of dairy herds with a positive Salmonella Dublin herd classification status in this period had decreased from 22.1 to 17.0%. In non-dairy herds, the prevalence was nearly unchanged during the same period (3.4 and 3.7% in 4th quarter of 2003 and 2005, respectively). For all cattle herds, the range of influence was 2.3-6.4 km in 2003 and 1.5-8.3 km in 2005. There seemed to be no association between the range of influence and the density of herds in the different regions.
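The exponential correlation structure used above has a closed form, and the "range of influence" is conventionally read off as the practical range at which the semivariogram reaches about 95% of the sill. The parameter values below are illustrative, not the study's estimates:

```python
import math

def exponential_semivariogram(h, nugget, sill, a):
    """gamma(h) = nugget + (sill - nugget) * (1 - exp(-h / a))
    for separation distance h and scale parameter a (same units as h)."""
    return nugget + (sill - nugget) * (1.0 - math.exp(-h / a))

def practical_range(a):
    """Distance at which the exponential model reaches ~95% of the sill."""
    return 3.0 * a

# Illustrative parameters (not the published fit): with a = 1.5 km the
# practical range is ~4.5 km, inside the reported 1.5-8.3 km band.
print(practical_range(1.5))  # → 4.5
print(round(exponential_semivariogram(4.5, 0.1, 1.0, 1.5), 3))  # → 0.955
```

Because the exponential model only approaches its sill asymptotically, the 3a convention is what turns the fitted scale parameter into a reportable range in kilometres.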
16. Growth inhibitory factors in bovine faeces impairs detection of Salmonella Dublin by conventional culture procedure. J Appl Microbiol 2007; 103:650-6. PMID: 17714398. DOI: 10.1111/j.1365-2672.2007.03292.x.
Abstract
AIMS: To analyse the relative importance of different biological and technical factors on the analytical sensitivity of conventional culture methods for detection of Salmonella Dublin in cattle faeces.
METHODS AND RESULTS: Faeces samples collected from six adult bovines from different salmonella-negative herds were split into subpools and spiked with three strains of S. Dublin at a concentration of c. 10 CFU g⁻¹ faeces. Each of the 18 strain-pools was divided into two sets of triplicates of four volumes of faecal matter (1, 5, 10 and 25 g). The two sets were pre-enriched with and without novobiocin, followed by combinations of culture media (three types) and selective media (two types). The sensitivity of each combination and sources of variation in detection were determined by a generalized linear mixed model using a split-plot design.
CONCLUSIONS: Biological factors, such as faecal origin and S. Dublin strain, influenced the sensitivity more than technical factors. Overall, the modified semi-solid Rappaport Vassiliadis (MSRV) culture medium had the most reliable detection capability, whereas detection with selenite cystine broth and Mueller Kauffman tetrathionate broth combinations varied more in sensitivity and rarely reached the same level of detection as MSRV in this experiment.
SIGNIFICANCE AND IMPACT OF THE STUDY: The study showed that for MSRV culture medium with xylose lysine decarboxylase agar as the indicative medium, the sensitivity of the faecal culture method may be improved by focusing on strain variations and the ecology of the faecal sample. Detailed investigation of the faecal flora (pathogens and normal flora) and its interaction with chemical factors may result in an improved method for detection of S. Dublin.
17
Risk Factors for Changing Test Classification in the Danish Surveillance Program for Salmonella in Dairy Herds. J Dairy Sci 2007; 90:2815-25. [PMID: 17517722 DOI: 10.3168/jds.2006-314]
Abstract
A surveillance program in which all cattle herds in Denmark are classified into Salmonella infection categories has been in place since 2002. Dairy herds were considered test negative, and thus most likely free of infection, if Salmonella antibody measurements were consistently low in bulk tank milk samples collected every 3 mo. Herds were considered test positive, and thus most likely infected, if the 4-quarter moving average bulk tank milk antibody concentration was high or if there was a large increase in the most recent measurement compared with the average value from the previous 3 samples. The objective of this study was to evaluate risk factors for changing from test negative to positive, indicative of herds becoming infected from one quarter of the year to the next, and risk factors for changing from test positive to negative, indicative of herds recovering from infection between 2 consecutive quarters of the year. The Salmonella serotypes in question were Salmonella Dublin or other serotypes that cross-react with the Salmonella Dublin antigen in the ELISA (e.g., some Salmonella Typhimurium types). Two logistic regression models that accounted for repeated measurements at the herd level and controlled for herd size and regional effects were used. Data from 2003 were used for the analyses. A change from test negative to positive occurred in 2.0% of the quarterly observations (n = 21,007) from test-negative dairy herds. A change from test positive to negative occurred in 10.0% of quarterly observations (n = 6,168) available from test-positive dairy herds. The higher the number of test-positive neighbor herds in the previous year-quarter, the more likely herds were to become test positive for Salmonella. The number of cattle purchased from test-positive herds was also associated with changing from test negative to positive. The bigger the herd, the more likely it was to change from test negative to positive. The effect of herd size on recovery was less clear. Large herds consisting mainly of large breeds or having test-positive neighbors within a 2-km radius were less likely to change from test positive to negative, whereas the breed and neighbor factors were not found to be important for small herds. Organic production was associated with remaining test positive, but not with becoming test positive. The results emphasize the importance of external and internal biosecurity measures to control Salmonella infections.
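The crude association behind a risk factor such as purchasing cattle from test-positive herds can be illustrated with an odds ratio from a 2×2 table of herd-quarters. The counts below are hypothetical, for illustration only; the study itself used logistic regression adjusting for repeated herd-level measurements, herd size and region.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
        a = exposed, became test positive    b = exposed, stayed negative
        c = unexposed, became test positive  d = unexposed, stayed negative
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical herd-quarter counts (NOT the study's data):
or_, lo, hi = odds_ratio_ci(a=30, b=70, c=10, d=90)
```

A confidence interval excluding 1 would indicate the exposure is associated with becoming test positive, in the same spirit as the adjusted models described above.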
18
Use of IgG avidity ELISA to differentiate acute from persistent infection with Salmonella Dublin in cattle. J Appl Microbiol 2006; 100:144-52. [PMID: 16405694 DOI: 10.1111/j.1365-2672.2005.02758.x]
Abstract
AIMS: To investigate whether an immunoglobulin (Ig)G avidity ELISA can be used to differentiate between acute and persistent infection with Salmonella (S.) Dublin in cattle, and to determine whether the IgG1 and IgG2 isotype responses differ between acute and persistent infections.
METHODS AND RESULTS: Animals were selected from two herds with long-term infection (years) and two herds recently infected (<3 months). Forty-seven animals were categorized into groups based on the persistence of their antibody level in milk. Based on the titres from two serial dilutions, the avidity index (AI) was calculated for IgG (IgG-AI), IgG1 (IgG1-AI) and IgG2 (IgG2-AI). The mean IgG-AI for suspected carrier animals with either persistently high (group 1) or persistently high to medium-high (group 2) antibody levels was significantly (P = 0.003) higher (32.1% and 38.4%) than for acutely infected animals (21.7% and 22.3%). The probability of being a suspect carrier was associated with IgG-AI, antibody level in the sample and age; however, the effect of age could be the result of biased sample selection. Specificities and sensitivities were calculated at a range of cut-off values for IgG-AI and IgG1-AI. Overall, IgG2-AI was high compared with IgG1-AI, and there was no difference in IgG2-AI between infection groups. There was also no difference in the IgG2:IgG1 ratio between the acute and persistent infection groups.
CONCLUSIONS: Assuming that a persistently high antibody response is indicative of persistent infection with S. Dublin in cattle, the IgG-AI can aid in differentiating between acute and long-term infection at the herd level. However, for the test to be useful as an alternative to repeated sampling over time for detection of persistently infected carriers during control strategies in cattle herds, it needs to be optimized and studied further in a larger sample of well-characterized infections. The affinity of IgG2 is higher than that of IgG1 early in S. Dublin infection, and there appears to be no difference in IgG2-AI between the acute and chronic infection stages.
SIGNIFICANCE AND IMPACT OF THE STUDY: For decades, the strategies for detection of persistently infected cattle in S. Dublin-infected herds have involved repeated bacteriological culture of faecal samples or repeated antibody measurements over several months. Both methods are time-consuming and costly, making a method for detection of carrier animals based on a single sampling highly desirable. This study illustrates a tool, IgG-AI, which may prove useful, although more validation is required before it is used in practice.
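The avidity index itself is a simple ratio. The sketch below uses the common avidity-ELISA definition (measurement after a chaotropic urea wash divided by the untreated measurement, times 100); this is an assumption for illustration, since the paper derives AI from titres of two serial dilutions and its exact formula may differ.

```python
def avidity_index(titre_with_urea, titre_without_urea):
    """Avidity index (%) under the common avidity-ELISA convention.

    ASSUMPTION: AI = 100 * (antibody titre after urea wash) / (untreated titre).
    Low-avidity (acute-phase) antibodies are stripped by the urea wash,
    so acutely infected animals get a lower AI than long-term carriers.
    """
    return 100.0 * titre_with_urea / titre_without_urea

ai = avidity_index(8, 25)  # hypothetical titres; yields an AI comparable to the ~32% carrier mean above
```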
19
Simulation model estimates of test accuracy and predictive values for the Danish Salmonella surveillance program in dairy herds. Prev Vet Med 2006; 77:284-303. [PMID: 16979767 DOI: 10.1016/j.prevetmed.2006.08.001]
Abstract
The Danish government and cattle industry instituted a Salmonella surveillance program in October 2002 to help reduce Salmonella enterica subsp. enterica serotype Dublin (S. Dublin) infections. All dairy herds are tested by measuring antibodies in bulk tank milk at 3-month intervals. The program is based on a well-established ELISA, but the overall test program accuracy and misclassification was not previously investigated. We developed a model to simulate repeated bulk tank milk antibody measurements for dairy herds conditional on true infection status. The distributions of bulk tank milk antibody measurements for infected and noninfected herds were determined from field study data. Herd infection was defined as having either ≥1 Salmonella culture-positive fecal sample or ≥5% within-herd prevalence based on antibody measurements in serum or milk from individual animals. No distinction was made between Dublin and other Salmonella serotypes which cross-react in the ELISA. The simulation model was used to estimate the accuracy of herd classification for true herd-level prevalence values ranging from 0.02 to 0.5. Test program sensitivity was 0.95 across the range of prevalence values evaluated. Specificity was inversely related to prevalence and ranged from 0.83 to 0.98. For a true herd-level infection prevalence of 15%, the estimate for specificity (Sp) was 0.96. Also at the 15% herd-level prevalence, approximately 99% of herds classified as negative in the program would be truly noninfected and 80% of herds classified as positive would be infected. The predictive values were consistent with the primary goal of the surveillance program which was to have confidence that herds classified negative would be free of Salmonella infection.
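The predictive values quoted above follow directly from Bayes' rule given sensitivity, specificity and true prevalence. A minimal sketch, using the figures reported in the abstract (Se = 0.95, Sp = 0.96 at 15% prevalence):

```python
def predictive_values(se, sp, prev):
    """Herd-level PPV and NPV from test sensitivity, specificity and true prevalence."""
    ppv = se * prev / (se * prev + (1 - sp) * (1 - prev))
    npv = sp * (1 - prev) / (sp * (1 - prev) + (1 - se) * prev)
    return ppv, npv

# Figures reported in the abstract: Se = 0.95, Sp = 0.96, prevalence = 0.15
ppv, npv = predictive_values(0.95, 0.96, 0.15)
# ppv ≈ 0.81, consistent with the ~80% of test-positive herds truly infected
# npv ≈ 0.99, consistent with the ~99% of test-negative herds truly noninfected
```

Because specificity in the program varies with prevalence (0.83-0.98), the simulation approach described above is needed for the full picture; this closed form only reproduces the single 15%-prevalence snapshot.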
20
Reduced prevalence of early preterm delivery in women with Type 1 diabetes and microalbuminuria--possible effect of early antihypertensive treatment during pregnancy. Diabet Med 2006; 23:426-31. [PMID: 16620272 DOI: 10.1111/j.1464-5491.2006.01831.x]
Abstract
AIMS: In normotensive women with Type 1 diabetes and microalbuminuria we previously found preterm delivery (<34 weeks) in 23% of pregnancies. Antihypertensive treatment was initiated in late pregnancy when preeclampsia was diagnosed and diastolic blood pressure was >90 mmHg. From April 2000 our routine was changed, and early antihypertensive treatment with methyldopa was initiated if antihypertensive treatment had been given prior to pregnancy, if urinary albumin excretion (UAE) was >2 g/24 h, or if blood pressure was >140/90 mmHg. The present study describes the impact of this more aggressive antihypertensive treatment on the prevalence of preterm delivery.
METHODS: The old cohort (1995-1999) consisted of 26 and the new cohort (2000-2003) of 20 pregnant women with Type 1 diabetes and microalbuminuria. All were referred before gestational week 17.
RESULTS: The cohorts were comparable with regard to age, diabetes duration, prepregnancy body mass index, HbA1c, blood pressure [121 (13)/71 (8) vs. 121 (14)/73 (8) mmHg; mean (SD)] and early UAE [69 (16-278) vs. 74 (30-287) mg/24 h; geometric mean and range]. Antihypertensive treatment was initiated in the old cohort at 29 (20-33) weeks (n = 9) and in the new cohort at 13 (0-34) weeks (n = 10). The prevalence of preterm delivery before 34 weeks was reduced from 23% to zero (P = 0.02), preterm delivery before 37 weeks from 62% to 40% (P = 0.15), and preeclampsia from 42% to 20% (P = 0.11). Perinatal mortality occurred in 4% vs. 0%. Birth weight was 3124 (767) g vs. 3279 (663) g.
CONCLUSION: Introduction of early antihypertensive treatment with methyldopa in normotensive pregnant women with Type 1 diabetes and microalbuminuria resulted in a significant reduction in preterm delivery before gestational week 34.
21
Molecular differentiation within and among island populations of the endemic plant Scalesia affinis (Asteraceae) from the Galápagos Islands. Heredity (Edinb) 2005; 93:434-42. [PMID: 15280895 DOI: 10.1038/sj.hdy.6800520]
Abstract
Molecular variance was estimated in seven populations of the endemic species Scalesia affinis within and among islands of the Galápagos. The analysis, based on 157 polymorphic AFLP markers, revealed high differentiation among populations, most of which was partitioned among islands. In addition, the information content of the AFLP markers was tested with sets of discriminant analyses based on different numbers of markers, which indicated that the markers were highly informative in discriminating the populations. Although one of four populations from the island of Isabela was sampled from a volcano 100 km away from the remaining populations, this population resembled the others on Isabela. The partitioning of molecular variance (AFLP) resulted in two groups, one consisting of populations from Isabela and one of populations from Santa Cruz and Floreana. The differentiation in two chloroplast microsatellites was higher than for the AFLP markers and was equally partitioned among populations within islands as among islands. Thus, gene flow via fruits within islands is as limited as among islands. The lower within-island differentiation in the nuclear AFLP markers may thus indicate that gene flow within islands is mostly accounted for by pollen transfer. S. affinis is the only species in the genus that is not listed in the 2000 IUCN Red List of Threatened Species. However, due to prominent grazing and land exploitation, some populations have recently been reduced markedly, which was reflected in lower diversity. As inbreeding depression is present in the species, these rapid bottlenecks are threats to the populations.
22
Salmonella Dublin infection in dairy cattle: risk factors for becoming a carrier. Prev Vet Med 2004; 65:47-62. [PMID: 15454326 DOI: 10.1016/j.prevetmed.2004.06.010]
Abstract
Long-term Salmonella Dublin carrier animals harbor the pathogen in lymph nodes and internal organs and can periodically shed bacteria through feces or milk, and contribute to transmission of the pathogen within infected herds. Thus, it is of great interest to reduce the number of new carrier animals in cattle herds. An observational field study was performed to evaluate factors affecting the risk that dairy cattle become carrier animals after infection with Salmonella Dublin. Based on repeated sampling, cattle in 12 Danish dairy herds were categorized according to course of infection, as either carriers (n = 157) or transiently infected (n = 87). The infection date for each animal was estimated from fecal excretion and antibody responses. The relationship between the course of infection (carrier versus transiently infected) and risk factors were analyzed using a random effect multilevel, multivariable logistic regression model. The animals with the highest risk of becoming carriers were heifers infected between the age of 1 year and 1st calving, and cows infected around the time of calving. The risk was higher in the first two quarters of the year (late Winter to Spring), and when the prevalence of potential shedders in the herd was low. The risk also varied between herds. The herds with the highest risk of carrier development were herds with clinical disease outbreaks during the study period. These findings are useful for future control strategies against Salmonella Dublin, because they show the importance of optimized calving management and management of heifers, and because they show that even when the herd prevalence is low, carriers are still being produced. The results raise new questions about the development of the carrier state in cattle after infection with low doses of Salmonella Dublin.
23
Evaluation of an indirect serum ELISA and a bacteriological faecal culture test for diagnosis of Salmonella serotype Dublin in cattle using latent class models. J Appl Microbiol 2004; 96:311-9. [PMID: 14723692 DOI: 10.1046/j.1365-2672.2004.02151.x]
Abstract
AIMS: To evaluate a conventional bacteriological test based on faecal culture and an indirect serum ELISA for detection of S. Dublin-infected cattle, and to compare the predictive values of the two tests in relation to prevalence.
METHODS AND RESULTS: A total of 4531 paired samples from cattle in 29 dairy herds were analysed for the presence of S. Dublin bacteria in faeces and for immunoglobulins directed against S. Dublin lipopolysaccharide in an indirect serum ELISA. Sensitivity and specificity were estimated at two ELISA cut-off values using a validation method based on latent class models, which presumably provides less biased results than traditional validation methods. Stratification of the data into three age groups gave significantly better estimates of test performance of the ELISA. Receiver operating characteristic (ROC) curves were constructed to compare the overall performance of the ELISA between the three age groups. The sensitivity of the faecal culture test was low (6-14%). The ELISA appeared to have higher validity for animals aged 100-299 days than for older or younger animals. Overall, the negative predictive value of the ELISA was 2-10 times higher than that of the faecal culture test at realistic prevalences of infection in the test population.
CONCLUSIONS: The diagnostic sensitivity of the faecal culture test for detection of S. Dublin is poor; the specificity is 1. The superior sensitivity and negative predictive value of the serum ELISA make this test preferable to faecal culture as an initial screening test and for certification of herds not infected with S. Dublin.
SIGNIFICANCE AND IMPACT OF THE STUDY: A quantitative estimate of the sensitivity of a faecal culture test for S. Dublin in a general population was provided, and the ELISA was shown to be an appropriate alternative diagnostic test. Preferably, samples from animals aged 100-299 days should be used, as these give the best overall performance of the ELISA. Plots of ROC curves and predictive values in relation to prevalence facilitate optimisation of the ELISA cut-off value.
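One common way to optimise a cut-off from an ROC curve is Youden's J statistic (J = Se + Sp - 1). The sketch below applies it to hypothetical ELISA cut-offs; the paper may have weighed sensitivity and specificity differently (e.g. favouring negative predictive value for certification purposes).

```python
def best_cutoff(roc_points):
    """Pick the cut-off maximising Youden's J = Se + Sp - 1.

    roc_points: list of (cutoff, sensitivity, specificity) tuples,
    one per candidate ELISA cut-off value on the ROC curve.
    """
    return max(roc_points, key=lambda p: p[1] + p[2] - 1)

# Hypothetical ELISA cut-offs (illustration only, not the paper's data):
roc = [(10, 0.98, 0.70), (25, 0.92, 0.85), (40, 0.80, 0.96), (60, 0.65, 0.99)]
cutoff, se, sp = best_cutoff(roc)
```

For a screening/certification use-case like the one above, one might instead maximise NPV at the expected prevalence rather than J.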
24
[Statins. A new osteoporosis prophylaxis?]. Ugeskr Laeger 2001; 163:2007-9. [PMID: 11307362]
25
K-ras mutations in sinonasal adenocarcinomas in patients occupationally exposed to wood or leather dust. Cancer Lett 1998; 126:59-65. [PMID: 9563649 DOI: 10.1016/s0304-3835(97)00536-3]
Abstract
Of 39 males diagnosed with sinonasal adenocarcinomas over 30 years in the Lund University Hospital catchment area (1.5 million inhabitants), archival tumor tissue was available from 29. Of these, 16 had been exposed to wood dust and three had been exposed to leather dust. The intestinal-type and papillary adenocarcinomas were more common in the exposed patients (P = 0.0002, Fisher's exact test). The tumors from all but one of the 29 sinonasal adenocarcinomas could be analyzed for point mutations at codons 12, 13 and 61 of the K-ras gene. Four mutations were detected in the 28 tumors. The three mutations in the patients exposed to wood and leather dust were all G:C → A:T transitions, with two at position 2 of codon 12 and one at position 2 of codon 13. The high proportion of G:C → A:T mutations in this rare tumor may reflect a genotoxic agent in wood and leather dust.
26
Abstract
Somatic cell gene mutation arising in vivo may be considered a biomarker for genotoxicity. Assays detecting mutations of the haemoglobin and glycophorin A genes in red blood cells, and of the hypoxanthine-guanine phosphoribosyltransferase and human leucocyte antigen genes in T-lymphocytes, are available in humans. This MiniReview describes these assays and their application to studies of individuals exposed to genotoxic agents. Moreover, with the implementation of molecular biology techniques, mutation spectra can now be defined in addition to the quantitation of in vivo mutant frequencies. We describe current screening methods for unknown mutations, including denaturing gradient gel electrophoresis, single-strand conformation polymorphism analysis, heteroduplex analysis, chemical modification techniques and enzymatic cleavage methods. The advantage of mutation detection as a biomarker is that it integrates exposure and sensitivity in one measurement. With the analysis of mutation spectra it may thus be possible to identify the causative genotoxic agent.
27
Detection of the plasma cholinesterase K variant by PCR using an amplification-created restriction site. Hum Hered 1996; 46:26-31. [PMID: 8825459 DOI: 10.1159/000154321]
Abstract
Ten individuals registered at the Danish Cholinesterase Research Unit were examined at the DNA level for the presence of the K allele of plasma cholinesterase, using amplification-created restriction sites (ACRSs). A further nine members of a family registered at the unit were tested for mutations of the K and atypical variants. The frequency of the K allele was calculated from examination of normal material from 25 individuals, representing 50 random alleles. The results show that the ACRS method successfully demonstrates the presence of the K variant, whose frequency in the Danish population was found to be 0.18. We conclude that this technique is a reliable and rapid non-radioactive diagnostic assay for detecting the plasma cholinesterase K variant.
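The reported allele frequency follows directly from the allele counts. The sketch below back-calculates it (a frequency of 0.18 among 50 alleles implies 9 observed K alleles, an inference from the abstract) and adds, as an extra assumption not made in the paper, the expected K-heterozygote frequency under Hardy-Weinberg equilibrium.

```python
def allele_frequency(variant_count, total_alleles):
    """Allele frequency as the proportion of sampled chromosomes carrying the variant."""
    return variant_count / total_alleles

# 25 individuals = 50 random alleles; frequency 0.18 implies 9 K alleles observed.
p_k = allele_frequency(9, 50)

# ASSUMPTION: Hardy-Weinberg equilibrium (not asserted in the paper).
# Expected frequency of K heterozygotes = 2pq.
expected_het = 2 * p_k * (1 - p_k)
```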
28
Detection of ten new mutations by screening the gene encoding factor IX of Danish hemophilia B patients. Thromb Haemost 1995; 73:774-8. [PMID: 7482402]
Abstract
Hemophilia B is caused by a wide range of mutations. In order to characterize the mutations among patients in Denmark, we have systematically screened the entire coding region, the promoter region and exon flanking sequences of the gene encoding factor IX using single strand conformation and heteroduplex analyses. Patients from 32 different families were examined, and point mutations (23 different) were found in all of them. Ten of the mutations have not been reported by others; they include a splice site mutation, a single base pair deletion, and missense mutations. Notably, the study contains a female patient and a previously described Leyden mutation. In ten families with sporadic cases of hemophilia B, all 10 mothers were found to be carriers. The origin of two of these mutations was established.
29
Abstract
Glutathione peroxidase, one of the major antioxidants in the human brain, has been found to have decreased activity in patients suffering from multiple sclerosis (MS). This study compares the activity of lymphocyte glutathione peroxidase (L-GSH-px) in MS patients suffering from acute relapses with clinically stable MS patients and with control patients referred with nondemyelinating neurological diseases. All three groups showed an increase of mean enzymatic activity (MEA) during the observation period. The highest MEA in this study was observed in the MS groups. However, there were no significant differences in the L-GSH-px activity in the three groups. These results are not in accordance with previous investigations, and the need for further research in this field is emphasized.
30

31
Analysis of the decreased NK (natural killer) activity in lung cancer patients, using whole blood versus separated mononuclear cells. J Clin Lab Immunol 1989; 29:71-7. [PMID: 2632804]
Abstract
The aim of this study was to analyze whether a whole blood assay would give a more correct measure of NK activity than assays using separated mononuclear cells (SMNC). We found that the NK activity of whole blood was higher than the NK activity of SMNC in the 28 lung cancer patients investigated (p = 0.01), whereas this difference between the assays could not be demonstrated in the 29 healthy controls. Since no differences were found between the NK activity of washed blood, SMNC, and monocyte-depleted lymphoid cells, there was no indication that the lower NK activity of SMNC in comparison with whole blood was due to cell loss or to a systematic disturbing effect due to monocytes. The possible effect of plasma factors on the whole blood NK activity was analyzed by comparing whole blood and washed blood. The NK activity of whole blood was increased in comparison with washed blood in the lung cancer patients (p < 0.0001), indicating a stimulatory effect of plasma. Further, the finding that the reactive capability of lymphocytes from cancer patients was higher than in controls could indicate preactivation of the lymphocytes from the cancer patients due to the presence of stimulatory plasma factors. The NK activity of lung cancer patients was lower than the NK activity of healthy controls. The difference was found to be smaller with whole blood than with SMNC as effector cells, although both differences were significant. The decreased NK activity of cancer patients could be due to blocking immune complexes (IC), but we found no evidence for circulating or cell-bound IC in the lung cancer patients.
32
A polyclonal IgM-RF enzyme-linked immunosorbent assay for the detection of circulating immune complexes. J Clin Lab Immunol 1988; 26:195-200. [PMID: 3199429]
Abstract
A microplate-adapted polyclonal IgM-rheumatoid factor enzyme-linked immunosorbent assay (pIgM-RF ELISA) for the detection of circulating immune complexes (cIC) is presented. The assay involves the competitive binding of cIC and horseradish peroxidase conjugated aggregated human IgG (HRP-AHG) to solid-phase bound polyclonal IgM-RF (pIgM-RF). Aggregated human IgG (AHG) inhibited the binding of HRP-AHG to pIgM-RF in a dose-dependent way. The detection limit of the assay was about 125 ng AHG/ml diluted serum. The coefficients of variation for the assay varied from 5.0 to 14.7% for intra-assay runs and from 4.5 to 13.8% for inter-assay runs. The levels of cIC in sera from 29 patients with systemic lupus erythematosus (SLE), 85 untreated patients with breast cancer and 105 blood bank donors were studied by the pIgM-RF ELISA. Increased levels of cIC were demonstrated in 41.4% of the SLE group, in 8.2% of the breast cancer group, and in 1.9% of the normal control group. The difference in cIC activity between the SLE group and the normal control group was statistically significant.
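The intra- and inter-assay coefficients of variation quoted above are computed in the standard way (100 × sample SD / mean of replicate readings). A minimal sketch with hypothetical replicate values, not the study's data:

```python
import statistics

def coefficient_of_variation(values):
    """Assay CV as a percentage: 100 * sample standard deviation / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate ELISA absorbance readings from one intra-assay run:
intra_run = [0.42, 0.45, 0.40, 0.44]
cv = coefficient_of_variation(intra_run)  # falls within the 5.0-14.7% intra-assay range reported
```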