1. Quantitative microbial risk assessment for ingestion of antibiotic resistance genes from private wells contaminated by human and livestock fecal sources. Appl Environ Microbiol 2024; 90:e0162923. PMID: 38335112; PMCID: PMC10952444; DOI: 10.1128/aem.01629-23.
Abstract
We used quantitative microbial risk assessment to estimate ingestion risk for intI1, erm(B), sul1, tet(A), tet(W), and tet(X) in private wells contaminated by human and/or livestock feces. Genes were quantified along with five human-specific and six bovine-specific microbial source-tracking (MST) markers in 138 well-water samples from a rural Wisconsin county. Daily ingestion risk (probability of swallowing ≥1 gene) was based on daily water consumption and a Poisson exposure model. Calculations were stratified by MST source and by soil depth over the aquifer where wells were drilled. Relative ingestion risk was estimated using wells with no MST detections and >6.1 m soil depth as the referent category. Daily ingestion risk varied from 0 to 8.8 × 10^-1 by gene and fecal source (i.e., human or bovine). The estimated number of residents ingesting target genes from private wells varied from 910 (tet(A)) to 1,500 (intI1 and tet(X)) per day out of 12,000 total. Relative risk of tet(A) ingestion was significantly higher in wells with MST markers detected, including wells with ≤6.1 m soil depth contaminated by bovine markers (2.2 [90% CI: 1.1-4.7]), wells with >6.1 m soil depth contaminated by bovine markers (1.8 [1.002-3.9]), and wells with ≤6.1 m soil depth contaminated by bovine and human markers simultaneously (3.1 [1.7-6.5]). Antibiotic resistance genes (ARGs) were not necessarily present in viable microorganisms, and ingestion is not directly associated with infection. However, the results illustrate the relative contributions of human and livestock fecal sources to ARG exposure and highlight rural groundwater as a significant point of exposure. IMPORTANCE Antibiotic resistance is a global public health challenge with well-known environmental dimensions, but quantitative analyses of the roles played by various natural environments in transmission of antibiotic resistance are lacking, particularly for drinking water.
This study assesses risk of ingestion for several antibiotic resistance genes (ARGs) and the class 1 integron gene (intI1) in drinking water from private wells in a rural area of northeast Wisconsin, United States. Results allow comparison of drinking water as an exposure route for antibiotic resistance relative to other routes like food and recreational water. They also enable a comparison of the importance of human versus livestock fecal sources in the study area. Our study demonstrates the previously unrecognized importance of untreated rural drinking water as an exposure route for antibiotic resistance and identifies bovine fecal material as an important exposure factor in the study setting.
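The Poisson exposure model behind the daily ingestion risk described above can be sketched as follows; the concentration and intake values are hypothetical placeholders, not figures from the study:

```python
import math

def daily_ingestion_risk(conc_per_liter: float, intake_liters: float) -> float:
    """P(ingesting >= 1 gene copy in a day) under a Poisson exposure model.

    The expected daily dose is lambda = C * V; the Poisson probability of
    ingesting zero copies is exp(-lambda), so P(>=1) = 1 - exp(-lambda).
    """
    lam = conc_per_liter * intake_liters
    return 1.0 - math.exp(-lam)

# Illustrative (hypothetical) inputs: 0.5 gene copies/L, 1.1 L/day ingestion
risk = daily_ingestion_risk(0.5, 1.1)  # 1 - exp(-0.55) ~ 0.42
```
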
2. Rebuttal to Correspondence on "The Environmental Microbiology Minimum Information (EMMI) Guidelines: qPCR and dPCR Quality and Reporting for Environmental Microbiology". Environ Sci Technol 2023; 57:20450-20451. PMID: 37932026; DOI: 10.1021/acs.est.3c08824.
3. Community intervention trial for estimating risk of acute gastrointestinal illness from groundwater-supplied non-disinfected drinking water. J Water Health 2023; 21:1209-1227. PMID: 37756190; DOI: 10.2166/wh.2023.071.
Abstract
Through a community intervention trial in 14 non-disinfecting municipal water systems, we quantified sporadic acute gastrointestinal illness (AGI) attributable to groundwater. Ultraviolet (UV) disinfection was installed on all supply wells in intervention communities, while residents of control communities continued to drink non-disinfected groundwater. Intervention and control communities switched treatments by moving the UV disinfection units at the study midpoint (crossover design). Study participants (n = 1,659) completed weekly health diaries during four 12-week surveillance periods, and water supply wells were analyzed monthly for enteric pathogenic viruses. Under the crossover design, no groundwater-attributable AGI was observed. However, virus types and quantities in supply wells changed over the course of the study, suggesting that exposure was not constant. As an alternative analysis, we compared AGI incidence between intervention and control communities within the same surveillance period. During Period 1, norovirus contaminated the wells, and the AGI risk attributable to well water was 19% (95% CI: -4%, 36%) for children <5 years and 15% (95% CI: -9%, 33%) for adults. During Period 3, echovirus 11 contaminated the wells, and UV disinfection slightly reduced AGI in adults. Estimates of AGI risk attributable to drinking non-disinfected groundwater were highly variable but appeared greatest when supply wells were contaminated with specific AGI-etiologic viruses.
4. Addition to "The Environmental Microbiology Minimum Information (EMMI) Guidelines: qPCR and dPCR Quality and Reporting for Environmental Microbiology". Environ Sci Technol 2023; 57:12931. PMID: 37594384; DOI: 10.1021/acs.est.3c06441.
5. Microbial source tracking and land use associations for antibiotic resistance genes in private wells influenced by human and livestock fecal sources. J Environ Qual 2023; 52:270-286. PMID: 36479898; DOI: 10.1002/jeq2.20443.
Abstract
Antimicrobial resistance is a growing public health problem that requires an integrated approach among human, agricultural, and environmental sectors. However, few studies address all three components simultaneously. We investigated the occurrence of five antibiotic resistance genes (ARGs) and the class 1 integron gene (intI1) in private wells drawing water from a vulnerable aquifer influenced by residential septic systems and land-applied dairy manure. Samples (n = 138) were collected across four seasons from a randomized sample of private wells in Kewaunee County, Wisconsin. Measurements of ARGs and intI1 were related to microbial source tracking (MST) markers specific to human and bovine feces; they were also related to 54 risk factors for contamination representing land use, rainfall, hydrogeology, and well construction. ARGs and intI1 occurred in 5%-40% of samples depending on target. Detection frequencies for ARGs and intI1 were lowest in the absence of human and bovine MST markers (1%-30%), highest when co-occurring with human and bovine markers together (11%-78%), and intermediate when co-occurring with just one type of MST marker (4%-46%). Gene targets were associated with septic system density more often than agricultural land, potentially because of the variable presence of manure on the landscape. Determining ARG prevalence in a rural setting with mixed land use allowed an assessment of the relative contribution of human and bovine fecal sources. Because fecal sources co-occurred with ARGs at similar rates, interventions intended to reduce ARG occurrence may be most effective if both sources are considered.
6. Statewide Quantitative Microbial Risk Assessment for Waterborne Viruses, Bacteria, and Protozoa in Public Water Supply Wells in Minnesota. Environ Sci Technol 2022; 56:6315-6324. PMID: 35507527; PMCID: PMC9118547; DOI: 10.1021/acs.est.1c06472.
Abstract
Infection risk from waterborne pathogens can be estimated via quantitative microbial risk assessment (QMRA) and forms an important consideration in the management of public groundwater systems. However, few groundwater QMRAs use site-specific hazard identification and exposure assessment, so prevailing risks in these systems remain poorly defined. We estimated the infection risk for nine waterborne pathogens based on a 2-year pathogen occurrence study in which 964 water samples were collected from 145 public wells throughout Minnesota, USA. Annual risk across all nine pathogens combined was 3.3 × 10^-1 (95% CI: 2.3 × 10^-1 to 4.2 × 10^-1), 3.9 × 10^-2 (2.3 × 10^-2 to 5.4 × 10^-2), and 1.2 × 10^-1 (2.6 × 10^-2 to 2.7 × 10^-1) infections person^-1 year^-1 for noncommunity, nondisinfecting community, and disinfecting community wells, respectively. Risk estimates exceeded the U.S. benchmark of 10^-4 infections person^-1 year^-1 in 59% of well-years, indicating that the risk was widespread. While the annual risk for all pathogens combined was relatively high, the average daily doses for individual pathogens were low, indicating that significant risk results from sporadic pathogen exposure. Cryptosporidium dominated annual risk, so improved identification of wells susceptible to Cryptosporidium contamination may be important for risk mitigation.
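A combined-pathogen annual risk like the one reported above is conventionally obtained by aggregating per-pathogen risks under an independence assumption; a minimal sketch (the risk values below are hypothetical, not the study's estimates):

```python
def combined_annual_risk(per_pathogen_risks):
    """Overall annual infection risk across pathogens, assuming independent
    exposures: P_total = 1 - prod(1 - P_i)."""
    p_no_infection = 1.0
    for p in per_pathogen_risks:
        p_no_infection *= 1.0 - p
    return 1.0 - p_no_infection

def annualize(daily_risk, days=365):
    """Convert a constant per-day infection risk to an annual risk."""
    return 1.0 - (1.0 - daily_risk) ** days

# Hypothetical per-pathogen annual risks (not from the study):
combined = combined_annual_risk([0.02, 0.01, 0.005])  # ~0.035
```

The combined risk always exceeds the largest single-pathogen risk, which is why a benchmark like 10^-4 infections person^-1 year^-1 can be exceeded even when each individual pathogen contributes little.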
7. Fate and seasonality of antimicrobial resistance genes during full-scale anaerobic digestion of cattle manure across seven livestock production facilities. J Environ Qual 2022; 51:352-363. PMID: 35388483; DOI: 10.1002/jeq2.20350.
Abstract
Anaerobic digestion has been suggested as an intervention to attenuate antibiotic resistance genes (ARGs) in livestock manure but supporting data have typically been collected at laboratory scale. Few studies have quantified ARG fate during full-scale digestion of livestock manure. We sampled untreated manure and digestate from seven full-scale mesophilic dairy manure digesters to assess ARG fate through each system. Samples were collected biweekly from December through August (i.e., winter, spring, and summer; n = 235 total) and analyzed by quantitative polymerase chain reaction for intI1, erm(B), sul1, tet(A), and tet(W). Concentrations of intI1, sul1, and tet(A) decreased during anaerobic digestion, but their removal was less extensive than expected based on previous laboratory studies. Removal for intI1 during anaerobic digestion equaled 0.28 ± 0.03 log10 units (mean ± SE), equivalent to only 48% removal and notable given intI1's role in horizontal gene transfer and multiple resistance. Furthermore, tet(W) concentrations were unchanged during anaerobic digestion (p > 0.05), and erm(B) concentrations increased by 0.52 ± 0.03 log10 units (3.3-fold), which is important given erythromycin's status as a critically important antibiotic for human medicine. Seasonal log10 changes in intI1, sul1, and tet(A) concentrations were ≥50% of corresponding log10 removals by anaerobic digestion, and variation in ARG and intI1 concentrations among digesters was quantitatively comparable to anaerobic digestion effects. These results suggest that mesophilic anaerobic digestion may be limited as an intervention for ARGs in livestock manure and emphasize the need for multiple farm-level interventions to attenuate antibiotic resistance.
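The conversions between log10 changes, percent removal, and fold change used in the abstract above (e.g., 0.28 log10 ≈ 48% removal; 0.52 log10 ≈ 3.3-fold) follow directly from the definition of a log10 unit; a small sketch:

```python
def percent_removal_from_log10(delta_log10):
    """Percent removal implied by a log10 reduction: (1 - 10**-LRV) * 100."""
    return (1.0 - 10.0 ** (-delta_log10)) * 100.0

def fold_change_from_log10(delta_log10):
    """Fold change implied by a log10 gain: 10**delta."""
    return 10.0 ** delta_log10

# 0.28 log10 removal -> ~48% removal; 0.52 log10 increase -> ~3.3-fold
removal_pct = percent_removal_from_log10(0.28)
fold = fold_change_from_log10(0.52)
```
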
8. Optical Properties of Water for Prediction of Wastewater Contamination, Human-Associated Bacteria, and Fecal Indicator Bacteria in Surface Water at Three Watershed Scales. Environ Sci Technol 2021; 55:13770-13782. PMID: 34591452; DOI: 10.1021/acs.est.1c02644.
Abstract
Relations between spectral absorbance and fluorescence properties of water and human-associated and fecal indicator bacteria were developed to facilitate field-sensor estimation of wastewater contamination in waterways. Leaking wastewater conveyance infrastructure commonly contaminates receiving waters, and methods to quantify such contamination can be time consuming, expensive, and often nonspecific. Human-associated bacteria are wastewater specific but require discrete sampling and laboratory analyses, introducing latency. Human sewage has fluorescence and absorbance properties different from those of natural waters. To assist real-time field sensor development, this study investigated optical properties as surrogates for human-associated bacteria to estimate wastewater prevalence in environmental waters. Three spatial scales were studied: eight watershed-scale sites, five subwatershed-scale sites, and 213 storm sewers and open channels within three small watersheds (small-scale sites) were sampled (996 samples in total) for optical properties, human-associated bacteria, fecal indicator bacteria, and, for selected samples, human viruses. Regression analysis indicated that bacteria concentrations could be estimated at the watershed and subwatershed scales from optical properties measured by existing field sensors. Human virus occurrence increased with modeled human-associated bacteria concentration, providing confidence in these regressions as surrogates for wastewater contamination. Adequate regressions for reliably estimating bacteria concentrations were not found at the small-scale sites, likely due to inconsistent local sanitary sewer inputs.
9. The Environmental Microbiology Minimum Information (EMMI) Guidelines: qPCR and dPCR Quality and Reporting for Environmental Microbiology. Environ Sci Technol 2021; 55:10210-10223. PMID: 34286966; DOI: 10.1021/acs.est.1c01767.
Abstract
Real-time quantitative polymerase chain reaction (qPCR) and digital PCR (dPCR) methods have revolutionized environmental microbiology, yielding quantitative organism-specific data on nucleic acid targets in the environment. Such data are essential for characterizing interactions and processes of microbial communities, assessing microbial contaminants in the environment (water, air, fomites), and developing interventions (water treatment, surface disinfection, air purification) to curb infectious disease transmission. However, our review of recent qPCR and dPCR literature in our field of health-related environmental microbiology showed that many researchers are not reporting necessary and sufficient controls and methods, which would serve to strengthen their study results and conclusions. Here, we describe the application, utility, and interpretation of the suite of controls needed to make high-quality qPCR and dPCR measurements of microorganisms in the environment. Our presentation is organized by the discrete steps and operations typical of this measurement process. We propose systematic terminology to minimize ambiguity and aid comparisons among studies. Example schemes for batching and combining controls for efficient workflow are demonstrated. We describe critical reporting elements for enhancing data credibility, and we provide an element checklist in the Supporting Information. Additionally, we present several key principles in metrology as context for laboratories to devise their own quality assurance and quality control reporting framework. Following the EMMI guidelines will improve comparability and reproducibility among qPCR and dPCR studies in environmental microbiology, better inform engineering and public health actions for preventing disease transmission through environmental pathways, and, for the discipline's most pressing issues, focus the weight of evidence toward solutions.
11. Microbial pathogens and contaminants of emerging concern in groundwater at an urban subsurface stormwater infiltration site. Sci Total Environ 2021; 775:145738. PMID: 33631564; DOI: 10.1016/j.scitotenv.2021.145738.
Abstract
Urban stormwater may contain a variety of pollutants, including viruses and other pathogens, as well as contaminants of emerging concern (pharmaceuticals, artificial sweeteners, and personal care products). In vulnerable geologic settings, the potential exists for these contaminants to reach underlying aquifers and contaminate drinking water wells. Viruses, other pathogens, and contaminants of emerging concern were measured in stormwater and groundwater at an urban site containing a stormwater cistern and related subsurface infiltration gallery, three shallow lysimeter wells, and a monitoring well. Five of 12 microbial targets were detected more than once across the eight rounds of sampling and at multiple sampling points, with human-specific Bacteroides detected most frequently. Concentrations of the microbial and chemical contaminants present in urban stormwater were much lower in the water-table monitoring well than in the vadose-zone lysimeters. These reductions may have numerous causes, but they are most likely related to transit across the fine-grained sediments that separate the vadose zone from the water table at this location. Precipitation amount prior to sample collection was significantly associated with microbial load, and a significant relation between microbial load and the chloride-bromide ratio was also observed. The reduction in the number and concentrations of contaminants found in the monitoring well indicates that although geologically sensitive aquifers receiving urban stormwater effluent in the subsurface may be prone to contamination, those with a protective cap of fine-grained sediments are less vulnerable. These results can inform stormwater infiltration guidance relative to drinking water wells, with an emphasis on restricting infiltration near water supply wells finished in geologically sensitive aquifers to reduce public health risks.
12. Quantitative Microbial Risk Assessment for Contaminated Private Wells in the Fractured Dolomite Aquifer of Kewaunee County, Wisconsin. Environ Health Perspect 2021; 129:67003. PMID: 34160247; PMCID: PMC8221031; DOI: 10.1289/EHP7815.
Abstract
BACKGROUND: Private wells are an important source of drinking water in Kewaunee County, Wisconsin. Due to the region's fractured dolomite aquifer, these wells are vulnerable to contamination by human and zoonotic gastrointestinal pathogens originating from land-applied cattle manure and private septic systems.
OBJECTIVE: We determined the magnitude of the health burden associated with contamination of private wells in Kewaunee County by feces-borne gastrointestinal pathogens.
METHODS: This study used data from a year-long countywide pathogen occurrence study as inputs to a quantitative microbial risk assessment (QMRA) predicting the total cases of acute gastrointestinal illness (AGI) caused by private well contamination in the county. Microbial source tracking was used to associate predicted cases of illness with bovine, human, or unknown fecal sources.
RESULTS: Results suggest that private well contamination could be responsible for as many as 301 AGI cases per year in Kewaunee County, with 230 and 12 cases per year associated with bovine and human fecal sources, respectively. Cryptosporidium parvum was predicted to cause 190 cases per year, the most of the eight pathogens included in the QMRA.
DISCUSSION: This study has important implications for land use and water resource management in Kewaunee County and informs the public health impacts of consuming drinking water produced in similarly vulnerable hydrogeologic settings.
13. Sources and Risk Factors for Nitrate and Microbial Contamination of Private Household Wells in the Fractured Dolomite Aquifer of Northeastern Wisconsin. Environ Health Perspect 2021; 129:67004. PMID: 34160249; PMCID: PMC8221036; DOI: 10.1289/EHP7813.
Abstract
BACKGROUND: Groundwater quality in the Silurian dolomite aquifer of northeastern Wisconsin, USA, has become contentious as dairy farms and exurban development expand.
OBJECTIVES: We investigated private household wells in the region, determining the extent, sources, and risk factors of nitrate and microbial contamination.
METHODS: Total coliforms, Escherichia coli, and nitrate were evaluated by synoptic sampling during groundwater recharge and no-recharge periods. Additional seasonal sampling measured genetic markers of human and bovine fecal-associated microbes and enteric zoonotic pathogens. We constructed multivariable regression models of detection probability (log-binomial) and concentration (gamma) for each contaminant to identify risk factors related to land use, precipitation, hydrogeology, and well construction.
RESULTS: Total coliforms and nitrate were strongly associated with depth to bedrock at well sites and nearby agricultural land use, but not with septic systems. Both human wastewater and cattle manure contributed to well contamination. Rotavirus group A, Cryptosporidium, and Salmonella were the most frequently detected pathogens. Wells positive for human fecal markers were associated with depth to groundwater and the number of septic-system drainfields within 229 m. Manure-contaminated wells were associated with groundwater recharge and the area of nearby agricultural land. Wells positive for any fecal-associated microbe, regardless of source, were associated with septic-system density and manure-storage proximity modified by bedrock depth. Well construction was generally not related to contamination, indicating that land use, groundwater recharge, and bedrock depth were the most important risk factors.
DISCUSSION: These findings may inform policies to minimize contamination of the Silurian dolomite aquifer, a major water supply for the U.S. and Canadian Great Lakes region.
14. Viral, bacterial, and protozoan pathogens and fecal markers in wells supplying groundwater to public water systems in Minnesota, USA. Water Res 2020; 178:115814. PMID: 32325219; DOI: 10.1016/j.watres.2020.115814.
Abstract
Drinking water supply wells can be contaminated by a broad range of waterborne pathogens. However, groundwater assessments frequently measure microbial indicators or a single pathogen type, which provides a limited characterization of potential health risk. This study assessed contamination of wells by testing for viral, bacterial, and protozoan pathogens and fecal markers. Wells supplying groundwater to community and noncommunity public water systems in Minnesota, USA (n = 145) were sampled every other month over one or two years and tested using 23 qPCR assays. Eighteen genetic targets were detected at least once, and microbiological contamination was widespread (96% of 145 wells, 58% of 964 samples). The sewage-associated microbial indicators HF183 and pepper mild mottle virus were detected frequently. Human or zoonotic pathogens were detected in 70% of wells and 21% of samples by qPCR, with Salmonella and Cryptosporidium detected more often than viruses. Samples positive by qPCR for adenovirus (HAdV), enterovirus, or Salmonella were analyzed by culture and for genotype or serotype. qPCR-positive Giardia and Cryptosporidium samples were analyzed by immunofluorescent assay (IFA), and IFA and qPCR concentrations were correlated. Comparisons of indicator and pathogen occurrence at the time of sampling showed that total coliforms, HF183, and Bacteroidales-like HumM2 had high specificity and negative predictive values but generally low sensitivity and positive predictive values. Pathogen-HF183 ratios in sewage have been used to estimate health risks from HF183 concentrations in surface water, but in our groundwater samples Cryptosporidium oocyst:HF183 and HAdV:HF183 ratios were approximately 10,000 times higher than ratios reported for sewage. qPCR measurements provided a robust characterization of microbiological water quality, but interpretation of qPCR data in a regulatory context is challenging because few studies link qPCR measurements to health risk.
15. Septic Systems and Rainfall Influence Human Fecal Marker and Indicator Organism Occurrence in Private Wells in Southeastern Pennsylvania. Environ Sci Technol 2020; 54:3159-3168. PMID: 32073835; DOI: 10.1021/acs.est.9b05405.
Abstract
In the United States, approximately 48 million people are served by private wells. Unlike public water systems, private well water quality is not monitored, and there are few studies on the extent and sources of contamination of private wells. We extensively investigated five private wells to understand the variability in microbial contamination, the role of septic systems as sources of contamination, and the effect of rainfall on well water quality. From 2016 to 2017, weekly or biweekly samples (n = 105) were collected from five private wells in rural Pennsylvania. Samples were tested for general water quality parameters, conventional and sewage-associated microbial indicators, and human pathogens. Total coliforms, human Bacteroides (HF183), and pepper mild mottle virus were detected at least once in all wells. Regression revealed significant relationships between HF183 and rainfall 8-14 days prior to sampling and between total coliforms and rainfall 8-14 or 0-14 days prior to sampling. Dye tracer studies at three wells confirmed the impact of household septic systems on well contamination. Microbiological measurements, chemical water quality data, and dye tracer tests provide evidence of human fecal contamination in the private wells studied, suggesting that household septic systems are the source of this contamination.
16. Confirming the need for virus disinfection in municipal subsurface drinking water supplies. Water Res 2019; 157:356-364. PMID: 30970285; DOI: 10.1016/j.watres.2019.03.057.
Abstract
Enteric viruses pose the greatest acute human health risks associated with subsurface drinking water supplies, yet quantitative risk assessment tools have rarely been used to develop health-based targets for virus treatment in drinking water sourced from these supplies. Such efforts have previously been hampered by a lack of consensus concerning a suitable viral reference pathogen and dose-response model, as well as by difficulties in quantifying pathogenic viruses in water. A reverse quantitative microbial risk assessment (QMRA) framework and quantitative polymerase chain reaction data for norovirus genogroup I in subsurface drinking water supplies were used herein to evaluate treatment needs for such supplies. Norovirus was not detected in over 90% of samples, which underscores the need to consider the spatially and/or temporally intermittent patterns of enteric pathogen contamination in subsurface water supplies. Collectively, this analysis reinforces existing recommendations that a minimum 4-log treatment goal is needed for enteric viruses in groundwater in the absence of well-specific monitoring information. This result is sensitive to the virus dose-response model used, as there is approximately a 3-log discrepancy among virus dose-response models in the existing literature, which emphasizes the need to address the uncertainties and lack of consensus surrounding QMRA modelling approaches and the analytical limitations that preclude a more accurate description of virus risks.
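A reverse QMRA of the kind described works backward from an annual risk target to a required log10 treatment level. The sketch below assumes an exponential dose-response model and uses hypothetical parameter values (the dose-response parameter k, source concentration, and intake are illustrative, not the study's); as the abstract notes, the choice of dose-response model alone can shift the answer by roughly 3 logs:

```python
import math

def required_log_reduction(source_conc_per_liter, intake_liters, k,
                           annual_risk_target=1e-4, days=365):
    """Log10 virus reduction needed to meet an annual infection-risk target.

    Reverse QMRA: annual target -> equivalent per-day risk -> tolerable
    daily dose under an exponential dose-response model P = 1 - exp(-k*d)
    -> log10 ratio of the untreated daily dose to the tolerable dose.
    """
    daily_target = 1.0 - (1.0 - annual_risk_target) ** (1.0 / days)
    tolerable_dose = -math.log(1.0 - daily_target) / k
    untreated_dose = source_conc_per_liter * intake_liters
    return math.log10(untreated_dose / tolerable_dose)

# Hypothetical inputs: 1 infectious unit/L source water, 1 L/day ingestion,
# k = 0.5 (illustrative dose-response parameter)
lrv = required_log_reduction(1.0, 1.0, 0.5)  # ~6.3 log10
```

Higher source concentrations or a stricter risk target increase the required log reduction, which is how monitoring data (or their absence) feed into treatment goals like the 4-log recommendation.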
17. Efficacy of recycled sand or organic solids as bedding sources for lactating cows housed in freestalls. J Dairy Sci 2019; 102:6682-6698. [PMID: 31128869] [DOI: 10.3168/jds.2018-15851]
Abstract
Our objective was to compare the composition of bedding materials and manure, cow welfare and hygiene assessments, measures of milk production and quality, and incidence of mastitis during a 3-yr trial with lactating Holstein cows housed in a freestall barn containing 4 identical pens with 32 freestalls/pen. Bedding systems evaluated included deep-bedded organic manure solids (DBOS), shallow-bedded manure solids spread over mattresses (MAT), deep-bedded recycled sand (RSA), and deep-bedded new sand (NSA). The experiment was designed as a 4 × 4 Latin square with 4 bedding systems and 4 experimental periods, but was terminated after 3 yr following discussions with the consulting statistician; therefore, data were analyzed as an incomplete Latin square. A total of n = 734 mostly primiparous cows (n = 725 primiparous, n = 9 multiparous; 224 to 267 cows/yr) were enrolled in the trial. Before placement in freestalls, organic solids (OS) exhibited lower concentrations of dry matter (36.5 vs. 94.3%), and greater concentrations of volatile solids, C, N, NH4-N, P, water-extractable P, K, and S compared with RSA or NSA. Cow comfort index was greater for sand-bedded systems compared with those using OS (88.4 vs. 82.8%). Cows bedded in systems using OS (DBOS and MAT) exhibited greater mean hock scores (1 = no swelling, no hair loss; 2 = no swelling, bald area on hock) than those bedded in sand (1.25 vs. 1.04), but this effect was entirely associated with use of mattresses (MAT), which differed sharply from DBOS (1.42 vs. 1.07). Generally, hygiene scores for legs, flanks, and udders were numerically similar for DBOS, NSA, and RSA bedding systems, and differences between bedding systems were associated entirely with MAT, yielding detectable contrasts between MAT and DBOS for legs (2.94 vs. 2.20), flanks (2.34 vs. 1.68), and udders (1.83 vs. 1.38). No significant contrast comparing bedding systems was detected for measures of milk production or quality. 
Documented cases of clinical mastitis requiring treatment ranged from a low rate of 7.4 cases/yr for RSA to a high of 23.1 cases/yr for DBOS, based on a mean enrollment of 60.7 to 63.0 cows/treatment per yr. Cows bedded with OS exhibited a greater incidence of mastitis than those bedded with sand (19.0 vs. 8.4 cases/yr), but no differences were observed for comparisons within individual bedding-material types. Collectively, these results generally favored use of sand-bedding materials over systems using OS.
18. Cryptosporidium Incidence and Surface Water Influence of Groundwater Supplying Public Water Systems in Minnesota, USA. ENVIRONMENTAL SCIENCE & TECHNOLOGY 2019; 53:3391-3398. [PMID: 30895775] [DOI: 10.1021/acs.est.8b05446]
Abstract
Regulations for public water systems (PWS) in the U.S. consider Cryptosporidium a microbial contaminant of surface water supplies. Groundwater is assumed free of Cryptosporidium unless surface water is entering supply wells. We determined the incidence of Cryptosporidium in PWS wells varying in surface water influence. Community and noncommunity PWS wells (n = 145) were sampled (n = 964) and analyzed for Cryptosporidium by qPCR and immunofluorescence assay (IFA). Surface water influence was assessed by stable isotopes and the expert judgment of hydrogeologists using site-specific data. Fifty-eight wells (40%) and 107 samples (11%) were Cryptosporidium-positive by qPCR, and of these samples 67 were positive by IFA. Cryptosporidium concentrations measured by qPCR and IFA were significantly correlated (p < 0.001). Cryptosporidium incidence was not associated with surface water influence as assessed by stable isotopes or expert judgment. We successfully sequenced 45 of the 107 positive samples to identify species, including C. parvum (41), C. andersoni (2), and C. hominis (2), and the predominant subtype was C. parvum IIa A17G2R1. Assuming USA regulations for surface water-supplied PWS were applicable to the study wells, wells positive for Cryptosporidium by IFA would likely be required to add treatment. Cryptosporidium is not uncommon in groundwater, even when surface water influence is absent.
19. Automated Time Series Measurement of Microbial Concentrations in Groundwater-Derived Water Supplies. GROUND WATER 2019; 57:329-336. [PMID: 30155887] [PMCID: PMC7379695] [DOI: 10.1111/gwat.12822]
Abstract
Fecal contamination by human and animal pathogens, including viruses, bacteria, and protozoa, is a potential human health hazard, especially with regard to drinking water. Pathogen occurrence in groundwater varies considerably in space and time, which can be difficult to characterize, as sampling typically requires hundreds of liters of water to be passed through a filter. Here we describe the design and deployment of an automated sampler suited for hydrogeologically and chemically dynamic groundwater systems. Our design focused on a compact form to facilitate transport and quick deployment to municipal and domestic water supplies. We deployed a sampler to characterize water quality from a household well tapping a shallow fractured dolomite aquifer in northeast Wisconsin. The sampler was deployed from January to April 2017, and monitored temperature, nitrate, chloride, specific conductance, and fluorescent dissolved organic matter on a minute time step; water was directed to sequential microbial filters during three recharge periods that ranged from 5 to 20 days. Results from the automated sampler demonstrate the dynamic nature of the household water quality, especially with regard to microbial targets, which were shown to vary 1 to 2 orders of magnitude during a single sampling event. We believe assessments of pathogen occurrence and concentration, and related assessments of drinking well vulnerability, would be improved by the time-integrated characterization provided by this sampler.
20. Human-Associated Indicator Bacteria and Human-Specific Viruses in Surface Water: A Spatial Assessment with Implications on Fate and Transport. ENVIRONMENTAL SCIENCE & TECHNOLOGY 2018; 52:12162-12171. [PMID: 30991470] [DOI: 10.1021/acs.est.8b03481]
Abstract
Hydrologic, seasonal, and spatial variability of sewage contamination was studied at six locations within a watershed upstream from water reclamation facility (WRF) effluent to define relative loadings of sewage from different portions of the watershed. Fecal pollution from human sources was spatially quantified by measuring two human-associated indicator bacteria (HIB) and eight human-specific viruses (HSV) at six stream locations in the Menomonee River watershed in Milwaukee, Wisconsin, from April 2009 to March 2011. A custom, automated water sampler, which included HSV filtration, was deployed at each location and provided unattended, flow-weighted, large-volume (30-913 L) sampling. In addition, wastewater influent samples were composited over discrete 7 day periods from the two Milwaukee WRFs. Of the 8 HSV, only 3 were detected, present in up to 38% of the 228 stream samples, while at least 1 HSV was detected in all WRF influent samples. HIB occurred more often with significantly higher concentrations than the HSV in stream and WRF influent samples (p < 0.05). HSV yield calculations showed a loss from upstream to the most-downstream sub-watershed of the Menomonee River, and in contrast, a positive HIB yield from this same sub-watershed emphasizes the complexity in fate and transport properties between HSV and HIB. This study demonstrates the utility of analyzing multiple HSV and HIB to provide a weight-of-evidence approach for assessment of fecal contamination at the watershed level, provides an assessment of relative loadings for prioritizing areas within a watershed, and demonstrates how loadings of HSV and HIB can be inconsistent, suggesting potential differences in fate and transport between the two indicators of human fecal presence.
21. Voluntary intake and digestibility by sheep of alfalfa ensiled at different moisture concentrations following fertilization with dairy slurry. J Anim Sci 2018; 96:964-974. [PMID: 29401268] [DOI: 10.1093/jas/skx061]
Abstract
Dairy slurry is used commonly as an animal-sourced fertilizer in agronomic production. However, residual effects of slurry application on intake and digestibility of alfalfa (Medicago sativa L.) silage from subsequent harvests are not well known. The objective of this study was to determine if moisture concentration of alfalfa silage and timing of dairy slurry application relative to subsequent harvest affected intake and digestibility by sheep. Katahdin crossbred ewes (n = 18; 48 ± 5.3 kg) in mid-gestation were stratified by BW and allocated randomly in each of two periods to one of six treatments arranged in a 2 × 3 factorial arrangement. Treatments consisted of recommended (RM; 46.8%) or low (LM; 39.7%) moisture at baling after either no slurry application (NS), slurry application to stubble immediately after removal of the previous cutting (S0), or slurry application 14 d after removal of the previous cutting (S14). Silages were chopped through a commercial straw chopper, packed into plastic trash cans, and then offered to ewes within 4 d of chopping. Period 1 of the intake and digestion study consisted of a 14-d adaptation followed by a 7-d fecal collection period. Period 2 followed period 1 after a 4-d rest and consisted of an 11-d adaptation followed by 7 d of fecal collection. Ewes were housed individually in 1.4 × 4.3-m pens equipped with rubber mat flooring. Feces were swept from the floor twice daily, weighed, and dried at 50 °C. Ewes had ad libitum access to water and were offered chopped silage for a minimum of 10% refusal (DM). Blood samples were collected immediately prior to feeding, and 4 and 8 h after feeding on the day prior to the end of each period. Organic matter intake (g/kg BW) and OM digestibility tended (P < 0.10) to be, and digestible OM intake (g/kg BW) was, reduced by slurry application. Lymphocytes (% of total white blood cells) were greater (P < 0.05) from LM vs. RM and from NS vs. S0 and S14.
Red blood cell concentrations were greater (P < 0.05) from S14 vs. S0 and from S0 and S14 vs. NS. Serum urea N concentrations did not differ (P > 0.17) across treatments. Therefore, moisture concentration of alfalfa silage within the range used in this study may not affect voluntary intake or digestibility, but slurry application may have an effect on digestible OM intake. Also, moisture concentration of alfalfa silage and time of dairy slurry application may affect specific blood hemograms.
22. Using Integrated Environmental Modeling to Assess Sources of Microbial Contamination in Mixed-Use Watersheds. JOURNAL OF ENVIRONMENTAL QUALITY 2018; 47:1103-1114. [PMID: 30272785] [PMCID: PMC6545896] [DOI: 10.2134/jeq2018.02.0071]
Abstract
Microbial fate and transport in watersheds should include a microbial source apportionment analysis that estimates the importance of each source, relative to each other and in combination, by capturing their impacts spatially and temporally under various scenarios. A loosely configured software infrastructure was used in microbial source-to-receptor modeling by focusing on animal- and human-impacted mixed-use watersheds. Components include data collection software, a microbial source module that determines loading rates from different sources, a watershed model, an inverse model for calibrating flows and microbial densities, tabular and graphical viewers, software to convert output to different formats, and a model for calculating risk from pathogen exposure. The system automates, as much as possible, the manual process of accessing and retrieving data and completes input data files of the models. The workflow considers land-applied manure from domestic animals on undeveloped areas; direct shedding (excretion) on undeveloped lands by domestic animals and wildlife; pastureland, cropland, forest, and urban or engineered areas; sources that directly release to streams from leaking septic systems; and shedding by domestic animals directly to streams. The infrastructure also considers point sources from regulated discharges. An application is presented on a real-world watershed and helps answer questions such as: What are the major microbial sources? What practices contribute to contamination at the receptor location? What land-use types influence contamination at the receptor location? and Under what conditions do these sources manifest themselves? This research aims to improve our understanding of processes related to pathogen and indicator dynamics in mixed-use watershed systems.
23. Fate of Manure-Borne Pathogens during Anaerobic Digestion and Solids Separation. JOURNAL OF ENVIRONMENTAL QUALITY 2018; 47:336-344. [PMID: 29634802] [PMCID: PMC7166490] [DOI: 10.2134/jeq2017.07.0285]
Abstract
Anaerobic digestion can inactivate zoonotic pathogens present in cattle manure, which reduces transmission of these pathogens from farms to humans through the environment. However, the variability of inactivation across farms and over time is unknown because most studies have examined pathogen inactivation under ideal laboratory conditions or have focused on only one or two full-scale digesters at a time. In contrast, we sampled seven full-scale digesters treating cattle manure in Wisconsin for 9 mo on a biweekly basis (n = 118 pairs of influent and effluent samples) and used real-time quantitative polymerase chain reaction to analyze these samples for 19 different microbial genetic markers. Overall, inactivation of pathogens and fecal indicators was highly variable. When aggregated across digester and season, log-removal values for several representative microorganisms (bovine Bacteroides, Bacteroidales-like CowM3, and bovine polyomavirus) were 0.78 ± 0.34, 0.70 ± 0.50, and 0.53 ± 0.58, respectively (mean ± SD). These log-removal values were up to two times lower than expected based on the scientific literature. Thus, our study indicates that full-scale anaerobic digestion of cattle manure requires optimization with regard to pathogen inactivation. Future studies should focus on identifying the potential causes of this suboptimal performance (e.g., overloading, poor mixing, poor temperature control). Our study also examined the fate of pathogens during manure separation and found that the majority of microbes we detected ended up in the liquid fraction of separated manure. This finding has important implications for the transmission of zoonotic pathogens through the environment to humans.
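The log-removal calculation behind the values above can be sketched simply: for each paired influent/effluent qPCR measurement, LRV = log10(influent) − log10(effluent), then summarize as mean ± SD across pairs. The concentrations below are invented for illustration; the study's data are not reproduced here.

```python
import math
import statistics

# Hypothetical paired digester measurements (gene copies per mL): (influent, effluent).
pairs = [(1e5, 2e4), (5e4, 8e3), (2e5, 3e4)]

def log_removal(pairs):
    """Log-removal value for each influent/effluent pair."""
    return [math.log10(inf) - math.log10(eff) for inf, eff in pairs]

lrvs = log_removal(pairs)
print(round(statistics.mean(lrvs), 2), round(statistics.stdev(lrvs), 2))
```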
24. Sewage loading and microbial risk in urban waters of the Great Lakes. ELEMENTA (WASHINGTON, D.C.) 2018; 6:46. [PMID: 30393748] [PMCID: PMC6211557] [DOI: 10.1525/elementa.301]
Abstract
Despite modern sewer system infrastructure, the release of sewage from deteriorating pipes and sewer overflows is a major water pollution problem in US cities, particularly in coastal watersheds that are highly developed with large human populations. We quantified fecal pollution sources and loads entering Lake Michigan from a large watershed of mixed land use using host-associated indicators. Wastewater treatment plant influent had stable concentrations of human Bacteroides and human Lachnospiraceae with geometric mean concentrations of 2.77 × 10⁷ and 5.94 × 10⁷ copy number (by quantitative PCR) per 100 ml, respectively. Human-associated indicator levels were four orders of magnitude higher than norovirus concentrations, suggesting that these human-associated bacteria could be sensitive indicators of pathogen risk. Norovirus concentrations in these same samples were used in calculations for quantitative microbial risk assessment. Assuming a typical recreational exposure to untreated sewage in water, concentrations of 7,800 copy number of human Bacteroides per 100 mL or 14,000 copy number of human Lachnospiraceae per 100 mL corresponded to an illness risk of 0.03. These levels were exceeded in estuarine waters during storm events with greater than 5 cm of rainfall. Following overflows from combined sewer systems (which must accommodate both sewage and stormwater), concentrations were 10-fold higher than under rainfall conditions. Automated high frequency sampling allowed for loads of human-associated markers to be determined, which could then be related back to equivalent volumes of untreated sewage that were released. Evidence of sewage contamination decreased as ruminant-associated indicators increased approximately one day post-storm, demonstrating the delayed impact of upstream agricultural sources on the estuary.
These results demonstrate that urban areas are a diffuse source of sewage contamination to urban waters and that storm-driven release of sewage, particularly when sewage overflows occur, creates a serious though transient human health risk.
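The marker-to-pathogen step described above rests on a dilution argument: because the sewage concentration of a human marker is stable, a marker measurement in surface water implies a sewage fraction, from which a norovirus exposure dose follows. The sketch below uses the reported sewage geomean for human Bacteroides (2.77 × 10⁷ copies/100 mL); the norovirus sewage concentration and ingestion volume are illustrative assumptions, not the study's values.

```python
# Sewage-dilution dose sketch; only SEWAGE_HB comes from the text above.
SEWAGE_HB = 2.77e7     # human Bacteroides copies per 100 mL of sewage influent (from abstract)
NORO_SEWAGE = 1e3      # assumed norovirus copies per 100 mL of sewage (illustrative)

def noro_dose(marker_per_100ml, ingested_ml=30.0):
    """Estimated norovirus copies ingested, given a marker measurement in recreational water."""
    sewage_fraction = marker_per_100ml / SEWAGE_HB      # implied fraction of sewage present
    return sewage_fraction * NORO_SEWAGE * (ingested_ml / 100.0)

# e.g. the marker level reported above as corresponding to a 0.03 illness risk:
print(f"{noro_dose(7800):.2e}")
```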
25. Capturing Microbial Sources Distributed in a Mixed-use Watershed within an Integrated Environmental Modeling Workflow. ENVIRONMENTAL MODELLING & SOFTWARE: WITH ENVIRONMENT DATA NEWS 2018; 99:126-146. [PMID: 30078989] [PMCID: PMC6069999] [DOI: 10.1016/j.envsoft.2017.08.002]
Abstract
Many watershed models simulate overland and instream microbial fate and transport, but few provide loading rates on land surfaces and point sources to the waterbody network. This paper describes the underlying equations for microbial loading rates associated with 1) land-applied manure on undeveloped areas from domestic animals; 2) direct shedding (excretion) on undeveloped lands by domestic animals and wildlife; 3) urban or engineered areas; and 4) point sources that directly discharge to streams from septic systems and shedding by domestic animals. A microbial source module, which houses these formulations, is part of a workflow containing multiple models and databases that form a loosely configured modeling infrastructure which supports watershed-scale microbial source-to-receptor modeling by focusing on animal- and human-impacted catchments. A hypothetical application - accessing, retrieving, and using real-world data - demonstrates how the infrastructure can automate many of the manual steps associated with a standard watershed assessment, culminating in calibrated flow and microbial densities at the watershed's pour point.
26. Quantitative Microbial Risk Assessment for Spray Irrigation of Dairy Manure Based on an Empirical Fate and Transport Model. ENVIRONMENTAL HEALTH PERSPECTIVES 2017; 125:087009. [PMID: 28885976] [PMCID: PMC5884668] [DOI: 10.1289/ehp283]
Abstract
BACKGROUND Spray irrigation for land-applying livestock manure is increasing in the United States as farms become larger and economies of scale make manure irrigation affordable. Human health risks from exposure to zoonotic pathogens aerosolized during manure irrigation are not well understood. OBJECTIVES We aimed to a) estimate human health risks due to aerosolized zoonotic pathogens downwind of spray-irrigated dairy manure; and b) determine which factors (e.g., distance, weather conditions) have the greatest influence on risk estimates. METHODS We sampled downwind air concentrations of manure-borne fecal indicators and zoonotic pathogens during 21 full-scale dairy manure irrigation events at three farms. We fit these data to hierarchical empirical models and used model outputs in a quantitative microbial risk assessment (QMRA) to estimate risk [probability of acute gastrointestinal illness (AGI)] for individuals exposed to spray-irrigated dairy manure containing Campylobacter jejuni, enterohemorrhagic Escherichia coli (EHEC), or Salmonella spp. RESULTS Median risk estimates from Monte Carlo simulations ranged from 10⁻⁵ to 10⁻² and decreased with distance from the source. Risk estimates for Salmonella or EHEC-related AGI were most sensitive to the assumed level of pathogen prevalence in dairy manure, while risk estimates for C. jejuni were not sensitive to any single variable. Airborne microbe concentrations were negatively associated with distance and positively associated with wind speed, both of which were retained in models as a significant predictor more often than relative humidity, solar irradiation, or temperature. CONCLUSIONS Our model-based estimates suggest that reducing pathogen prevalence and concentration in source manure would reduce the risk of AGI from exposure to manure irrigation, and that increasing the distance from irrigated manure (i.e., setbacks) and limiting irrigation to times of low wind speed may also reduce risk.
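A minimal Monte Carlo QMRA of the kind used above can be sketched as follows, including the sensitivity to assumed pathogen prevalence that the results highlight: when most manure batches contain no pathogen, the median risk is zero, and it becomes positive only as prevalence rises. The dose distribution, prevalence values, and dose-response parameter r are all invented for illustration, not the study's fitted models.

```python
import math
import random

random.seed(1)  # reproducible illustration

def simulate_risk(n=10000, prevalence=0.3, r=0.011):
    """Median per-event infection risk from a toy Monte Carlo QMRA (assumed parameters)."""
    risks = []
    for _ in range(n):
        if random.random() > prevalence:          # this manure batch carries no pathogen
            risks.append(0.0)
            continue
        log10_dose = random.gauss(-1.0, 1.0)      # assumed lognormal exposure dose
        dose = 10.0 ** log10_dose
        risks.append(1.0 - math.exp(-r * dose))   # exponential dose-response model
    risks.sort()
    return risks[n // 2]                          # median (approximate for even n)

# Median risk at low vs. high assumed prevalence:
print(simulate_risk(prevalence=0.3), simulate_risk(prevalence=0.9))
```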
27. Human virus and microbial indicator occurrence in public-supply groundwater systems: meta-analysis of 12 international studies. HYDROGEOLOGY JOURNAL 2017; 25:903-919. [PMID: 30245581] [PMCID: PMC6145489] [DOI: 10.1007/s10040-017-1581-5]
Abstract
Groundwater quality is often evaluated using microbial indicators. This study examines data from 12 international groundwater studies (conducted 1992-2013) of 718 public drinking-water systems located in a range of hydrogeological settings. Focus was on testing the value of indicator organisms for identifying virus-contaminated wells. One or more indicators and viruses were present in 37 and 15% of 2,273 samples and 44 and 27% of 746 wells, respectively. Escherichia coli (E. coli) and somatic coliphage are 7-9 times more likely to be associated with culturable virus-positive samples when the indicator is present versus when it is absent, while F-specific and somatic coliphages are 8-9 times more likely to be associated with culturable virus-positive wells. However, single indicators are only marginally associated with viruses detected by molecular methods, and all microbial indicators have low sensitivity and positive predictive values for virus occurrence, whether by culturable or molecular assays, i.e., indicators are often absent when viruses are present and the indicators have a high false-positive rate. Wells were divided into three susceptibility subsets based on presence of (1) total coliform bacteria or (2) multiple indicators, or (3) location of wells in karst, fractured bedrock, or gravel/cobble settings. Better associations of some indicators with viruses were observed for (1) and (3). Findings indicate the best indicators are E. coli or somatic coliphage, although both indicators may underestimate virus occurrence. Repeat sampling for indicators improves evaluation of the potential for viral contamination in a well.
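The indicator-performance measures discussed above (sensitivity, positive predictive value, and the "times more likely" ratio) all come from a 2×2 cross-classification of wells by indicator and virus detection. The sketch below shows the arithmetic with hypothetical counts; the study's actual well counts are not reproduced here.

```python
def indicator_metrics(tp, fp, fn, tn):
    """Performance of a fecal indicator as a predictor of virus occurrence.
    tp: indicator+ virus+, fp: indicator+ virus-, fn: indicator- virus+, tn: indicator- virus-."""
    sensitivity = tp / (tp + fn)                    # P(indicator+ | virus+)
    ppv = tp / (tp + fp)                            # P(virus+ | indicator+)
    # virus positivity rate with the indicator present vs. absent
    ratio = (tp / (tp + fp)) / (fn / (fn + tn))
    return sensitivity, ppv, ratio

# Hypothetical counts illustrating low sensitivity/PPV but an elevated ratio:
sens, ppv, ratio = indicator_metrics(tp=40, fp=120, fn=60, tn=526)
print(round(sens, 2), round(ppv, 2), round(ratio, 1))
```

With these invented counts the indicator misses most virus-positive wells (low sensitivity) and is usually a false alarm (low PPV), yet virus positivity is still a few times higher where the indicator is present, the same qualitative pattern the meta-analysis reports.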
28. Hydrologic, land cover, and seasonal patterns of waterborne pathogens in Great Lakes tributaries. WATER RESEARCH 2017; 113:11-21. [PMID: 28187346] [PMCID: PMC7126339] [DOI: 10.1016/j.watres.2017.01.060]
Abstract
Great Lakes tributaries are known to deliver waterborne pathogens from a host of sources. To examine the hydrologic, land cover, and seasonal patterns of waterborne pathogens (i.e., protozoa (2), pathogenic bacteria (4), human viruses (8), and bovine viruses (8)), eight rivers were monitored in the Great Lakes Basin over 29 months from February 2011 to June 2013. Sampling locations represented a wide variety of land cover classes from urban to agriculture to forest. A custom automated pathogen sampler was deployed at eight sampling locations which provided unattended, flow-weighted, large-volume (120-1630 L) sampling. Human and bovine viruses and pathogenic bacteria were detected by real-time qPCR in 16%, 14%, and 1.4% of 290 samples collected while protozoa were never detected. The most frequently detected pathogens were bovine polyomavirus (11%) and human adenovirus C, D, F (9%). Human and bovine viruses were present in 16.9% and 14.8% of runoff-event samples (n = 189) resulting from precipitation and snowmelt, and 13.9% and 12.9% of low-flow samples (n = 101), respectively, indicating multiple delivery mechanisms could be influential. Data indicated human and bovine virus prevalence was different depending on land cover within the watershed. Occurrence, concentration, and flux of human viruses were greatest in samples from the three sampling locations with greater than 25% urban influence than those with less than 25% urban influence. Similarly, occurrence, concentration, and flux of bovine viruses were greatest in samples from the two sampling locations with greater than 50 cattle/km2 than those with less than 50 cattle/km2. In seasonal analysis, human and bovine viruses occurred more frequently in spring and winter seasons than during the fall and summer.
Concentration, occurrence, and flux in the context of hydrologic condition, seasonality, and land use must be considered for each watershed individually to develop effective watershed management strategies for pathogen reduction.
29. Detection of hepatitis E virus and other livestock-related pathogens in Iowa streams. THE SCIENCE OF THE TOTAL ENVIRONMENT 2016; 566-567:1042-1051. [PMID: 27318519] [PMCID: PMC7111295] [DOI: 10.1016/j.scitotenv.2016.05.123]
Abstract
Manure application is a source of pathogens to the environment. Through overland runoff and tile drainage, zoonotic pathogens can contaminate surface water and streambed sediment and could affect both wildlife and human health. This study examined the environmental occurrence of gene markers for livestock-related bacterial, protozoan, and viral pathogens and antibiotic resistance in surface waters within the South Fork Iowa River basin before and after periods of swine manure application on agricultural land. Increased concentrations of indicator bacteria after manure application exceeding Iowa's state bacteria water quality standards suggest that swine manure contributes to diminished water quality and may pose a risk to human health. Additionally, the occurrence of hepatitis E virus (HEV) and numerous bacterial pathogen genes for Escherichia coli, Enterococcus spp., Salmonella sp., and Staphylococcus aureus in both manure samples and in corresponding surface water following periods of manure application suggests a potential role for swine in the spreading of zoonotic pathogens to the surrounding environment. During this study, several zoonotic pathogens were detected including Shiga-toxin producing E. coli, Campylobacter jejuni, pathogenic enterococci, and S. aureus; all of which can pose mild to serious health risks to swine, humans, and other wildlife. This research provides the foundational understanding required for future assessment of the risk to environmental health from livestock-related zoonotic pathogen exposures in this region. This information could also be important for maintaining swine herd biosecurity and protecting the health of wildlife near swine facilities.
30. Effects of Climate and Sewer Condition on Virus Transport to Groundwater. ENVIRONMENTAL SCIENCE & TECHNOLOGY 2016; 50:8497-8504. [PMID: 27434550] [DOI: 10.1021/acs.est.6b01422]
Abstract
Pathogen contamination from leaky sanitary sewers poses a threat to groundwater quality in urban areas, yet the spatial and temporal dimensions of this contamination are not well understood. In this study, 16 monitoring wells and six municipal wells were repeatedly sampled for human enteric viruses. Viruses were detected infrequently, in 17 of 455 samples, compared to previous sampling at these wells. Thirteen of the 22 wells sampled were virus-positive at least once. While the highest virus concentrations occurred in shallower wells, shallow and deep wells were virus-positive at similar rates. Virus presence in groundwater was temporally coincident, with 16 of 17 virus-positive samples collected in a six-month period. Detections were associated with precipitation and occurred infrequently during a prolonged drought. The study purposely included sites with sewers of differing age and material. The rates of virus detections in groundwater were similar at all study sites during this study. However, a relationship between sewer age and virus detections emerged when compared to data from an earlier study, conducted during high precipitation conditions. Taken together, these data indicate that sewer condition and climate affect urban groundwater contamination by human enteric viruses.
31. Determining the 95% limit of detection for waterborne pathogen analyses from primary concentration to qPCR. WATER RESEARCH 2016; 96:105-113. [PMID: 27023926] [DOI: 10.1016/j.watres.2016.03.026]
Abstract
The limit of detection (LOD) for qPCR-based analyses is not consistently defined or determined in studies on waterborne pathogens. Moreover, the LODs reported often reflect the qPCR assay alone rather than the entire sample process. Our objective was to develop an approach to determine the 95% LOD (lowest concentration at which 95% of positive samples are detected) for the entire process of waterborne pathogen detection. We began by spiking the lowest concentration that was consistently positive at the qPCR step (based on its standard curve) into each procedural step working backwards (i.e., extraction, secondary concentration, primary concentration), which established a concentration that was detectable following losses of the pathogen from processing. Using the fraction of positive replicates (n = 10) at this concentration, we selected and analyzed a second, and then third, concentration. If the fraction of positive replicates equaled 1 or 0 for two concentrations, we selected another. We calculated the LOD using probit analysis. To demonstrate our approach we determined the 95% LOD for Salmonella enterica serovar Typhimurium, adenovirus 41, and vaccine-derived poliovirus Sabin 3, which were 11, 12, and 6 genomic copies (gc) per reaction (rxn), respectively (equivalent to 1.3, 1.5, and 4.0 gc L⁻¹ assuming the 1500 L tap-water sample volume prescribed in EPA Method 1615). This approach limited the number of analyses required and was amenable to testing multiple genetic targets simultaneously (i.e., spiking a single sample with multiple microorganisms). An LOD determined this way can facilitate study design, guide the number of required technical replicates, aid method evaluation, and inform data interpretation.
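The probit step above can be sketched as follows: given a fitted probit model of detection probability against log10 concentration, the 95% LOD is the concentration where the fitted curve crosses 0.95. The intercept and slope below are assumed values chosen for illustration, not the paper's fitted coefficients; the fitting itself (maximum likelihood on the detect/nondetect replicates) is omitted.

```python
import math

def detection_prob(c, a, b):
    """Probit model: detection probability at concentration c (gc per reaction),
    with intercept a and slope b on log10 concentration (standard normal CDF via erf)."""
    z = a + b * math.log10(c)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def lod95(a, b, z95=1.6449):
    """Concentration where the fitted detection probability reaches 0.95 (Phi(1.6449) ~ 0.95)."""
    return 10.0 ** ((z95 - a) / b)

# Assumed fit: a = -2.0, b = 3.5 (hypothetical, for illustration).
c95 = lod95(a=-2.0, b=3.5)
print(round(c95, 1), round(detection_prob(c95, -2.0, 3.5), 2))
```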
32
Correction to Human and Bovine Viruses and Bacteria at Three Great Lakes Beaches: Environmental Variable Associations and Health Risk. Environmental Science & Technology 2016; 50:4143. PMID: 27008452; DOI: 10.1021/acs.est.6b00948.
33
Human and Bovine Viruses and Bacteria at Three Great Lakes Beaches: Environmental Variable Associations and Health Risk. Environmental Science & Technology 2016; 50:987-95. PMID: 26720156; DOI: 10.1021/acs.est.5b04372.
Abstract
Waterborne pathogens were measured at three beaches in Lake Michigan, environmental factors for predicting pathogen concentrations were identified, and the risk of swimmer infection and illness was estimated. Waterborne pathogens were detected in 96% of samples collected at the three beaches in summer 2010. Samples were quantified for 22 pathogens in four microbial categories (human viruses, bovine viruses, protozoa, and pathogenic bacteria). All beaches had detections of human viruses, bovine viruses, and pathogenic bacteria, indicating the influence of multiple contamination sources. Occurrence ranged from 40 to 87% for human viruses, 65 to 87% for pathogenic bacteria, and 13 to 35% for bovine viruses. Enterovirus, adenovirus A, Salmonella spp., Campylobacter jejuni, bovine polyomavirus, and bovine rotavirus A were present most frequently. Variables selected in multiple regression models used to explore environmental factors that influence pathogens included wave direction, cloud cover, currents, and water temperature. Quantitative microbial risk assessment was done for C. jejuni, Salmonella spp., and enteroviruses to estimate the risk of infection and illness. Median infection risks for one-time swimming events were approximately 2 × 10⁻⁵, 8 × 10⁻⁶, and 3 × 10⁻⁷ [corrected] for C. jejuni, Salmonella spp., and enteroviruses, respectively. Results highlight the importance of investigating multiple pathogens within multiple categories to avoid underestimating the prevalence and risk of waterborne pathogens.
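Single-event infection risks like those reported above are typically computed from a dose-response model. A minimal sketch using the exponential dose-response form; the parameter r and the ingested dose below are illustrative assumptions, not the study's fitted values:

```python
import math

def p_infection_exponential(dose, r):
    """Exponential dose-response model: probability of infection from an ingested dose.

    dose: number of organisms ingested during the swimming event (assumed)
    r:    pathogen-specific probability that a single organism initiates infection (assumed)
    """
    return 1.0 - math.exp(-r * dose)

# Hypothetical values for one swimming event
risk = p_infection_exponential(dose=0.001, r=0.02)
```

At low doses the model is approximately linear, risk ≈ r × dose, which is why median risks scale almost directly with pathogen concentration and ingestion volume.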
34
Simultaneous Concentration of Bovine Viruses and Agricultural Zoonotic Bacteria from Water Using Sodocalcic Glass Wool Filters. Food and Environmental Virology 2014; 6:253-9. PMID: 25059211; PMCID: PMC7091103; DOI: 10.1007/s12560-014-9159-z.
Abstract
Infiltration and runoff from manured agricultural fields can result in livestock pathogens reaching groundwater and surface waters. Here, we measured the effectiveness of glass wool filters for simultaneously concentrating enteric viruses and bacteria of bovine origin from water. Recovery efficiencies were determined for bovine viral diarrhea virus types 1 and 2, bovine rotavirus group A, bovine coronavirus, poliovirus Sabin III, toxigenic Escherichia coli, and Campylobacter jejuni seeded into water at three turbidity levels (0.5, 215, and 447 NTU). Twenty liters of dechlorinated tap water (pH 7) were seeded with the test organisms and then passed through a glass wool filter using a peristaltic pump (flow rate = 1 L min⁻¹). Retained organisms were eluted from the filters by passing beef extract-glycine buffer (pH 9.5) in the direction opposite to sample flow. Recovered organisms were enumerated by qPCR, except for C. jejuni, which was quantified by culture. Mean recovery efficiencies ranged from 33 to 55% for the bacteria and 16 to 58% for the viruses. Using bootstrapping combined with analysis of variance, recovery efficiencies were found to differ among the pathogen types tested at the two lowest turbidity levels; however, for a given pathogen type, turbidity did not affect recovery except for C. jejuni. Glass wool filtration is a cost-effective method for concentrating several waterborne pathogens of bovine origin simultaneously, although recovery may be low for some specific taxa, such as bovine viral diarrhea virus 1.
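Recovery efficiency in seeding experiments like this one is simply the percentage of seeded organisms recovered in the filter eluate. A sketch with hypothetical qPCR copy numbers (not values from the study):

```python
def recovery_efficiency(recovered_copies, seeded_copies):
    """Percent of seeded genomic copies recovered after filtration and elution."""
    return 100.0 * recovered_copies / seeded_copies

# Hypothetical example: 5.8e5 copies recovered from 1.0e6 copies seeded into 20 L
eff = recovery_efficiency(5.8e5, 1.0e6)  # 58.0%
```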
35
Human and bovine viruses in the Milwaukee River watershed: hydrologically relevant representation and relations with environmental variables. Science of the Total Environment 2014; 490:849-60. PMID: 24908645; PMCID: PMC7125695; DOI: 10.1016/j.scitotenv.2014.05.072.
Abstract
To examine the occurrence, hydrologic variability, and seasonal variability of human and bovine viruses in surface water, three stream locations were monitored in the Milwaukee River watershed in Wisconsin, USA, from February 2007 through June 2008. Monitoring sites included an urban subwatershed, a rural subwatershed, and the Milwaukee River at the mouth. To collect samples that characterize variability throughout changing hydrologic periods, a process control system was developed for unattended, large-volume (56-2800 L) filtration over extended durations. This system provided flow-weighted mean concentrations during runoff and extended (24-h) low-flow periods. Human viruses and bovine viruses were detected by real-time qPCR in 49% and 41% of samples (n=63), respectively. All human viruses analyzed were detected at least once including adenovirus (40% of samples), GI norovirus (10%), enterovirus (8%), rotavirus (6%), GII norovirus (1.6%) and hepatitis A virus (1.6%). Three of seven bovine viruses analyzed were detected including bovine polyomavirus (32%), bovine rotavirus (19%), and bovine viral diarrhea virus type 1 (5%). Human viruses were present in 63% of runoff samples resulting from precipitation and snowmelt, and 20% of low-flow samples. Maximum human virus concentrations exceeded 300 genomic copies/L. Bovine viruses were present in 46% of runoff samples resulting from precipitation and snowmelt and 14% of low-flow samples. The maximum bovine virus concentration was 11 genomic copies/L. Statistical modeling indicated that stream flow, precipitation, and season explained the variability of human viruses in the watershed, and hydrologic condition (runoff event or low-flow) and season explained the variability of the sum of human and bovine viruses; however, no model was identified that could explain the variability of bovine viruses alone. 
Understanding the factors that affect virus fate and transport in rivers will aid watershed management for minimizing human exposure and disease transmission.
36
Viruses as groundwater tracers: using ecohydrology to characterize short travel times in aquifers. Ground Water 2014; 52:187-193. PMID: 24433472; DOI: 10.1111/gwat.12158.
Abstract
Viruses are attractive tracers of short (<3 year) travel times in aquifers because they have unique genetic signatures, are detectable in trace quantities, and are mobile in groundwater. Virus "snapshots" result from infection and disappearance in a population over time; therefore, the virus snapshot shed in the fecal wastes of an infected population at a specific point in time can serve as a marker for tracking virus and groundwater movement. The virus tracing approach and an example application are described to illustrate its ability to characterize travel times in high-groundwater-velocity settings and to provide insight unavailable from standard hydrogeologic approaches. Although characterizing preferential flowpaths does not capture the majority of travel times occurring in the groundwater system (e.g., center of plume mass; tail of the breakthrough curve), virus approaches can trace very short transport times and thus fill an important gap in the current hydrogeology toolbox.
37
Drinking water systems, hydrology, and childhood gastrointestinal illness in Central and Northern Wisconsin. Am J Public Health 2014; 104:639-46. PMID: 24524509; DOI: 10.2105/ajph.2013.301659.
Abstract
OBJECTIVES This study investigated whether the type of drinking water source (treated municipal, untreated municipal, and private well water) modifies the effect of hydrology on childhood (aged <5 years) gastrointestinal illness. METHODS We conducted a time series study to assess the relationship of hydrologic and weather conditions with childhood gastrointestinal illness from 1991 to 2010. The Central and Northern Wisconsin study area includes households using all 3 types of drinking water systems. Separate time series models were created for each system and half-year period (winter/spring, summer/fall). RESULTS More precipitation (summer/fall) systematically increased childhood gastrointestinal illness in municipalities accessing untreated water. The relative risk of contracting gastrointestinal illness was 1.4 in weeks with 3 centimeters of precipitation and 2.4 in very wet weeks with 12 centimeters of precipitation. By contrast, gastrointestinal illness in private well and treated municipal areas was not influenced by hydrologic conditions, although warmer winter temperatures slightly increased incidence. CONCLUSIONS By identifying municipal water systems that lack treatment and may transmit waterborne disease, our study suggests that improved drinking water protection, treatment, and delivery infrastructure may improve public health.
38
Source and transport of human enteric viruses in deep municipal water supply wells. Environmental Science & Technology 2013; 47:4096-103. PMID: 23570447; DOI: 10.1021/es400509b.
Abstract
Until recently, few water utilities or researchers were aware of possible virus presence in deep aquifers and wells. During 2008 and 2009 we collected a time series of virus samples from six deep municipal water-supply wells. The wells range in depth from approximately 220 to 300 m and draw water from a sandstone aquifer. Three of these wells draw water from beneath a regional aquitard, and three draw water from both above and below the aquitard. We also sampled a local lake and untreated sewage as potential virus sources. Viruses were detected up to 61% of the time in each well sampled, and many groundwater samples were positive for virus infectivity. Lake samples contained viruses over 75% of the time. Virus concentrations and serotypes observed varied markedly with time in all samples. Sewage samples were all extremely high in virus concentration. Virus serotypes detected in sewage and groundwater were temporally correlated, suggesting very rapid virus transport, on the order of weeks, from the source(s) to wells. Adenovirus and enterovirus levels in the wells were associated with precipitation events. The most likely source of the viruses in the wells was leakage of untreated sewage from sanitary sewer pipes.
39
Risk of viral acute gastrointestinal illness from nondisinfected drinking water distribution systems. Environmental Science & Technology 2012; 46:9299-307. PMID: 22839570; DOI: 10.1021/es3015925.
Abstract
Acute gastrointestinal illness (AGI) resulting from pathogens directly entering the piping of drinking water distribution systems is insufficiently understood. Here, we estimate AGI incidence from virus intrusions into the distribution systems of 14 nondisinfecting, groundwater-source community water systems. Water samples for virus quantification were collected monthly at wells and households during four 12-week periods in 2006-2007. Ultraviolet (UV) disinfection was installed on the communities' wellheads during one study year; UV was absent the other year. UV was intended to eliminate virus contributions from the wells; because no residual disinfectant was present in these systems, any increase in virus concentration downstream at household taps represented virus contributions from the distribution system (Approach 1). During no-UV periods, distribution system viruses were estimated by the difference between well water and household tap virus concentrations (Approach 2). For both approaches, a Monte Carlo risk assessment framework was used to estimate AGI risk from distribution systems using study-specific exposure-response relationships. Depending on the exposure-response relationship selected, AGI risk from the distribution systems was 0.0180-0.0661 and 0.001-0.1047 episodes/person-year estimated by Approaches 1 and 2, respectively. These values represented 0.1-4.9% of AGI risk from all exposure routes, and 1.6-67.8% of risk related to drinking water exposure. Virus intrusions into nondisinfected drinking water distribution systems can contribute to sporadic AGI.
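A Monte Carlo framework of the kind described can be sketched as follows. All distributions and parameters below (tap-water virus concentration, ingestion volume, dose-response slope) are illustrative assumptions, not the study's fitted exposure-response relationships:

```python
import math
import random

random.seed(1)

def simulate_annual_agi_risk(n=100_000):
    """Monte Carlo estimate of the probability of at least one AGI episode per year
    from viruses entering a distribution system (all parameters assumed)."""
    probs = []
    for _ in range(n):
        conc = random.lognormvariate(-3.0, 1.5)    # virus gc/L at the tap (assumed)
        volume = random.lognormvariate(0.0, 0.5)   # L of tap water ingested per day (assumed)
        daily_dose = conc * volume
        p_ill = 1.0 - math.exp(-0.1 * daily_dose)  # assumed daily dose-response slope
        probs.append(1.0 - (1.0 - p_ill) ** 365)   # >=1 episode over a year
    return sum(probs) / n

annual_risk = simulate_annual_agi_risk()
```

Repeating the simulation with each candidate exposure-response relationship is what produces a range of risk estimates rather than a single number.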
40
Measuring and mitigating inhibition during quantitative real time PCR analysis of viral nucleic acid extracts from large-volume environmental water samples. Water Research 2012; 46:4281-91. PMID: 22673345; DOI: 10.1016/j.watres.2012.04.030.
Abstract
Naturally occurring inhibitory compounds are a major concern during qPCR and RT-qPCR analysis of environmental samples, particularly large-volume water samples. Here, a standardized method for measuring and mitigating sample inhibition in environmental water concentrates is described. Specifically, the method 1) employs a commercially available standard RNA control; 2) defines inhibition by the change in the quantification cycle (Cq) of the standard RNA control when added to the sample concentrate; and 3) calculates a dilution factor, using a mathematical formula applied to the change in Cq, to indicate the specific volume of nuclease-free water necessary to dilute the effect of inhibitors. The standardized inhibition method was applied to 3,193 large-volume water concentrates (surface water, groundwater, drinking water, agricultural runoff, and sewage), of which 1,074 (34%) were inhibited. Inhibition level was not related to sample volume. Samples collected from the same locations over a one- to two-year period had widely variable inhibition levels. The proportion of samples that could have been reported as false negatives if inhibition had not been mitigated was between 0.3% and 71%, depending on the water source. These findings emphasize the importance of measuring and mitigating inhibition when reporting qPCR results for viral pathogens in environmental waters, to minimize the likelihood of reporting false negatives and under-quantifying virus concentrations.
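The abstract does not reproduce the paper's exact formula, but the underlying idea can be sketched: assuming near-perfect amplification efficiency (doubling per cycle), a spiked control that amplifies ΔCq cycles later in the sample concentrate than in clean water implies roughly a 2^ΔCq-fold signal suppression, which sets the minimum fold-dilution needed. A hedged sketch under that assumption:

```python
def inhibition_dilution_factor(cq_sample, cq_control, efficiency=2.0):
    """Fold-dilution needed to relieve inhibition, estimated from the Cq shift of a
    spiked RNA control run in the sample concentrate versus in clean water.

    Assumes the per-cycle amplification factor `efficiency` (2.0 = perfect doubling).
    """
    delta_cq = cq_sample - cq_control
    # A control amplifying earlier in the sample than in water indicates no inhibition
    return efficiency ** max(delta_cq, 0.0)

# A control that amplifies 2 cycles late in the concentrate implies ~4-fold dilution
factor = inhibition_dilution_factor(cq_sample=30.0, cq_control=28.0)
```

In practice the dilution is applied to the nucleic acid extract, and quantities are corrected for the dilution when back-calculating concentrations.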
41
Viruses in nondisinfected drinking water from municipal wells and community incidence of acute gastrointestinal illness. Environmental Health Perspectives 2012; 120:1272-9. PMID: 22659405; PMCID: PMC3440111; DOI: 10.1289/ehp.1104499.
Abstract
BACKGROUND Groundwater supplies for drinking water are frequently contaminated with low levels of human enteric virus genomes, yet evidence for waterborne disease transmission is lacking. OBJECTIVES We related quantitative polymerase chain reaction (qPCR)-measured enteric viruses in the tap water of 14 Wisconsin communities supplied by nondisinfected groundwater to acute gastrointestinal illness (AGI) incidence. METHODS AGI incidence was estimated from health diaries completed weekly by households within each study community during four 12-week periods. Water samples were collected monthly from five to eight households per community. Viruses were measured by qPCR, and infectivity assessed by cell culture. AGI incidence was related to virus measures using Poisson regression with random effects. RESULTS Communities and time periods with the highest virus measures had correspondingly high AGI incidence. This association was particularly strong for norovirus genogroup I (NoV-GI) and between adult AGI and enteroviruses when echovirus serotypes predominated. At mean concentrations of 1 and 0.8 genomic copies/L of NoV-GI and enteroviruses, respectively, the AGI incidence rate ratios (i.e., relative risk) increased by 30%. Adenoviruses were common, but tap-water concentrations were low and not positively associated with AGI. The estimated fraction of AGI attributable to tap-water-borne viruses was between 6% and 22%, depending on the virus exposure-AGI incidence model selected, and could have been as high as 63% among children < 5 years of age during the period when NoV-GI was abundant in drinking water. CONCLUSIONS The majority of groundwater-source public water systems in the United States produce water without disinfection, and our findings suggest that populations served by such systems may be exposed to waterborne viruses and consequent health risks.
42
Comparative effectiveness of membrane bioreactors, conventional secondary treatment, and chlorine and UV disinfection to remove microorganisms from municipal wastewaters. Water Research 2012; 46:4164-78. PMID: 22682268; DOI: 10.1016/j.watres.2012.04.044.
Abstract
Log removals of bacterial indicators, coliphage, and enteric viruses were studied in three membrane bioreactor (MBR) activated-sludge and two conventional secondary activated-sludge municipal wastewater treatment plants during three recreational seasons (May-Oct.) when disinfection of effluents is required. In total, 73 regular samples were collected from key locations throughout treatment processes: post-preliminary, post-MBR, post-secondary, post-tertiary, and post-disinfection (UV or chlorine). Out of 19 post-preliminary samples, adenovirus by quantitative polymerase chain reaction (qPCR) was detected in all 19, enterovirus by quantitative reverse transcription polymerase chain reaction (qRT-PCR) was detected in 15, and norovirus GI by qRT-PCR was detected in 11. Norovirus GII and Hepatitis A virus were not detected in any samples, and rotavirus was detected in one sample but could not be quantified. Although culturable viruses were found in 12 out of 19 post-preliminary samples, they were not detected in any post-secondary, post-MBR, post-ultraviolet, or post-chlorine samples. Median log removals for all organisms were higher for MBR secondary treatment (3.02 to >6.73) than for conventional secondary (1.53-4.19) treatment. Ultraviolet disinfection after MBR treatment provided little additional log removal of any organism except for somatic coliphage (>2.18), whereas ultraviolet or chlorine disinfection after conventional secondary treatment provided significant log removals (above the analytical variability) of all bacterial indicators (1.18-3.89) and somatic and F-specific coliphage (0.71 and >2.98). Median log removals of adenovirus across disinfection were low in both MBR and conventional secondary plants (no removal detected and 0.24), and few removals of individual samples were near or above the analytical variability of 1.2 log genomic copies per liter. 
Based on qualitative examinations of plots showing reductions of organisms throughout treatment processes, somatic coliphage may best represent the removal of viruses across secondary treatment in both MBR and conventional secondary plants. F-specific coliphage and Escherichia coli may best represent the removal of viruses across the disinfection process in MBR facilities, but none of the indicators represented the removal of viruses across disinfection in conventional secondary plants.
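Log removals like those reported above are computed from paired concentrations before and after a treatment step. A minimal sketch with hypothetical qPCR concentrations (handling of non-detects, which produce the ">" bounds in the abstract, is omitted for simplicity):

```python
import math

def log_removal(influent, effluent):
    """Log10 removal value (LRV) across a treatment step; concentrations in gc/L."""
    return math.log10(influent / effluent)

# Hypothetical: 1e5 gc/L entering secondary treatment, 1e2 gc/L leaving -> LRV of 3.0
lrv = log_removal(1e5, 1e2)
```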
43
Glass wool filters for concentrating waterborne viruses and agricultural zoonotic pathogens. J Vis Exp 2012:e3930. PMID: 22415031; DOI: 10.3791/3930.
Abstract
The key first step in evaluating pathogen levels in suspected contaminated water is concentration. Concentration methods tend to be specific for a particular pathogen group (for example, US Environmental Protection Agency Method 1623 for Giardia and Cryptosporidium), which means multiple methods are required if the sampling program targets more than one pathogen group. Another drawback of current methods is that the equipment can be complicated and expensive, for example, the VIRADEL method with the 1MDS cartridge filter for concentrating viruses. In this article we describe how to construct glass wool filters for concentrating waterborne pathogens. After filter elution, the concentrate is amenable to a second concentration step, such as centrifugation, followed by pathogen detection and enumeration by cultural or molecular methods. The filters have several advantages. Construction is easy, and the filters can be built to any size to meet specific sampling requirements. The filter parts are inexpensive, making it possible to collect a large number of samples without severely impacting a project budget. Large sample volumes (hundreds to thousands of liters) can be concentrated, depending on the rate of clogging from sample turbidity. The filters are highly portable, and with minimal equipment, such as a pump and flow meter, they can be deployed in the field for sampling finished drinking water, surface water, groundwater, and agricultural runoff. Lastly, glass wool filtration is effective for concentrating a variety of pathogen types, so only one method is necessary. Here we report on filter effectiveness in concentrating waterborne human enterovirus, Salmonella enterica, Cryptosporidium parvum, and avian influenza virus.
44
Virus contamination from operation and maintenance events in small drinking water distribution systems. Journal of Water and Health 2011; 9:799-812. PMID: 22048438; DOI: 10.2166/wh.2011.018.
Abstract
We tested the association of common events in drinking water distribution systems with contamination of household tap water with human enteric viruses. Viruses were enumerated by qPCR in the tap water of 14 municipal systems that use non-disinfected groundwater. Ultraviolet disinfection was installed at all active wellheads to reduce virus contributions from groundwater to the distribution systems. As no residual disinfectant was added to the water, any increase in virus levels measured downstream at household taps would be indicative of distribution system intrusions. Utility operators reported events through written questionnaires. Virus outcome measures were related to distribution system events using binomial and gamma regression. Virus concentrations were elevated in the wells, reduced or eliminated by ultraviolet disinfection, and elevated again in distribution systems, showing that viruses were, indeed, directly entering the systems. Pipe installation was significantly associated with higher virus levels, whereas hydrant flushing was significantly associated with lower virus levels. Weak positive associations were observed for water tower maintenance, valve exercising, and cutting open a water main. Coliform bacteria detections from routine monitoring were not associated with viruses. Understanding when distribution systems are most vulnerable to virus contamination, and taking precautionary measures, will ensure delivery of safe drinking water.
45
Lachnospiraceae and Bacteroidales alternative fecal indicators reveal chronic human sewage contamination in an urban harbor. Appl Environ Microbiol 2011. PMID: 21803887; DOI: 10.1128/aem.05480-11.
Abstract
The complexity of fecal microbial communities and overlap among human and other animal sources have made it difficult to identify source-specific fecal indicator bacteria. However, the advent of next-generation sequencing technologies now provides increased sequencing power to resolve microbial community composition within and among environments. These data can be mined for information on source-specific phylotypes and/or assemblages of phylotypes (i.e., microbial signatures). We report the development of a new genetic marker for human fecal contamination identified through microbial pyrotag sequence analysis of the V6 region of the 16S rRNA gene. Sequence analysis of 37 sewage samples and comparison with database sequences revealed a human-associated phylotype within the Lachnospiraceae family, which was closely related to the genus Blautia. This phylotype, termed Lachno2, was on average the second most abundant fecal bacterial phylotype in sewage influent samples from Milwaukee, WI. We developed a quantitative PCR (qPCR) assay for Lachno2 and used it along with the qPCR-based assays for human Bacteroidales (based on the HF183 genetic marker), total Bacteroidales spp., and enterococci and the conventional Escherichia coli and enterococci plate count assays to examine the prevalence of fecal and human fecal pollution in Milwaukee's harbor. Both the conventional fecal indicators and the human-associated indicators revealed chronic fecal pollution in the harbor, with significant increases following heavy rain events and combined sewer overflows. The two human-associated genetic marker abundances were tightly correlated in the harbor, a strong indication they target the same source (i.e., human sewage). Human adenoviruses were routinely detected under all conditions in the harbor, and the probability of their occurrence increased by 154% for every 10-fold increase in the human indicator concentration. 
Both Lachno2 and human Bacteroidales showed increased specificity for detecting sewage compared with general indicators, and their relationship to a human pathogen group suggests that the use of these alternative indicators will improve assessments of human health risks in urban waters.
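The reported 154% increase in the probability of adenovirus occurrence per 10-fold rise in the human indicator corresponds to a regression coefficient on log10 concentration. A sketch of that conversion; the logistic (odds-based) model form is an assumption for illustration, not taken from the paper:

```python
import math

# A 154% increase per 10-fold rise corresponds to a multiplier of 2.54 per log10 unit
beta = math.log(2.54)  # assumed coefficient on log10(indicator concentration)

def odds_multiplier(fold_change_log10):
    """Multiplicative change in the odds of human adenovirus detection for a given
    change in indicator concentration, expressed in log10 units."""
    return math.exp(beta * fold_change_log10)

two_decades = odds_multiplier(2.0)  # effect of a 100-fold increase in the indicator
```

The multiplier compounds: a 100-fold increase in the indicator corresponds to 2.54² ≈ 6.5 times the odds under this model.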
46
Abstract
Septic systems that are built in compliance with regulations are generally not expected to be the cause of groundwater-borne disease outbreaks, especially in areas with thick vadose zones. However, this case study demonstrates that a disease outbreak can occur in such a setting and outlines the combination of epidemiological, microbiological, and hydrogeological methods used to confirm the source of the outbreak. In early June 2007, 229 patrons and employees of a new restaurant in northeastern Wisconsin were affected by acute gastroenteritis; 6 people were hospitalized. Epidemiological case-control analysis indicated that drinking the restaurant's well water was associated with illness (odds ratio = 3.2, 95% confidence interval = 0.9 to 11.4, P = 0.06). Microbiological analysis (quantitative reverse transcription-polymerase chain reaction) measured 50 genomic copies per liter of norovirus genogroup I in the well water. Nucleotide sequencing determined the genotype as GI.2 and further showed that the identical virus was present in patrons' stool specimens and in the septic tank. Tracer tests using dyes injected at two points in the septic system showed that effluent was traveling from the tanks (through a leaking fitting) and the infiltration field to the well in 6 and 15 d, respectively. The restaurant's septic system and well (85 m deep, in a fractured dolomite aquifer) both conformed to state building codes. The early arrival of dye in the well, which was 188 m from the septic field and located beneath a 35-m-thick vadose zone, demonstrates that in highly vulnerable hydrogeological settings, compliance with regulations may not provide adequate protection from fecal pathogens.
47
Assessment of sewer source contamination of drinking water wells using tracers and human enteric viruses. Environmental Science & Technology 2010; 44:7956-63. PMID: 20822128; DOI: 10.1021/es100698m.
Abstract
This study investigated the source, transport, and occurrence of human enteric viruses in municipal well water, focusing on sanitary sewer sources. A total of 33 wells from 14 communities were sampled once for wastewater tracers and viruses. Wastewater tracers were detected in four of these wells, and five wells were virus-positive by qRT-PCR. These results, along with the exclusion of wells with surface water sources, were used to select three wells for additional investigation. Viruses and wastewater tracers were found in the groundwater at all sites. Some wastewater tracers, such as ionic detergents, flame retardants, and cholesterol, were considered unambiguous evidence of wastewater. Sampling at any given time may not show concurrent virus and tracer presence; however, given sufficient sampling over time, a relation between wastewater tracers and virus occurrence was identified. The presence of infectious viruses at the wellhead demonstrates that high-capacity pumping induced travel times sufficiently short for the transport of infectious viruses. Therefore, drinking-water wells are vulnerable to contaminants that travel along fast groundwater flowpaths even if those flowpaths contribute only a small amount of virus-laden water to the well. These results suggest that vulnerability assessments require characterization of "low yield-fast transport" pathways in addition to traditional "high yield-slow transport" pathways.
48
New mathematical approaches to quantify human infectious viruses from environmental media using integrated cell culture-qPCR. J Virol Methods 2009; 163:244-52. PMID: 19835913; DOI: 10.1016/j.jviromet.2009.10.002.
Abstract
Quantifying infectious viruses by cell culture depends on visualizing cytopathic effect (CPE) or, for integrated cell culture-PCR, on attaining confidence that a PCR-positive signal is the result of virus growth and not inoculum carryover. This study developed mathematical methods to calculate infectious virus numbers based on viral growth kinetics in cell culture. Poliovirus was inoculated into BGM cell monolayers at 10 concentrations from 0.001 to 1000 PFU/ml. Copy numbers of negative-strand RNA, a marker of infectivity for single-stranded positive-sense RNA viruses, were measured over time by qRT-PCR. Growth data were analyzed by two approaches. First, data were fit with a continuous function to estimate directly the initial virus number, expressed as genomic copies. Such estimates correlated with actual inoculum numbers across all concentrations (R² = 0.62, n = 17). Second, the length of the lag phase appeared to vary inversely with inoculum titer; hence, standard curves to predict inoculum virus numbers were derived based on three definitions of lag time: (1) the time of first detection of (-)RNA, (2) the second derivative maximum of the fitted continuous function, and (3) the time when the fitted curve crossed a threshold (-)RNA concentration. All three proxies yielded standard curves with R² = 0.69-0.90 (n = 17). The primary advantage of these growth kinetics approaches is being able to quantify virions that are unambiguously infectious, a particular advantage for viruses that do not produce CPE.
|
49
|
Effects of etiological agent and bather shedding of pathogens on interpretation of epidemiological data used to establish recreational water quality standards. Risk Anal 2009; 29:257-266. [PMID: 19144071 DOI: 10.1111/j.1539-6924.2008.01184.x]
Abstract
The overall goal of the study reported herein was to use techniques from the field of risk assessment (specifically, a state-space population dynamic model of disease transmission within recreational waters) to explore the relative significance of (1) active shedding of microorganisms from bathers themselves and (2) the type and concentration of etiological agent on the observed heterogeneity in the incidence of illness in epidemiological studies that have been used to develop ambient water quality criteria. The etiological agent and the corresponding dose ingested during recreational contact were found to significantly affect the observed incidence of illness in an epidemiological study conducted in recreational water. In addition, the observed incidence of illness was found not to necessarily reflect background concentrations of indicator organisms, but rather microorganisms shed during recreational contact. Future revisions to ambient water quality criteria should address the etiological agent, dose, and the significance of microbial shedding relative to background concentrations of pathogens and indicator organisms, in addition to the incidence of illness and the concentration of indicator organisms. Without a quantitative assessment of these additional variables, study findings may be site specific and not representative of the health risks associated with specific indicator concentrations in all recreational waters.
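The shedding-versus-background distinction above can be sketched with a toy discrete-time mass balance: water concentration is driven by inflow exchange, per-bather shedding, and die-off, and the ingested dose follows the concentration. All parameter values here are hypothetical placeholders, not the paper's calibrated state-space model.

```python
import numpy as np

# Minimal mass-balance sketch (all parameters assumed for illustration)
V = 5.0e5       # swim-area volume, L
Q = 2.0e4       # exchange flow with background water, L/h
C_bg = 0.1      # background pathogen concentration in inflow, organisms/L
shed = 1.0e5    # organisms shed per bather per hour (assumed)
decay = 0.05    # first-order die-off rate, 1/h
ingest = 0.05   # water swallowed per bather per hour, L

bathers = np.array([0, 5, 20, 40, 40, 20, 5, 0])  # hourly bather load

C = C_bg
doses = []
for n in bathers:
    # hourly update: inflow/outflow exchange + bather shedding - die-off
    C += (Q / V) * (C_bg - C) + n * shed / V - decay * C
    doses.append(ingest * C)  # organisms swallowed per bather that hour
```

Under these assumptions the peak-load dose exceeds the background-only dose by two orders of magnitude, which is the qualitative point of the abstract: observed illness can track bather shedding rather than background indicator concentrations.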
|
50
|
Effect of pathogen concentrations on removal of Cryptosporidium and Giardia by conventional drinking water treatment. Water Res 2008; 42:2678-2690. [PMID: 18313095 DOI: 10.1016/j.watres.2008.01.021] [Received: 09/09/2007] [Revised: 01/21/2008] [Accepted: 01/22/2008]
Abstract
The presence of waterborne enteric pathogens in municipal water supplies contributes risk to public health. To evaluate the removal of these pathogens in drinking water treatment processes, previous researchers have spiked raw waters with up to 10⁶ pathogens/L in order to reliably detect the pathogens in treated water. These spike doses are 6-8 orders of magnitude higher than pathogen concentrations routinely observed in practice. In the present study, experiments were conducted with different sampling methods (i.e., grab versus continuous sampling) and initial pathogen concentrations ranging from 10¹ to 10⁶ pathogens/L. Results showed that Cryptosporidium oocyst and Giardia cyst removal across conventional treatment was dependent on initial pathogen concentration, with lower removal observed when lower initial spike doses were used. In addition, higher raw water turbidity appeared to result in higher log removal for both Cryptosporidium oocysts and Giardia cysts.
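One reason earlier studies used such high spike doses is that measurable log removal is capped by the effluent detection limit: if treated-water counts fall below it, the computed removal understates the true removal. A minimal sketch, with an assumed 3-log "true" removal and an assumed detection limit (neither taken from the paper):

```python
import numpy as np

def log_removal(c_in, c_out, detection_limit):
    """log10 removal across treatment; effluent counts below the detection
    limit are censored, capping measured removal at log10(c_in / limit)."""
    measured_out = max(c_out, detection_limit)
    return np.log10(c_in / measured_out)

true_removal = 3.0   # assume treatment truly achieves 3-log removal
dl = 0.1             # effluent detection limit, oocysts/L (assumed)

# Measured removal at three spike doses (oocysts/L)
results = {spike: log_removal(spike, spike / 10**true_removal, dl)
           for spike in [10.0, 1e3, 1e6]}
```

At a 10 oocysts/L spike the true effluent concentration (0.01/L) is censored at 0.1/L, so only 2-log removal can be demonstrated, while the 10⁶ spike recovers the full 3-log value. Note this censoring effect is distinct from the study's finding that removal itself genuinely varied with spike concentration.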
|