1. Bumbăcilă B, Putz MV. Neurotoxicity of Pesticides: The Roadmap for the Cubic Mode of Action. Curr Med Chem 2020; 27:54-77. DOI: 10.2174/0929867326666190704142354
Abstract
Pesticides are used today on a planetary scale. Rising demand for substances with this biological activity, driven by growing consumption of agricultural and animal products and by the expansion of urban areas, pushes the chemical industry to constantly investigate new molecules or to improve the physicochemical characteristics, increase the biological activities, and improve the toxicity profiles of those already known. Molecular databases are increasingly accessible for in vitro and in vivo bioavailability studies. In this context, structure-activity studies, through their in silico - in cerebro methods, are used to precede in vitro and in vivo studies in plants and experimental animals, because they can indicate trends through statistical methods or through biological-activity models expressed as mathematical equations or graphical correlations; one direction of study can thus be pursued and another abandoned, saving financial resources, time, and laboratory animals. Following this line of research, the present paper reviews Structure-Activity Relationship (SAR) studies and proposes a correlation between a topological connectivity index and biological activity or toxicity, derived from a study of 11 randomly chosen organophosphate molecules whose basic structure comprises a phosphorus atom double-bonded to an oxygen or a sulfur atom and carrying three further single covalent bonds: two to alkoxy (methoxy or ethoxy) groups and one to a functional group other than alkoxy. The molecules were packed onto a cubic structure consisting of three adjacent cubes, respecting a principle of topological efficiency (occupying a minimal space within that cubic structure), a method called the Clef Method. The Wiener index was selected as the central topological index for the correlation, since it made it possible to discuss different adjacencies between the nodes of the graphs corresponding to the organophosphate molecules packed on the cubic structure; accordingly, "three-dimensional" variants of these connectivity indices could be considered and used to study the qualitative-quantitative relationships of specific molecule-enzyme interaction complexes, including the correlation between Wiener weights (nodal specific contributions to the total Wiener index of the molecular graph) and the biochemical reactivity of some of the atoms. Finally, in passing from SAR to Q(uantitative)SAR studies, especially with the present cubic-molecule method (the Clef Method) and its good assessment of the (neuro)toxicity of the studied molecules and of their inhibitory effect on the target enzyme, acetylcholinesterase, the toxicity and activity of analogous compounds become predictable, facilitating in vivo experiments and improving the usage of pesticides.
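The Wiener index and the per-node "Wiener weights" discussed in this abstract are standard graph-theoretic quantities. As a minimal sketch (plain adjacency sets over a hydrogen-suppressed skeleton, not the authors' Clef cubic packing, which is not reproduced here), they might be computed as:

```python
import itertools

def all_pairs_distances(adjacency):
    """Floyd-Warshall shortest paths on an unweighted graph
    given as {node: set_of_neighbors}."""
    nodes = list(adjacency)
    INF = float("inf")
    dist = {u: {v: 0 if u == v else (1 if v in adjacency[u] else INF)
                for v in nodes} for u in nodes}
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

def wiener_index(adjacency):
    """Wiener index: sum of topological (shortest-path) distances
    over all unordered pairs of atoms."""
    dist = all_pairs_distances(adjacency)
    return sum(dist[u][v] for u, v in itertools.combinations(adjacency, 2))

def wiener_weights(adjacency):
    """Nodal contributions, taken here (one common convention) as half
    of each node's distance sum; the weights add up to the total index."""
    dist = all_pairs_distances(adjacency)
    return {u: sum(dist[u].values()) / 2 for u in adjacency}

# Example: n-butane carbon skeleton C1-C2-C3-C4, W = 1+2+3+1+2+1 = 10
butane = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
```

The "three-dimensional" variants mentioned above would change only the adjacency sets (edges induced by the cubic packing), not this computation.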
Affiliation(s)
- Bogdan Bumbăcilă
- Laboratory of Computational and Structural Physical-Chemistry for Nanosciences and QSAR, Biology- Chemistry Department, Faculty of Chemistry, Biology, Geography at West University of Timisoara, Pestalozzi Street No.16, Timisoara RO-300115, Romania
- Mihai V. Putz
- Laboratory of Computational and Structural Physical-Chemistry for Nanosciences and QSAR, Biology- Chemistry Department, Faculty of Chemistry, Biology, Geography at West University of Timisoara, Pestalozzi Street No.16, Timisoara RO-300115, Romania

2. Tan YM, Leonard JA, Edwards S, Teeguarden J, Paini A, Egeghy P. Aggregate Exposure Pathways in Support of Risk Assessment. Curr Opin Toxicol 2018; 9:8-13. PMID: 29736486. DOI: 10.1016/j.cotox.2018.03.006
Abstract
Over time, risk assessment has shifted from establishing relationships between exposure to a single chemical and a resulting adverse health outcome to evaluating multiple chemicals and disease outcomes simultaneously. As a result, there is an increasing need to better understand the complex mechanisms that influence the risk of chemical and non-chemical stressors, beginning at their source and ending at a biological endpoint relevant to human or ecosystem health risk assessment. Just as the Adverse Outcome Pathway (AOP) framework has emerged as a means of providing insight into mechanism-based toxicity, the exposure science community has seen the recent introduction of the Aggregate Exposure Pathway (AEP) framework. AEPs help make exposure data conform to the FAIR principles (findable, accessible, interoperable, and reusable), especially by (1) organizing a continuous flow of disjointed exposure information; (2) identifying data gaps, so that resources can be focused on acquiring the most relevant data; (3) optimizing the use and repurposing of existing exposure data; and (4) facilitating interoperability among predictive models. Herein, we discuss integration of the AOP and AEP frameworks and how such integration can improve confidence in both traditional and cumulative risk assessment approaches.
Affiliation(s)
- Yu-Mei Tan
- National Exposure Research Laboratory, U.S. Environmental Protection Agency, Durham, North Carolina 27709, United States
- Jeremy A Leonard
- Oak Ridge Institute for Science and Education, Oak Ridge, Tennessee 37831, United States
- Stephen Edwards
- National Health and Environmental Effects Research Laboratory, U.S. Environmental Protection Agency, Durham, North Carolina 27709, United States
- Justin Teeguarden
- Health Effects and Exposure Science, Pacific Northwest National Laboratory, Richland, Washington 99352, United States
- Alicia Paini
- European Commission, Joint Research Centre, Directorate Health, Consumers and Reference Materials, Via E Fermi 2749, 21027 Ispra, Italy
- Peter Egeghy
- National Exposure Research Laboratory, U.S. Environmental Protection Agency, Durham, North Carolina 27709, United States

3. MacGillivray BH. Heuristics structure and pervade formal risk assessment. Risk Anal 2014; 34:771-787. PMID: 24152168. DOI: 10.1111/risa.12136
Abstract
Lay perceptions of risk appear rooted more in heuristics than in reason. A major concern of the risk regulation literature is that such "error-strewn" perceptions may be replicated in policy, as governments respond to the (mis)fears of the citizenry. This has led many to advocate a relatively technocratic approach to regulating risk, characterized by high reliance on formal risk and cost-benefit analysis. However, through two studies of chemicals regulation, we show that the formal assessment of risk is pervaded by its own set of heuristics. These include rules to categorize potential threats, define what constitutes valid data, guide causal inference, and select and apply formal models. Some of these heuristics lay claim to theoretical or empirical justifications, others are more back-of-the-envelope calculations, while still others purport not to reflect some truth but simply to constrain discretion or perform a desk-clearing function. These heuristics can be understood as a way of authenticating or formalizing risk assessment as a scientific practice, representing a series of rules for bounding problems, collecting data, and interpreting evidence (a methodology). Heuristics are indispensable elements of induction, and so they are not problematic per se; they can become so when treated as laws rather than as contingent and provisional rules. Pitfalls include the potential for systematic error, masking uncertainties, strategic manipulation, and entrenchment. Our central claim is that by studying the rules of risk assessment qua rules, we develop a novel representation of the methods, conventions, and biases of the prior art.
Affiliation(s)
- Brian H MacGillivray
- Sustainable Places Research Institute, Cardiff University, 33 Park Pl., Cardiff, CF10 3BA and CRASSH, University of Cambridge.

4. Patlewicz G, Simon T, Goyak K, Phillips RD, Rowlands JC, Seidel SD, Becker RA. Use and validation of HT/HC assays to support 21st century toxicity evaluations. Regul Toxicol Pharmacol 2013; 65:259-68. PMID: 23291301. DOI: 10.1016/j.yrtph.2012.12.008
Abstract
Advances in high throughput and high content (HT/HC) methods such as those used in the fields of toxicogenomics, bioinformatics, and computational toxicology have the potential to improve both the efficiency and effectiveness of toxicity evaluations and risk assessments. However, prior to use, scientific confidence in these methods should be formally established. Traditional validation approaches that define relevance, reliability, sensitivity, and specificity may not be readily applicable: HT/HC methods are not exact replacements for in vivo testing; although run individually, these assays are likely to be used as a group or battery for decision making; and they rely on robotics, which may be unique to each laboratory setting. Building on the frameworks developed in the 2010 Institute of Medicine Report on Biomarkers and the OECD 2007 Report on (Q)SAR Validation, we present constructs that can be adapted to address the validation challenges of HT/HC methods. These are flexible, transparent, and require explicit specification of the context and purpose of use, such that scientific confidence (validation) can be defined to meet different regulatory applications. Using these constructs, we discuss how anchoring the assays and their prediction models to Adverse Outcome Pathways (AOPs) could facilitate the interpretation of results and support scientifically defensible fit-for-purpose applications.
Affiliation(s)
- Grace Patlewicz
- DuPont Haskell Global Centers for Health and Environmental Sciences, 1090 Elkton Road, Newark, DE 19711, USA.

5. Judson R, Kavlock R, Martin M, Reif D, Houck K, Knudsen T, Richard A, Tice RR, Whelan M, Xia M, Huang R, Austin C, Daston G, Hartung T, Fowle JR, Wooge W, Tong W, Dix D. Perspectives on validation of high-throughput assays supporting 21st century toxicity testing. ALTEX 2013; 30:51-6. PMID: 23338806. PMCID: PMC3934015. DOI: 10.14573/altex.2013.1.051
Abstract
In vitro high-throughput screening (HTS) assays are seeing increasing use in toxicity testing. HTS assays can simultaneously test many chemicals but have seen limited use in the regulatory arena, in part because of the need to undergo rigorous, time-consuming formal validation. Here we discuss streamlining the validation process, specifically for prioritization applications. By prioritization, we mean a process in which less complex, less expensive, and faster assays are used to prioritize which chemicals are subjected first to more complex, expensive, and slower guideline assays. Data from the HTS prioritization assays is intended to provide a priori evidence that certain chemicals have the potential to lead to the types of adverse effects that the guideline tests are assessing. The need for such prioritization approaches is driven by the fact that there are tens of thousands of chemicals to which people are exposed, but the yearly throughput of most guideline assays is small in comparison. The streamlined validation process would continue to ensure the reliability and relevance of assays for this application. We discuss the following practical guidelines: (1) follow current validation practice to the extent possible and practical; (2) make increased use of reference compounds to better demonstrate assay reliability and relevance; (3) de-emphasize the need for cross-laboratory testing; and (4) implement a web-based, transparent, and expedited peer review process.
Affiliation(s)
- Richard Judson
- National Center for Computational Toxicology, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711, USA.

6.
Abstract
Developmental toxicity may be estimated using commercial and noncommercial software already available in the market and/or the literature, or models may be built from scratch using commercial and noncommercial software packages. In this chapter, commonly available software programs that can predict the developmental toxicity of chemicals are described. In addition, a method is described for developing qualitative structure-activity relationship (SAR) models that predict the developmental toxicity of chemicals qualitatively (yes/no prediction), and quantitative structure-activity relationship (QSAR) models that provide quantitative estimates (e.g., LOAEL) for developmental toxicants. Additional information in this chapter includes methods to predict physicochemical properties of chemicals that can be used as descriptor variables in the model-building process, statistical methods that can be used to build QSAR models, and methods to validate the models that are developed. Most of the methods described in this chapter can also be used to develop models for health endpoints other than developmental toxicity.
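The workflow this chapter describes (descriptor variables, a statistical fit, validation of the fit) can be sketched with a hypothetical one-descriptor example. The descriptor, endpoint, and data below are illustrative only and are not taken from the chapter:

```python
def ols_fit(x, y):
    """Ordinary least-squares slope and intercept for y ~ a*x + b,
    the simplest statistical model used in QSAR fitting."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def r_squared(x, y, a, b):
    """Coefficient of determination, a basic validation statistic
    for the fitted model."""
    my = sum(y) / len(y)
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Illustrative (made-up) data: a physicochemical descriptor (e.g. logP)
# against a quantitative toxicity endpoint
logp = [1.0, 2.0, 3.0, 4.0]
endpoint = [2.1, 3.9, 6.1, 8.0]
a, b = ols_fit(logp, endpoint)
```

A real QSAR would use many descriptors, a train/test split, and external validation, as the chapter discusses; this shows only the core fit-then-validate step.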

7. Makinson KA, Hamby DM, Edwards JA. A review of contemporary methods for the presentation of scientific uncertainty. Health Phys 2012; 103:714-731. PMID: 23281507. DOI: 10.1097/hp.0b013e31824e6f6f
Abstract
Graphic methods for displaying uncertainty are often the most concise and informative way to communicate abstract concepts. Presentation methods currently in use for the display and interpretation of scientific uncertainty are reviewed. Numerous subjective and objective uncertainty display methods are presented, including qualitative assessments, node and arrow diagrams, standard statistical methods, box-and-whisker plots, robustness and opportunity functions, contribution indexes, probability density functions, cumulative distribution functions, and graphical likelihood functions.
Affiliation(s)
- K A Makinson
- Department of Nuclear Engineering and Radiation Health Physics, Oregon State University, 116 Radiation Center, Corvallis, OR 97331-5303, USA.

8. Plunkett LM, Kaplan AM, Becker RA. An enhanced tiered toxicity testing framework with triggers for assessing hazards and risks of commodity chemicals. Regul Toxicol Pharmacol 2010; 58:382-94. DOI: 10.1016/j.yrtph.2010.08.003

9. Dekant W, Melching-Kollmuß S, Kalberlah F. Toxicity assessment strategies, data requirements, and risk assessment approaches to derive health based guidance values for non-relevant metabolites of plant protection products. Regul Toxicol Pharmacol 2010; 56:135-42. DOI: 10.1016/j.yrtph.2009.10.003

10. Dellarco V, Henry T, Sayre P, Seed J, Bradbury S. Meeting the common needs of a more effective and efficient testing and assessment paradigm for chemical risk management. J Toxicol Environ Health B Crit Rev 2010; 13:347-360. PMID: 20574907. DOI: 10.1080/10937404.2010.483950
Abstract
Significant advances have been made in human health and ecological risk assessment over the last decade. Substantial challenges, however, remain in providing credible scientific information in a timely and efficient manner to support chemical risk assessment and management decisions. A major challenge confronting risk managers is the need for critical information to address risk uncertainties in large chemical inventories such as high- and medium-production-volume industrial chemicals or pesticide inert ingredients. From a strategic and tactical viewpoint, an integrated approach that relies on all existing knowledge and uses a range of methods, including those from emerging and novel technologies, is needed to advance progressive and focused testing strategies, as well as to advance the utility and predictability of the risk assessment by providing more relevant information. A hypothesis-based approach that draws on all relevant information is consistent with the vision articulated in the 2007 report by the National Research Council, Toxicity Testing in the 21st Century: A Vision and a Strategy. This article describes the current practices in evaluating chemical risks and ongoing efforts to enhance the quality and efficiency of risk assessment and risk management decisions within the Office of Prevention, Pesticides, and Toxic Substances at the U.S. Environmental Protection Agency.
Affiliation(s)
- Vicki Dellarco
- Office of Pesticides, Prevention and Toxic Substances, US Environmental Protection Agency, Washington, DC, USA.

11. Mansour SA, Gad MF. Risk assessment of pesticides and heavy metals contaminants in vegetables: a novel bioassay method using Daphnia magna Straus. Food Chem Toxicol 2009; 48:377-89. PMID: 19853633. DOI: 10.1016/j.fct.2009.10.026
Abstract
Cucumber and potato samples with known levels of pesticide and heavy-metal residues, measured by gas chromatography and atomic absorption respectively, were subjected to a bioassay using Daphnia magna in order to assess the toxic hazard potential of their contaminants. Based on the estimated lethal time for 50% mortality (LT50) in daphnids, we suggest a classification of toxic hazard into six definite ratings. Cucumber samples (from conventional, greenhouse, and organic farming) and potato samples (from conventional and organic farming) were evaluated for the toxic hazard of their pesticide-residue mixtures, their heavy-metal mixtures, and mixtures of both. Accordingly, 53.7% of cucumber samples were ranked "Highly Toxic" (HT), 18.5% "Moderately Toxic" (MT), 9.3% "Slightly Toxic" (ST), and 18.5% "Practically Non-Toxic" (NT). For potato samples, the ranking pattern was: Extremely Toxic, ET (LT50 < 1 h), 11.1%; Very Toxic, VT (LT50 1 to <3 h), 50.0%; HT (LT50 3 to <12 h), 13.9%; MT (LT50 12 to <24 h), 11.1%; ST (LT50 24-48 h), 0.0%; and NT (LT50 > 48 h), 13.9% of the samples bioassayed.
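The six ratings and the LT50 boundaries reported in this abstract for the potato samples translate directly into a small lookup; a sketch (function name is illustrative):

```python
def toxicity_rating(lt50_hours):
    """Map a Daphnia magna LT50 (in hours) to one of the six hazard
    ratings suggested in the abstract (potato-sample boundaries)."""
    if lt50_hours < 1:
        return "ET"   # Extremely Toxic
    if lt50_hours < 3:
        return "VT"   # Very Toxic
    if lt50_hours < 12:
        return "HT"   # Highly Toxic
    if lt50_hours < 24:
        return "MT"   # Moderately Toxic
    if lt50_hours <= 48:
        return "ST"   # Slightly Toxic
    return "NT"       # Practically Non-Toxic
```

For example, a sample whose daphnids reach 50% mortality in 2 hours falls in the "Very Toxic" class, while one exceeding 48 hours is "Practically Non-Toxic".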
Affiliation(s)
- Sameeh A Mansour
- Environmental Toxicology Research Unit, Pesticide Chemistry Department, National Research Centre, Tahrir Str, Dokki, Cairo, Egypt.

12. Benigni R, Bossa C. Predictivity and Reliability of QSAR Models: The Case of Mutagens and Carcinogens. Toxicol Mech Methods 2008; 18:137-47. PMID: 20020910. DOI: 10.1080/15376510701857056

13. Systems toxicology: using the systems biology approach to assess chemical pollutants in the environment. Comparative Toxicogenomics 2008. DOI: 10.1016/s1872-2423(08)00007-0