1. Arbulú I, Razumova M, Rey-Maquieira J, Sastre F. Measuring risks and vulnerability of tourism to the COVID-19 crisis in the context of extreme uncertainty: The case of the Balearic Islands. Tourism Management Perspectives 2021;39:100857. [PMID: 34580625] [PMCID: PMC8458003] [DOI: 10.1016/j.tmp.2021.100857]
Abstract
The COVID-19 crisis is dramatically affecting the world economy and, particularly, the tourism sector. In the context of extreme uncertainty, the use of probabilistic forecasting models is especially suitable. We use Monte Carlo simulations to evaluate the outcomes of four possible tourism demand recovery scenarios in the Balearic Islands, which are further used to measure the risks and vulnerability of the Balearic economy to the COVID-19 crisis. Our results show that fear of contagion and loss of income in tourism-emitting countries will result in a maximum 89% drop in arrivals in the Balearic Islands in 2020. Given that most tourism-related occupations are not highly skilled and are characterized by lower salaries, there are greater risks of loss of welfare, especially for women, who make up a major share of the tourism labour force. The model shows important differences among minimum, average and maximum estimates for tourism sector production in 2021, reflecting considerable uncertainty regarding the speed of the sector's recovery. The results serve as a basis for preparing a range of policies to reduce destination vulnerability under different crisis outcomes.
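As a rough illustration of the scenario-based Monte Carlo approach described above, the sketch below draws a recovery scenario at random and then a drop in arrivals within it; the scenario probabilities and drop ranges are invented for illustration, not the paper's calibrated inputs.

```python
import random

random.seed(42)

# Hypothetical recovery scenarios: (probability, min drop, max drop) in 2020
# arrivals. These numbers are illustrative only; the paper calibrates its own.
scenarios = [
    (0.25, 0.60, 0.70),  # fast recovery
    (0.35, 0.70, 0.80),  # moderate recovery
    (0.30, 0.80, 0.87),  # slow recovery
    (0.10, 0.87, 0.89),  # worst case, capped at the 89% maximum drop
]

def simulate_drop():
    """Pick a scenario by its probability, then draw a uniform drop within it."""
    r = random.random()
    cumulative = 0.0
    for prob, lo, hi in scenarios:
        cumulative += prob
        if r <= cumulative:
            return random.uniform(lo, hi)
    return random.uniform(*scenarios[-1][1:])

draws = [simulate_drop() for _ in range(10_000)]
mean_drop = sum(draws) / len(draws)
max_drop = max(draws)
print(f"mean drop: {mean_drop:.1%}, max drop: {max_drop:.1%}")
```

The spread between the scenario outcomes, not the point estimate, is what feeds the risk and vulnerability measures.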
Affiliation(s)
- Italo Arbulú: Department of Applied Economics, University of the Balearic Islands, Cra. de Valldemossa, km 7.5, Palma de Mallorca, Baleares 07122, Spain; Department of Economics, Universidad del Pacífico, 2020 Salaverry Ave, Jesús María, Lima, Peru
- Maria Razumova: Felipe Moreno University College of Tourism, University of the Balearic Islands, C/ Sol, 3, Palma de Mallorca, Balearic Islands 07001, Spain
- Javier Rey-Maquieira: Department of Applied Economics, University of the Balearic Islands, Cra. de Valldemossa, km 7.5, Palma de Mallorca, Baleares 07122, Spain
- Francesc Sastre: Department of Applied Economics, University of the Balearic Islands, Cra. de Valldemossa, km 7.5, Palma de Mallorca, Baleares 07122, Spain
2. Williams BK, Brown ED. Scenarios for valuing sample information in natural resources. Methods Ecol Evol 2020. [DOI: 10.1111/2041-210x.13487]
Affiliation(s)
- Eleanor D. Brown: Science and Decisions Center, U.S. Geological Survey, Reston, VA, USA
3. Guillou S, Membré JM. Inactivation of Listeria monocytogenes, Staphylococcus aureus, and Salmonella enterica under High Hydrostatic Pressure: A Quantitative Analysis of Existing Literature Data. J Food Prot 2019;82:1802-1814. [PMID: 31545104] [DOI: 10.4315/0362-028x.jfp-19-132]
Abstract
High hydrostatic pressure processing (HPP) is a mild preservation technique, and its use for processing foods has been widely documented in the literature. However, very few quantitative synthesis studies have been conducted to gather and analyze bacterial inactivation data to identify the mechanisms of HPP-induced bacterial inactivation. The purpose of this study was to conduct a quantitative analysis of three-decimal reduction times (t3δ) from a large set of existing studies to determine the main influencing factors of HPP-induced inactivation of three foodborne pathogens (Listeria monocytogenes, Staphylococcus aureus, and Salmonella enterica) in various foods. Inactivation kinetics data sets from 1995 to 2017 were selected, and t3δ values were first estimated by using the nonlinear Weibull model. Bayesian inference was then used within a metaregression analysis to build and test several models and submodels. The best model (lowest error and most parsimonious) was a hierarchical mixed-effects model including pressure intensity, temperature, study, pH, species, and strain as explicative variables and significant factors. Values for t3δ and ZP associated with inactivation under HPP were estimated for each bacterial pathogen, with their associated variability. Interstudy variability explained most of the variability in t3δ values. Strain variability was also important and exceeded interstudy variability for S. aureus, which prevented the development of an overall model for this pathogen. Meta-analysis is not often used in food microbiology but was a valuable quantitative tool for modeling inactivation of L. monocytogenes and Salmonella in response to HPP treatment. Results of this study could be useful for refining quantitative assessment of the effects of HPP on vegetative foodborne pathogens or for more precisely designing costly and labor-intensive experiments with foodborne pathogens.
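The three-decimal reduction time under the Weibull inactivation model used above has a closed form; a sketch follows with illustrative parameters, not values fitted in the study.

```python
def weibull_log_reduction(t, delta, p):
    """Weibull survival model: log10(N/N0) = -(t/delta)**p."""
    return -((t / delta) ** p)

def t3d(delta, p):
    """Time to a 3-log10 (three-decimal) reduction: solve (t/delta)**p = 3."""
    return delta * 3 ** (1.0 / p)

# Illustrative parameters (not fitted to the paper's data set):
delta, p = 2.0, 0.8  # delta in minutes, p the shape parameter
t = t3d(delta, p)
print(round(t, 2))                                   # time to 3-log reduction
print(round(weibull_log_reduction(t, delta, p), 6))  # -3.0 by construction
```

A shape parameter p < 1 gives the concave "tailing" kinetics for which t3δ is a more robust summary than a single D-value.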
Affiliation(s)
- Sandrine Guillou: SECALIM, INRA, Oniris, Université Bretagne Loire, Nantes 44307, France (ORCID: 0000-0002-0607-9229)
- Jeanne-Marie Membré: SECALIM, INRA, Oniris, Université Bretagne Loire, Nantes 44307, France
4. Williams BK, Johnson FA. Value of sample information in dynamic, structurally uncertain resource systems. PLoS One 2018;13:e0199326. [PMID: 29958290] [PMCID: PMC6025880] [DOI: 10.1371/journal.pone.0199326]
Abstract
Few if any natural resource systems are completely understood and fully observed. Instead, there almost always is uncertainty about the way a system works and its status at any given time, which can limit effective management. A natural approach to uncertainty is to allocate time and effort to the collection of additional data, on the reasonable assumption that more information will facilitate better understanding and lead to better management. But the collection of more data, either through observation or investigation, requires time and effort that often can be put to other conservation activities. An important question is whether the use of limited resources to improve understanding is justified by the resulting potential for improved management. In this paper we address directly a change in value from new information collected through investigation. We frame the value of information in terms of learning through the management process itself, as well as learning through investigations that are external to the management process but add to our base of understanding. We provide a conceptual framework and metrics for this issue, and illustrate them with examples involving Florida scrub-jays (Aphelocoma coerulescens).
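The value-of-information logic above can be illustrated with a minimal expected-value-of-perfect-information (EVPI) calculation; the two actions, the two candidate system models, and the payoffs are hypothetical, not taken from the scrub-jay example.

```python
# Two candidate management actions whose payoff depends on which of two
# system models is true; all numbers are hypothetical illustrations.
payoff = {  # payoff[action][model]
    "act_A": {"model_1": 10.0, "model_2": 2.0},
    "act_B": {"model_1": 6.0,  "model_2": 7.0},
}
p_model_1 = 0.5  # current belief that model 1 describes the system

def expected_payoff(action):
    """Expected payoff of an action under current model uncertainty."""
    return (payoff[action]["model_1"] * p_model_1
            + payoff[action]["model_2"] * (1 - p_model_1))

# Best achievable value now: commit to the action that is best on average.
v_now = max(expected_payoff(a) for a in payoff)

# With perfect information we would learn the true model first, then pick
# the best action for it; average that value over current beliefs.
v_perfect = (p_model_1 * max(payoff[a]["model_1"] for a in payoff)
             + (1 - p_model_1) * max(payoff[a]["model_2"] for a in payoff))

evpi = v_perfect - v_now  # expected value of perfect information
print(v_now, v_perfect, evpi)  # 6.5 8.5 2.0
```

EVPI bounds from above what any investigation can be worth: if a study costs more than this in the currency of the payoffs, it cannot pay for itself.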
Affiliation(s)
- Byron K. Williams: Renewable Resources Associates, Oakton, VA, USA
- Fred A. Johnson: Wetland and Aquatic Research Center, U.S. Geological Survey, Gainesville, FL, USA
5. Ades AE, Lu G, Claxton K. Expected Value of Sample Information Calculations in Medical Decision Modeling. Med Decis Making 2004;24:207-27. [PMID: 15090106] [DOI: 10.1177/0272989x04263162]
Abstract
There has been increasing interest in using expected value of information (EVI) theory in medical decision making, both to identify the need for further research to reduce uncertainty in decisions and as a tool for sensitivity analysis. Expected value of sample information (EVSI) has been proposed for determining optimum sample sizes and allocation rates in randomized clinical trials. This article derives simple Monte Carlo, or nested Monte Carlo, methods that extend the use of EVSI calculations to medical decision applications with multiple sources of uncertainty, with particular attention to the form in which epidemiological data and research findings are structured. In particular, information on key decision parameters such as treatment efficacy is invariably available on measures of relative efficacy such as risk differences or odds ratios, but not on model parameters themselves. In addition, estimates of model parameters and of relative effect measures in the literature may be heterogeneous, reflecting additional sources of variation besides statistical sampling error. The authors describe Monte Carlo procedures for calculating EVSI for probability, rate, or continuous variable parameters in multiparameter decision models, and approximate methods for relative measures such as risk differences, odds ratios, risk ratios, and hazard ratios. Where prior evidence is based on a random effects meta-analysis, the authors describe different EVSI calculations, one relevant for decisions concerning a specific patient group and the other for decisions concerning the entire population of patient groups. They also consider EVSI methods for new studies intended to update information on both baseline treatment efficacy and the relative efficacy of 2 treatments. Although there are restrictions regarding models with prior correlation between parameters, these methods can be applied to the majority of probabilistic decision models. Illustrative worked examples of EVSI calculations are given in an appendix.
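A minimal sketch of an EVSI calculation in the spirit described above, using a conjugate normal model so the inner (posterior) expectation is available in closed form rather than by a nested Monte Carlo loop; every parameter value here is an illustrative assumption.

```python
import random
import statistics

random.seed(1)

# Decision: "adopt" (net benefit theta) vs "reject" (0). A priori
# theta ~ Normal(mu0, sd0**2); a proposed trial of size n observes theta
# with known per-observation noise. All values are assumed for illustration.
mu0, sd0 = 0.5, 1.0     # prior mean and sd of incremental net benefit
samp_sd, n = 2.0, 50    # sampling sd per observation, proposed sample size

prior_value = max(mu0, 0.0)  # value of the best decision on current evidence

def posterior_mean(xbar):
    """Precision-weighted conjugate-normal update with known variance."""
    prec0, prec_data = 1 / sd0**2, n / samp_sd**2
    return (prec0 * mu0 + prec_data * xbar) / (prec0 + prec_data)

outer = 20_000
preposterior = []
for _ in range(outer):
    theta = random.gauss(mu0, sd0)                # draw a "true" theta
    xbar = random.gauss(theta, samp_sd / n**0.5)  # simulate the trial mean
    preposterior.append(max(posterior_mean(xbar), 0.0))  # decide after data

evsi = statistics.mean(preposterior) - prior_value
print(f"EVSI ≈ {evsi:.3f}")
```

With non-conjugate models the inner expectation itself needs simulation, which is exactly the nested Monte Carlo case the article addresses.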
Affiliation(s)
- A E Ades: Medical Research Council Health Services Research Collaboration, Bristol, United Kingdom
6. Dagnas S, Gougouli M, Onno B, Koutsoumanis KP, Membré JM. Quantifying the effect of water activity and storage temperature on single spore lag times of three moulds isolated from spoiled bakery products. Int J Food Microbiol 2016;240:75-84. [PMID: 27325576] [DOI: 10.1016/j.ijfoodmicro.2016.06.013]
Abstract
The inhibitory effect of water activity (aw) and storage temperature on single spore lag times of Aspergillus niger, Eurotium repens (Aspergillus pseudoglaucus) and Penicillium corylophilum strains isolated from spoiled bakery products was quantified. A full factorial design was set up for each strain. Data were collected at aw levels varying from 0.80 to 0.98 and temperatures from 15 to 35°C. Experiments were performed on malt agar at pH 5.5. When growth was observed, ca 20 individual growth kinetics per condition were recorded for up to 35 days. The radius of the colony vs time was then fitted with the Buchanan primary model. For each experimental condition, variability in lag time was observed; it was characterized by its mean, standard deviation (sd) and 5th percentile after a Normal distribution fit. As the environmental conditions became more stressful (i.e. lower storage temperature and aw), the mean and sd of the single spore lag time distribution increased, indicating longer lag times and higher variability. The relationship between mean and sd followed a monotonic but not linear pattern, identical whatever the species. Next, secondary models were deployed to estimate the cardinal values (minimal, optimal and maximal temperatures, and the minimal water activity below which growth is no longer observed) for the three species. This confirmed the observations made from the raw data analysis: concerning the temperature effect, A. niger behaviour was significantly different from that of E. repens and P. corylophilum, with a Topt of 37.4°C (standard deviation 1.4°C) instead of 27.1°C (1.4°C) and 25.2°C (1.2°C), respectively. Concerning the aw effect, of the three mould species, E. repens was the one able to grow at the lowest aw (awmin estimated at 0.74 (0.02)). Finally, results obtained with single spores were compared to findings from a previous study carried out at the population level (Dagnas et al., 2014).
For short lag times (≤5 days), there was no difference between the lag time of the population (ca 2000 spores inoculated in one spot) and the mean (or 5th percentile) of the single spore lag time distribution. In contrast, when lag times were longer, i.e. under more stressful conditions, there was a discrepancy between individual and population lag times (population lag times shorter than the 5th percentiles of the single spore lag time distribution), confirming a stochastic process. Finally, the temperature cardinal values estimated with single spores were found to be similar to those obtained at the population level, whatever the species. All these findings will be used to better describe mould spore lag time variability and thus to predict bakery product shelf-life more accurately.
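Cardinal values like those reported above are typically estimated with a secondary model such as Rosso's cardinal temperature model with inflection (CTMI); a sketch follows, where Topt is the A. niger value from the abstract and Tmin/Tmax are illustrative assumptions.

```python
def ctmi(T, Tmin, Topt, Tmax):
    """Rosso cardinal temperature model with inflection (CTMI).

    Returns the relative rate gamma in [0, 1]: 0 at the cardinal limits,
    1 at the optimum temperature.
    """
    if T <= Tmin or T >= Tmax:
        return 0.0
    num = (T - Tmax) * (T - Tmin) ** 2
    den = (Topt - Tmin) * ((Topt - Tmin) * (T - Topt)
                           - (Topt - Tmax) * (Topt + Tmin - 2 * T))
    return num / den

# Topt = 37.4 C for A. niger as reported above; Tmin and Tmax are made up.
Tmin, Topt, Tmax = 8.0, 37.4, 45.0
print(round(ctmi(Topt, Tmin, Topt, Tmax), 3))  # 1.0 at the optimum
print(round(ctmi(20.0, Tmin, Topt, Tmax), 3))  # reduced rate below optimum
```

Multiplying such gamma factors for temperature and aw is the usual way these secondary models scale an optimal rate (or shorten a minimal lag) to given storage conditions.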
Affiliation(s)
- Stéphane Dagnas: L'Université Nantes Angers Le Mans, Oniris, Nantes F-44322 cedex 3, France
- Maria Gougouli: Laboratory of Food Microbiology and Hygiene, Department of Food Science and Technology, School of Agriculture, Aristotle University of Thessaloniki, Thessaloniki 54124, Greece; Department of Food Science and Technology, Perrotis College, American Farm School, Thessaloniki 55102, Greece
- Bernard Onno: L'Université Nantes Angers Le Mans, Oniris, Nantes F-44322 cedex 3, France
- Konstantinos P Koutsoumanis: Laboratory of Food Microbiology and Hygiene, Department of Food Science and Technology, School of Agriculture, Aristotle University of Thessaloniki, Thessaloniki 54124, Greece
- Jeanne-Marie Membré: UMR1014 SECALIM, INRA, Oniris, 44307 Nantes, France; L'Université Nantes Angers Le Mans, Oniris, Nantes F-44322 cedex 3, France
7. Maxwell SL, Rhodes JR, Runge MC, Possingham HP, Ng CF, McDonald-Madden E. How much is new information worth? Evaluating the financial benefit of resolving management uncertainty. J Appl Ecol 2014. [DOI: 10.1111/1365-2664.12373]
Affiliation(s)
- Sean L. Maxwell: ARC Centre of Excellence for Environmental Decisions, and School of Geography, Planning and Environmental Management, The University of Queensland, St Lucia, Qld 4072, Australia
- Jonathan R. Rhodes: ARC Centre of Excellence for Environmental Decisions; School of Geography, Planning and Environmental Management; and NERP Environmental Decisions Hub, The University of Queensland, St Lucia, Qld 4072, Australia
- Michael C. Runge: US Geological Survey, Patuxent Wildlife Research Center, 12100 Beech Forest Road, Laurel, MD 20708, USA
- Hugh P. Possingham: ARC Centre of Excellence for Environmental Decisions; NERP Environmental Decisions Hub; and School of Mathematics and Physics, The University of Queensland, St Lucia, Qld 4072, Australia; Department of Life Sciences, Imperial College London, Silwood Park, Ascot SL5 7PY, Berkshire, UK
- Chooi Fei Ng: ARC Centre of Excellence for Environmental Decisions, and School of Mathematics and Physics, The University of Queensland, St Lucia, Qld 4072, Australia; CSIRO Ecosystem Sciences, Brisbane, Qld 4102, Australia
- Eve McDonald-Madden: ARC Centre of Excellence for Environmental Decisions, and School of Geography, Planning and Environmental Management, The University of Queensland, St Lucia, Qld 4072, Australia; CSIRO Ecosystem Sciences, Brisbane, Qld 4102, Australia
8. Dagnas S, Onno B, Membré JM. Modeling growth of three bakery product spoilage molds as a function of water activity, temperature and pH. Int J Food Microbiol 2014;186:95-104. [DOI: 10.1016/j.ijfoodmicro.2014.06.022]
9. Diao MM, André S, Membré JM. Meta-analysis of D-values of proteolytic Clostridium botulinum and its surrogate strain Clostridium sporogenes PA 3679. Int J Food Microbiol 2014;174:23-30. [PMID: 24448274] [DOI: 10.1016/j.ijfoodmicro.2013.12.029]
Abstract
Foodborne botulism is a serious disease resulting from ingestion of preformed Clostridium botulinum neurotoxin in foodstuffs. Since the 19th century, the heat resistance of this spore-forming bacterium has been extensively studied in order to safeguard the public health associated with low-acid, ambient-stable products. The heat resistance parameters most widely used in the thermal settings of such products are the D121.1°C value (the time required for a 10-fold decrease of the spore count at 121.1°C) and the z-value (the temperature increase required for a 10-fold decrease of D-values). To determine the D121.1°C and z-values of proteolytic C. botulinum and its nontoxigenic surrogate strain C. sporogenes PA 3679, a dataset of 911 D-values was collected from 38 scientific studies. Within a meta-analysis framework, a mixed-effect linear model was developed with the log D-value (min) as response and the heat treatment temperature as explicative variable. The studies (38), the C. botulinum strains (11), and the heat treatment media (liquid media and various food matrices, split into nine categories in total) were considered co-variables having a random effect. The species (C. botulinum and C. sporogenes) and the pH (five categories) were considered co-variables having a fixed effect. Overall, the model gave satisfactory results, with a residual standard deviation of 0.22. The heat resistance of proteolytic C. botulinum was found to be significantly lower than that of C. sporogenes PA 3679: the mean D-values at the reference temperature of 121.1°C, in liquid media at neutral pH, were estimated at 0.19 and 1.28 min for C. botulinum and C. sporogenes, respectively. On the other hand, the mean z-values of the two species were similar: 11.3 and 11.1°C for C. botulinum and C. sporogenes, respectively. These results will be applied to thermal settings of low-acid ambient-stable products.
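The D- and z-value parametrization described above is the classical Bigelow log-linear model; the sketch below uses the mean estimates quoted in the abstract for proteolytic C. botulinum (D121.1 = 0.19 min, z = 11.3°C).

```python
def d_value(T, D_ref, z, T_ref=121.1):
    """Bigelow model: log10(D) is linear in temperature with slope -1/z."""
    return D_ref * 10 ** ((T_ref - T) / z)

def log_reductions(T, minutes, D_ref, z):
    """Decimal reductions achieved by holding `minutes` at temperature T."""
    return minutes / d_value(T, D_ref, z)

# Mean estimates reported above for proteolytic C. botulinum in liquid
# media at neutral pH:
D_ref, z = 0.19, 11.3
print(round(d_value(115.0, D_ref, z), 2))               # D-value at 115 C, min
print(round(log_reductions(121.1, 2.28, D_ref, z), 1))  # 12.0 decimal reductions
```

With these means, holding 2.28 min at 121.1°C delivers a 12-D reduction, which is why surrogate-vs-pathogen differences in D-values matter directly for process settings.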
Affiliation(s)
- Mamadou Moctar Diao: INRA, UMR1014 Secalim, 44322 Nantes Cedex 3, France; LUNAM Université, Oniris, Nantes, France
- Stéphane André: CTCPA, Unité de microbiologie, ZA de l'aéroport, 84911 Avignon, France
- Jeanne-Marie Membré: INRA, UMR1014 Secalim, 44322 Nantes Cedex 3, France; LUNAM Université, Oniris, Nantes, France
10. Shao K, Small MJ. Potential uncertainty reduction in model-averaged benchmark dose estimates informed by an additional dose study. Risk Anal 2011;31:1561-1575. [PMID: 21388425] [DOI: 10.1111/j.1539-6924.2011.01595.x]
Abstract
A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted.
Affiliation(s)
- Kan Shao: Civil and Environmental Engineering, Porter Hall 119, Frew St., Pittsburgh, PA 15213, USA
11. Membré JM, Laroche M, Magras C. Assessment of levels of bacterial contamination of large wild game meat in Europe. Food Microbiol 2011;28:1072-9. [DOI: 10.1016/j.fm.2011.02.015]
12. A probabilistic approach to determine thermal process setting parameters: application for commercial sterility of products. Int J Food Microbiol 2010;144:413-20. [PMID: 21111502] [DOI: 10.1016/j.ijfoodmicro.2010.10.028]
Abstract
The objective of the probabilistic data analysis presented in this paper was to enable the thermal process to be set on actual data rather than on generic or conservative rules. The application was an ambient-stable soup product heated in a continuous UHT line. The data set comes from a decade of microbiological analysis: the initial spore load and the surviving spore concentration after moderate heat treatment (100°C for 15 min and 110°C for 15 min) were enumerated in forty-eight ingredients. The probabilistic analysis was carried out within a risk-based context, considering a Performance Objective (PO) set after the heat-treatment process and an initial spore contamination (H₀) at the ingredient mixing step. The probabilistic analysis was based upon Bayesian inference, chosen for its flexibility when dealing with censored data (some values were reported as less than 1 log cfu/g) and also for its ability to incorporate prior information in the data analysis. Beforehand, z-values around 10°C for aerobic bacterial spores, and a log count error around 1 log, were assumed. The methodology and the results are reported using two ingredients (garlic and milk powder), illustrating the 'not detected' (censored data) issue and also the inter-ingredient variability. Indeed, z was estimated at 13.6°C (mean) for spores selected from garlic and 5.9°C for those selected from milk powder. Based upon a hypothetical soup recipe with these two ingredients, the sterilization value was estimated at 13 min (95th percentile). The potential use of similar methodology to design and set the sterilization value for the thermal process of future products is discussed.
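A sterilization value like the 13 min quoted above is obtained by integrating lethal rates over a time-temperature profile; the sketch below uses the z ≈ 10°C assumption mentioned in the abstract, but the profile itself is invented for illustration.

```python
def f0(profile, z=10.0, T_ref=121.1):
    """Sterilization value (equivalent minutes at T_ref) by trapezoidal
    integration of the lethal rate L(T) = 10**((T - T_ref)/z) over a
    piecewise-linear (time_min, temp_C) profile."""
    total = 0.0
    for (t0, T0), (t1, T1) in zip(profile, profile[1:]):
        L0 = 10 ** ((T0 - T_ref) / z)
        L1 = 10 ** ((T1 - T_ref) / z)
        total += 0.5 * (L0 + L1) * (t1 - t0)
    return total

# Made-up UHT-style profile: ramp up, hold 3 min at 121.1 C, ramp down.
profile = [(0, 90), (1, 121.1), (4, 121.1), (5, 90)]  # (minutes, deg C)
print(round(f0(profile), 2))
```

Because the lethal rate is exponential in temperature, the hold phase dominates the integral and the ramps contribute only marginally.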
13.
Abstract
The willingness to view risk as part of daily life has vanished. A risk-averse mindset among environmental regulators engenders confusion between the ethics of intention and the ethics of consequence, leading to the elevation of the precautionary principle with unintended and often unfortunate outcomes. Environmental risk assessment is conservative, but the actual level of conservatism cannot be determined. High-end exposure assumptions and current toxicity criteria from the USEPA, based on linear extrapolation for carcinogens and default uncertainty factors for systemic toxicants, obscure the degree of conservatism in risk assessments. Ideally, one could choose a percentile of the target population to include within environmental standards, but this choice is complicated by the food, pharmaceutical and advertising industries, whose activities, inadvertent or not, often promote maladaptive and unhealthy lifestyle choices. There has lately been much discussion about background exposures and disease processes and their potential to increase the risk from environmental chemicals. Should these background exposures or disease processes, especially those associated with maladaptive individual choices, be included as part of a regulatory risk evaluation? A significant ethical question is whether environmental regulation should protect those pursuing a self-destructive lifestyle that may add to or synergize with otherwise innocuous environmental exposures. Choosing a target percentile of protection would provide an increased level of transparency and the flexibility to choose a higher or lower percentile if such a choice is warranted. Transparency and flexibility will lead to more responsive environmental regulation that balances protection of public health and the stewardship of societal resources.
Affiliation(s)
- Ted Simon: Ted Simon LLC, Winston, GA 30187, USA
14.
Abstract
Site investigations of contaminated land are associated with high costs. From a societal perspective, just enough economic resources should be spent on investigations so that society's limited resources can be used optimally. The solution is to design investigation programs that are cost effective, which can be performed using value of information analysis (VOIA). The principle of VOIA is to compare the benefit at the present state of knowledge with the benefit that is expected after an investigation has been performed. A framework for VOIA of site investigations is presented based on Bayesian risk-cost-benefit decision analysis. The result is an estimate of the value of an investigation program and, for specific problems, the optimal number of samples. The main strength of the methodology is that it promotes clear thinking and compels the decision maker to reflect on issues that otherwise would be ignored. The main weakness is the complexity of VOIA models.
15. Baresel C, Destouni G. Uncertainty-accounting environmental policy and management of water systems. Environ Sci Technol 2007;41:3653-9. [PMID: 17547192] [DOI: 10.1021/es061515e]
Abstract
Environmental policies for water quality and ecosystem management do not commonly require explicit stochastic accounts of uncertainty and risk associated with the quantification and prediction of waterborne pollutant loads and abatement effects. In this study, we formulate and investigate a possible environmental policy that does require an explicit stochastic uncertainty account. We compare both the environmental and economic resource allocation performance of such an uncertainty-accounting environmental policy with that of deterministic, risk-prone and risk-averse environmental policies under a range of different hypothetical, yet still possible, scenarios. The comparison indicates that a stochastic uncertainty-accounting policy may perform better than deterministic policies over a range of different scenarios. Even in the absence of reliable site-specific data, reported literature values appear to be useful for such a stochastic account of uncertainty.
Affiliation(s)
- Christian Baresel: Department of Land and Water Resources Engineering, Royal Institute of Technology, Brinellvägen 32, 100 44 Stockholm, Sweden
16. Back PE. A model for estimating the value of sampling programs and the optimal number of samples for contaminated soil. Environmental Geology 2006. [DOI: 10.1007/s00254-006-0488-6]
17. Ades AE, Claxton K, Sculpher M. Evidence synthesis, parameter correlation and probabilistic sensitivity analysis. Health Econ 2006;15:373-81. [PMID: 16389628] [DOI: 10.1002/hec.1068]
Abstract
Over the last decade or so, there have been many developments in methods to handle uncertainty in cost-effectiveness studies. In decision modelling, it is widely accepted that there needs to be an assessment of how sensitive the decision is to uncertainty in parameter values. The rationale for probabilistic sensitivity analysis (PSA) is primarily based on a consideration of the needs of decision makers in assessing the consequences of decision uncertainty. In this paper, we highlight some further compelling reasons for adopting probabilistic methods for decision modelling and sensitivity analysis, and specifically for adopting simulation from a Bayesian posterior distribution. Our reasoning is as follows. Firstly, cost-effectiveness analyses need to be based on all the available evidence, not a selected subset, and the uncertainties in the data need to be propagated through the model in order to provide a correct analysis of the uncertainties in the decision. In many--perhaps most--cases the evidence structure requires a statistical analysis that inevitably induces correlations between parameters. Deterministic sensitivity analysis requires that models are run with parameters fixed at 'extreme' values, but where parameter correlation exists it is not possible to identify sets of parameter values that can be considered 'extreme' in a meaningful sense. However, a correct probabilistic analysis can be readily achieved by Monte Carlo sampling from the joint posterior distribution of parameters. In this paper, we review some evidence structures commonly occurring in decision models, where analyses that correctly reflect the uncertainty in the data induce correlations between parameters. Frequently, this is because the evidence base includes information on functions of several parameters. 
It follows that, if health technology assessments are to be based on a correct analysis of all available data, then probabilistic methods must be used both for sensitivity analysis and for estimation of expected costs and benefits.
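The paper's central point, that deterministic "extreme-value" sensitivity analysis misleads when parameters are correlated while Monte Carlo sampling from the joint posterior does not, can be sketched as follows. The two-parameter model, posterior, and all numbers below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assume a joint posterior for two cost-effectiveness parameters that are
# negatively correlated (e.g. induced by a shared evidence source).
mean = np.array([0.5, 0.3])
cov = np.array([[0.01, -0.008],
                [-0.008, 0.01]])

def net_benefit(theta):
    # Toy decision model: net benefit depends on the parameter sum.
    return theta[..., 0] + theta[..., 1]

# Probabilistic sensitivity analysis: propagate the joint posterior.
draws = rng.multivariate_normal(mean, cov, size=100_000)
nb = net_benefit(draws)

# Deterministic "extremes": fix each parameter at mean - 2 sd,
# ignoring the correlation between them.
sd = np.sqrt(np.diag(cov))
worst_det = net_benefit(mean - 2 * sd)  # treats both lows as if they co-occur
worst_mc = np.quantile(nb, 0.023)       # ~2-sd lower tail of the joint draws
```

Because the parameters are negatively correlated, the naive deterministic worst case (`worst_det`) is more pessimistic than the correct lower tail of the joint distribution (`worst_mc`), which is exactly the failure of "extreme" parameter sets the abstract describes.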
Affiliation(s)
- A E Ades
- MRC Health Services Collaboration, Canynge Hall, England, UK.
18
Catelinois O, Laurier D, Verger P, Rogel A, Colonna M, Ignasiak M, Hémon D, Tirmarche M. Uncertainty and sensitivity analysis in assessment of the thyroid cancer risk related to Chernobyl fallout in Eastern France. RISK ANALYSIS : AN OFFICIAL PUBLICATION OF THE SOCIETY FOR RISK ANALYSIS 2005; 25:243-52. [PMID: 15876201 DOI: 10.1111/j.1539-6924.2005.00586.x] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/02/2023]
Abstract
The increase in the thyroid cancer incidence in France observed over the last 20 years has raised public concern about its association with the 1986 nuclear power plant accident at Chernobyl. At the request of French authorities, a first study sought to quantify the possible risk of thyroid cancer associated with the Chernobyl fallout in France. This study suffered from two limitations. The first involved the lack of knowledge of spontaneous thyroid cancer incidence rates (in the absence of exposure), which was especially necessary to take their trends into account for projections over time; the second was the failure to consider the uncertainties. The aim of this article is to enhance the initial thyroid cancer risk assessment for the period 1991-2007 in the area of France most exposed to the fallout (i.e., eastern France) and thereby mitigate these limitations. We consider the changes over time in the incidence of spontaneous thyroid cancer and conduct both uncertainty and sensitivity analyses. The number of spontaneous thyroid cancers was estimated from French cancer registries on the basis of two scenarios: one with a constant incidence, the other using the trend observed. Thyroid doses were estimated from all available data about contamination in France from Chernobyl fallout. Results from a 1995 pooled analysis published by Ron et al. were used to determine the dose-response relation. Depending on the scenario, the number of spontaneous thyroid cancer cases ranges from 894 (90% CI: 869-920) to 1,716 (90% CI: 1,691-1,741). The number of excess thyroid cancer cases predicted ranges from 5 (90% UI: 1-15) to 63 (90% UI: 12-180). All of the assumptions underlying the thyroid cancer risk assessment are discussed.
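The study's uncertainty analysis propagates an uncertain dose-response slope through a baseline cancer burden. A minimal Monte Carlo sketch with invented inputs (not the paper's doses, slope, or case counts) looks like:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim = 50_000

spontaneous_cases = 1000.0   # assumed spontaneous thyroid cancer burden
mean_dose_gy = 0.01          # assumed mean thyroid dose (Gy)

# Uncertain excess relative risk per Gy; lognormal purely by assumption.
err_per_gy = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=n_sim)

excess = spontaneous_cases * err_per_gy * mean_dose_gy
lo, hi = np.quantile(excess, [0.05, 0.95])
print(f"90% UI for excess cases: {lo:.1f} to {hi:.1f}")
```

The wide interval produced by the uncertain slope mirrors how the paper's predicted excess cases span an order of magnitude while the spontaneous-case intervals stay narrow.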
Affiliation(s)
- Olivier Catelinois
- Institute for Radiation Protection and Nuclear Safety (IRSN), BP 17, F-92262 Fontenay-aux-Roses, Cedex, France.
19
Yokota F, Gray G, Hammitt JK, Thompson KM. Tiered chemical testing: a value of information approach. RISK ANALYSIS : AN OFFICIAL PUBLICATION OF THE SOCIETY FOR RISK ANALYSIS 2004; 24:1625-1639. [PMID: 15660617 DOI: 10.1111/j.0272-4332.2004.00555.x] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/24/2023]
Abstract
In December 2000, the EPA initiated the Voluntary Children's Chemical Evaluation Program (VCCEP) by asking manufacturers to voluntarily sponsor toxicological testing in a tiered process for 23 chemicals selected for the pilot phase. The tiered nature of the VCCEP pilot program creates the need for clearly defined criteria for determining when information is sufficient to assess the potential risks to children. This raises questions about how to determine the "adequacy" of the existing information and assess the need to undertake efforts to reduce uncertainty (through further testing). This article applies a value of information analysis approach to determine adequacy by modeling how toxicological and exposure data collected through the VCCEP may be used to inform risk management decisions. The analysis demonstrates the importance of information about the exposure level and control costs in making decisions regarding further toxicological testing. This article accounts for the cost of delaying control action and identifies the optimal testing strategy for a constrained decision maker who, absent applicable human data, cannot regulate without bioassay data on a specific chemical. It also quantifies the differences in optimal testing strategy for three decision criteria: maximizing societal net benefits, ensuring maximum exposure control while net benefits are positive (i.e., benefits outweigh costs), and controlling to the maximum extent technologically feasible while the lifetime risk of cancer exceeds a specific level of risk. Finally, this article shows the large differences in net benefits among the three criteria over the range of exposure levels where the optimal actions differ.
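The value-of-information logic behind tiered testing can be sketched as an expected value of perfect information (EVPI) calculation, an upper bound on what any bioassay is worth before deciding on control. All distributions and dollar figures below are assumptions for illustration, not the article's values:

```python
import numpy as np

rng = np.random.default_rng(2)

# Uncertain lifetime cancer risk from exposure to the chemical.
risk = rng.lognormal(np.log(1e-5), 1.0, size=200_000)

VALUE_PER_RISK_AVERTED = 7e6 * 1e3   # assumed: $7M per death x 1000 exposed
CONTROL_COST = 50_000.0              # assumed cost of exposure control

def net_benefit(act, r):
    # Net societal benefit of controlling (act=1) vs not (act=0).
    return act * (VALUE_PER_RISK_AVERTED * r - CONTROL_COST)

# Decide now: pick the action with the best expected net benefit.
enb_now = max(net_benefit(0, risk).mean(), net_benefit(1, risk).mean())
# Decide with perfect information: pick the best action for each draw.
enb_perfect = np.maximum(net_benefit(0, risk), net_benefit(1, risk)).mean()

evpi = enb_perfect - enb_now   # upper bound on the value of any test
```

If `evpi` is small relative to testing cost plus the cost of delayed control, further tiers of testing are not worthwhile; that trade-off is what the article formalizes.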
Affiliation(s)
- Fumie Yokota
- Office of Management and Budget, Washington, DC, USA
20
Yokota F, Thompson KM. Value of information literature analysis: a review of applications in health risk management. Med Decis Making 2004; 24:287-98. [PMID: 15155018 DOI: 10.1177/0272989x04263157] [Citation(s) in RCA: 121] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
This article provides the first comprehensive review of value of information (VOI) analyses related to health risk management published in English in peer-reviewed journals by the end of 2001. VOI analysis represents a decision analytic technique that explicitly evaluates the benefit of collecting additional information to reduce or eliminate uncertainty. Through a content analysis of VOI applications, this article characterizes various attributes of VOI applications, shows the evolution of the methodology and advances in computing tools that allow analysis of increasingly complex problems, and suggests the need for some standardization of reporting methods and results. The authors' analysis shows a lack of cross-fertilization across topic areas and the tendency of articles to focus on demonstrating the usefulness of the VOI approach rather than applications to actual management decisions. This article provides important insights for VOI applications in medical decision making.
Affiliation(s)
- Fumie Yokota
- Department of Health Policy and Management, Harvard School of Public Health, Cambridge, MA, USA
21
Yokota F, Thompson KM. Value of information analysis in environmental health risk management decisions: past, present, and future. RISK ANALYSIS : AN OFFICIAL PUBLICATION OF THE SOCIETY FOR RISK ANALYSIS 2004; 24:635-650. [PMID: 15209935 DOI: 10.1111/j.0272-4332.2004.00464.x] [Citation(s) in RCA: 45] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/24/2023]
Abstract
Experts agree that value of information (VOI) analyses provide useful insights in risk management decisions. However, applications in environmental health risk management (EHRM) remain largely demonstrative, because of the complexity of modeling and solving VOI problems. A comprehensive review of all VOI applications published in the peer-reviewed literature shows that the complexity of solving VOI problems with continuous probability distributions as model inputs is the main barrier to greater use of VOI, although simulation allows analysts to solve more complex and realistic problems. Several analytical challenges also inhibit greater use of VOI techniques, including issues related to modeling decisions, valuing outcomes, and appropriately characterizing uncertain and variable model inputs. This review of methods for modeling and solving VOI problems in EHRM provides the first synthesis of important methodological advances in the field. The insights give risk analysts and decision scientists guidance on how to structure and solve VOI problems focused on evaluating opportunities to collect better information to improve EHRM decisions. They further suggest the need for efforts to standardize approaches and develop prescriptive guidance for VOI analysts, similar to existing guidelines for conducting cost-effectiveness analyses.
Affiliation(s)
- Fumie Yokota
- Office of Information and Regulatory Affairs, Washington, DC, USA
22
Sohn MD, McKone TE, Blancato JN. Reconstructing population exposures from dose biomarkers: inhalation of trichloroethylene (TCE) as a case study. JOURNAL OF EXPOSURE ANALYSIS AND ENVIRONMENTAL EPIDEMIOLOGY 2004; 14:204-13. [PMID: 15141149 DOI: 10.1038/sj.jea.7500314] [Citation(s) in RCA: 24] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 04/29/2023]
Abstract
Physiologically based pharmacokinetic (PBPK) modeling is a well-established toxicological tool designed to relate exposure to a target tissue dose. The emergence of federal and state programs for environmental health tracking and the availability of exposure monitoring through biomarkers create the opportunity to apply PBPK models to estimate exposures to environmental contaminants from urine, blood, and tissue samples. However, reconstructing exposures for large populations is complicated by often having too few biomarker samples, large uncertainties about exposures, and large interindividual variability. In this paper, we use an illustrative case study to identify some of these difficulties and to demonstrate a process for confronting them by reconstructing population-scale exposures using Bayesian inference. The application consists of interpreting biomarker data from eight adult males with controlled exposures to trichloroethylene (TCE) as if the biomarkers were random samples from a large population with unknown exposure conditions. The TCE concentrations in blood from the individuals fell into two distinctly different groups even though the individuals were simultaneously in a single exposure chamber. We successfully reconstructed the exposure scenarios for both subgroups, although the reconstruction for one subgroup differs from what is believed to be the true experimental conditions. We were, however, unable to predict with high certainty the concentration of TCE in air.
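A toy version of the Bayesian reconstruction, with a one-parameter linear stand-in for the PBPK model and invented kinetics, noise, and data, can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(3)

K = 0.5        # assumed blood/air proportionality constant (PBPK stand-in)
SIGMA = 0.2    # assumed measurement noise in blood units
true_air = 2.0
blood = K * true_air + rng.normal(0, SIGMA, size=8)   # 8 "subjects"

def log_post(air):
    # Lognormal(0, 1) prior on the unknown air concentration (assumed).
    if air <= 0:
        return -np.inf
    log_prior = -np.log(air) - (np.log(air) ** 2) / 2
    log_lik = -np.sum((blood - K * air) ** 2) / (2 * SIGMA ** 2)
    return log_prior + log_lik

# Random-walk Metropolis sampler for the posterior of the air concentration.
samples, air, lp = [], 1.0, log_post(1.0)
for _ in range(20_000):
    prop = air + rng.normal(0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        air, lp = prop, lp_prop
    samples.append(air)

post = np.array(samples[5000:])   # discard burn-in
print(f"posterior mean air conc: {post.mean():.2f}")
```

The posterior concentrates near the true exposure when the data are informative; the paper's harder problem, where it could not pin down the air concentration with high certainty, corresponds to a posterior that stays wide.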
Affiliation(s)
- Michael D Sohn
- Lawrence Berkeley National Laboratory, Berkeley, California 94720, USA.
23
Ades AE, Lu G. Correlations between parameters in risk models: estimation and propagation of uncertainty by Markov Chain Monte Carlo. RISK ANALYSIS : AN OFFICIAL PUBLICATION OF THE SOCIETY FOR RISK ANALYSIS 2003; 23:1165-1172. [PMID: 14641891 DOI: 10.1111/j.0272-4332.2003.00386.x] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/24/2023]
Abstract
Monte Carlo simulation has become the accepted method for propagating parameter uncertainty through risk models. It is widely appreciated, however, that correlations between input variables must be taken into account if models are to deliver correct assessments of uncertainty in risk. Various two-stage methods have been proposed that first estimate a correlation structure and then generate Monte Carlo simulations, which incorporate this structure while leaving marginal distributions of parameters unchanged. Here we propose a one-stage alternative, in which the correlation structure is estimated from the data directly by Bayesian Markov Chain Monte Carlo methods. Samples from the posterior distribution of the outputs then correctly reflect the correlation between parameters, given the data and the model. Besides its computational simplicity, this approach utilizes the available evidence from a wide variety of structures, including incomplete data and correlated and uncorrelated repeat observations. The major advantage of a Bayesian approach is that, rather than assuming the correlation structure is fixed and known, it captures the joint uncertainty induced by the data in all parameters, including variances and covariances, and correctly propagates this through the decision or risk model. These features are illustrated with examples on emissions of dioxin congeners from solid waste incinerators.
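The one-stage idea, sampling the mean vector and covariance matrix jointly from their posterior and propagating each draw through the model, can be sketched with conjugate normal/inverse-Wishart updating as a stand-in for the authors' Markov Chain Monte Carlo. The data are simulated and the noninformative prior is an assumption:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated repeat observations of two correlated emission parameters.
true_cov = np.array([[1.0, 0.7], [0.7, 1.0]])
data = rng.multivariate_normal([10.0, 5.0], true_cov, size=30)
n, xbar = len(data), data.mean(axis=0)
S = (data - xbar).T @ (data - xbar)   # scatter matrix

def sample_inv_wishart(df, scale):
    # Inverse-Wishart(df, scale) draw: sample W ~ Wishart(df, scale^-1)
    # via the Bartlett decomposition, then invert.
    p = scale.shape[0]
    L = np.linalg.cholesky(np.linalg.inv(scale))
    A = np.zeros((p, p))
    for i in range(p):
        A[i, i] = np.sqrt(rng.chisquare(df - i))
        A[i, :i] = rng.normal(size=i)
    W = L @ A @ A.T @ L.T
    return np.linalg.inv(W)

# Posterior under a noninformative prior (an assumption):
#   Sigma | data ~ Inv-Wishart(n - 1, S),  mu | Sigma ~ N(xbar, Sigma / n)
outputs = []
for _ in range(2000):
    sigma = sample_inv_wishart(n - 1, S)
    mu = rng.multivariate_normal(xbar, sigma / n)
    theta = rng.multivariate_normal(mu, sigma)   # new correlated parameter draw
    outputs.append(theta.sum())                  # toy risk-model output

outputs = np.array(outputs)
```

Each output draw reflects joint uncertainty in the means, variances, and covariance at once, which is the advantage over fixing a correlation matrix in a separate first stage.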
Affiliation(s)
- A E Ades
- MRC Health Services Research Collaboration, Department of Social Medicine, University of Bristol, Canynge Hall, Bristol, United Kingdom
24
Sohn MD, Sextro RG, Gadgil AJ, Daisey JM. Responding to sudden pollutant releases in office buildings: 1. Framework and analysis tools. INDOOR AIR 2003; 13:267-276. [PMID: 12950590 DOI: 10.1034/j.1600-0668.2003.00183.x] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/24/2023]
Abstract
We describe a framework for developing response recommendations to unexpected toxic pollutant releases in commercial buildings. It may be applied in conditions where limited building- and event-specific information is available. The framework is based on a screening-level methodology to develop insights, or rules of thumb, into the behavior of airflow and pollutant transport. A three-stage framework is presented: (1) develop a building taxonomy to identify generic, or prototypical, building configurations; (2) characterize uncertainty and conduct simulation modeling to predict typical airflow and pollutant transport behavior; and (3) rank uncertainty contributions to determine how information obtained at a site might reduce uncertainties in the model predictions. The approach is applied to study a hypothetical pollutant release on the first floor of a five-story office building. Key features that affect pollutant transport are identified and described by value ranges in the building stock. Simulation modeling provides predictions and uncertainty estimates of time-dependent pollutant concentrations, following a release, for a range of indoor and outdoor conditions. In this exercise, we predict that concentrations on the fifth floor are an order of magnitude lower than on the first, that coefficients of variation exceed 2, and that information about HVAC operation and window position does the most to reduce uncertainty in predicted peak concentrations.
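Stage 3 of the framework, ranking uncertainty contributions, can be sketched with rank correlations between Monte Carlo inputs and a model output. The mixing expression and input ranges below are invented placeholders, not the paper's building model:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000

air_change = rng.uniform(0.2, 2.0, n)   # HVAC air changes per hour (assumed range)
leak_frac = rng.uniform(0.01, 0.1, n)   # interzonal leakage fraction (assumed)
source = rng.uniform(50, 150, n)        # release size (assumed)

# Toy steady-state concentration on an upper floor.
conc = source * leak_frac / air_change

def rank_corr(x, y):
    # Spearman-style rank correlation from ranks of the samples.
    rx, ry = x.argsort().argsort(), y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

contrib = {name: abs(rank_corr(x, conc))
           for name, x in [("air_change", air_change),
                           ("leak_frac", leak_frac),
                           ("source", source)]}
ranked = sorted(contrib, key=contrib.get, reverse=True)
print(ranked)
```

The ranking tells a responder which site information (here, the ventilation-related inputs) would most reduce uncertainty in predicted concentrations, mirroring the paper's conclusion about HVAC operation and window position.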
Affiliation(s)
- M D Sohn
- Indoor Environment Department, Lawrence Berkeley National Laboratory, Berkeley, CA 94720, USA.
25
Hill RA, Sendashonga C. General principles for risk assessment of living modified organisms: Lessons from chemical risk assessment. ACTA ACUST UNITED AC 2003; 2:81-8. [PMID: 15612274 DOI: 10.1051/ebr:2003004] [Citation(s) in RCA: 68] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Abstract
Modern biotechnology has led to the development and use of Living Modified Organisms (LMOs) for agriculture and other purposes. Regulators at the national level are increasingly depending on risk assessment as a tool for assessing potential adverse effects of LMOs on the environment and human health. In addition, the Cartagena Protocol on Biosafety, an international agreement expected to enter into force in the near future, requires risk assessment as the basis for decision-making regarding import of some LMOs. While LMO risk assessment is relatively new, there are other risk assessment disciplines which have developed over longer time periods. The field of assessment of the environmental and human health risks of chemicals is particularly well developed, and is similar in application to LMO risk assessment. This paper aims to draw lessons for LMO risk assessment from the vast experience with chemical risk assessment. Seven general principles are outlined which should serve as a useful checklist to guide assessments of risks posed by LMOs.
Affiliation(s)
- Ryan A Hill
- Biosafety Programme, Secretariat of the Convention on Biological Diversity, Montreal, Quebec, H2Y 1N9 Canada.
26
27
Reichert P, Schervish M, Small MJ. An Efficient Sampling Technique for Bayesian Inference With Computationally Demanding Models. Technometrics 2002. [DOI: 10.1198/004017002188618518] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022]
28
Walker KD, Evans JS, MacIntosh D. Use of expert judgment in exposure assessment. Part I. Characterization of personal exposure to benzene. JOURNAL OF EXPOSURE ANALYSIS AND ENVIRONMENTAL EPIDEMIOLOGY 2001; 11:308-22. [PMID: 11571610 DOI: 10.1038/sj.jea.7500171] [Citation(s) in RCA: 21] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/27/1998] [Accepted: 04/25/2001] [Indexed: 04/14/2023]
Abstract
This paper presents the results of the first phase of a study, conducted as an element of the National Human Exposure Assessment Survey (NHEXAS), to demonstrate the use of expert subjective judgment elicitation techniques to characterize the magnitude of and uncertainty in environmental exposure to benzene. In decisions about the value of exposure research or of regulatory controls, the characterization of uncertainty can play an influential role. Classical methods for characterizing uncertainty may be sufficient when adequate amounts of relevant data are available. Frequently, however, data are neither abundant nor directly relevant, making it necessary to rely to varying degrees on subjective judgment. Since the 1950s, methods to elicit and quantify subjective judgments have been explored but have rarely been applied to the field of environmental exposure assessment. In this phase of the project, seven experts in benzene exposure assessment were selected through a peer nomination process, participated in a 2-day workshop, and were interviewed individually to elicit their judgments about the distributions of residential ambient, residential indoor, and personal air benzene concentrations (6-day integrated average) experienced by both the non-smoking, non-occupationally exposed target and study populations of the US EPA Region V pilot study. Specifically, each expert was asked to characterize, in probabilistic form, the arithmetic means and the 90th percentiles of these distributions. This paper presents the experts' judgments about the concentrations of benzene encountered by the target population. The experts' judgments about levels of benzene in personal air were demonstrative of patterns observed in the judgments about the other distributions. 
They were in closest agreement about their predictions of the mean: with one exception, their best estimates of the mean fell within 7-11 µg/m³, although they exhibited striking differences in the degree of uncertainty expressed. Their estimates of the 90th percentile were more varied, with best estimates ranging from 12 to 26 µg/m³ for all but one expert, and their predictions of the 90th percentile were far more uncertain. The paper demonstrates that coherent subjective judgments can be elicited from exposure assessment scientists and critically examines the challenges and potential benefits of a subjective judgment approach. The results of the second phase of the project, in which measurements from the NHEXAS field study in Region V are used to calibrate the experts' judgments about the benzene exposures in the study population, will be presented in a second paper.
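Turning such elicited judgments into usable distributions typically means fitting a parametric form to the stated summaries. The sketch below fits a lognormal to an assumed elicited mean and 90th percentile; the numbers are loosely in the range the abstract reports, not any expert's actual judgment:

```python
import numpy as np

mean_est = 9.0   # elicited best estimate of the mean, ug/m^3 (assumed)
p90_est = 20.0   # elicited best estimate of the 90th percentile (assumed)
Z90 = 1.2816     # standard normal 90th percentile

# Lognormal(mu, s):  mean = exp(mu + s^2/2),  p90 = exp(mu + Z90 * s).
# Eliminating mu gives a quadratic in s; take the less-dispersed root.
b = 2 * Z90
c = 2 * np.log(mean_est / p90_est)
s = (b - np.sqrt(b * b + 4 * c)) / 2
mu = np.log(mean_est) - s * s / 2

# Check the fit by Monte Carlo.
rng = np.random.default_rng(6)
draws = rng.lognormal(mu, s, size=400_000)
print(f"mean ~ {draws.mean():.1f}, p90 ~ {np.quantile(draws, 0.9):.1f}")
```

Fitting each expert's summaries this way yields a full distribution per expert, which is what later calibration against field measurements (the project's second phase) operates on.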
Affiliation(s)
- K D Walker
- Harvard School of Public Health, Boston, Massachusetts, USA.
29
Gelman A, Krantz DH, Lin C, Price PN. Analysis of Local Decisions Using Hierarchical Modeling, Applied to Home Radon Measurement and Remediation. Stat Sci 1999. [DOI: 10.1214/ss/1009212411] [Citation(s) in RCA: 19] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]