1
Lin YS, O'Mahony JF, van Rosmalen J. A Simple Cost-Effectiveness Model of Screening: An Open-Source Teaching and Research Tool Coded in R. PharmacoEconomics Open 2023. [PMID: 37261616] [DOI: 10.1007/s41669-023-00414-1]
Abstract
Applied cost-effectiveness analysis models are an important tool for assessing health and economic effects of healthcare interventions but are not best suited for illustrating methods. Our objective is to provide a simple, open-source model for the simulation of disease-screening cost-effectiveness for teaching and research purposes. We introduce our model and provide an initial application to examine changes to the efficiency frontier as input parameters vary and to demonstrate face validity. We describe a vectorised, discrete-event simulation of screening in R with an Excel interface to define parameters and inspect principal results. An R Shiny app permits dynamic interpretation of simulation outputs. An example with 8161 screening strategies illustrates how varying the disease sojourn time, treatment effectiveness, and test performance characteristics and costs affects the cost and effectiveness of screening policies. Many of our findings are intuitive and straightforward, such as a reduction in screening costs leading to decreased overall costs and improved cost-effectiveness. Others are less obvious and depend on whether we consider gross outcomes or those net to no screening. For instance, enhanced treatment of symptomatic disease increases gross effectiveness, but reduces the net effectiveness and cost-effectiveness of screening. A lengthening of the preclinical sojourn time has ambiguous effects relative to no screening, as cost-effectiveness improves for some strategies but deteriorates for others. Our simple model offers an accessible platform for methods research and teaching. We hope it will serve as a public good and promote an intuitive understanding of the cost-effectiveness of screening.
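The efficiency-frontier behaviour this abstract examines can be illustrated with a short sketch: given (cost, effect) pairs for a handful of strategies, drop strongly dominated options, then remove extendedly dominated ones so that incremental cost-effectiveness ratios (ICERs) increase along the frontier. The function name and the numbers below are hypothetical, not taken from the paper's 8161-strategy example.

```python
def efficiency_frontier(strategies):
    """Return the strategies on the cost-effectiveness efficiency frontier.

    strategies: list of (name, cost, effect) tuples.  First, discard any
    strategy that is at least as costly and no more effective than a cheaper
    one (strong dominance).  Then discard strategies whose incoming ICER is
    not lower than their outgoing ICER (extended dominance), so ICERs
    increase monotonically along the frontier.
    """
    pts = sorted(strategies, key=lambda s: (s[1], -s[2]))  # by cost, then effect
    frontier = []
    best_effect = float("-inf")
    for s in pts:
        if s[2] > best_effect:  # strictly more effective than anything cheaper
            frontier.append(s)
            best_effect = s[2]
    changed = True
    while changed:  # repeatedly remove extendedly dominated strategies
        changed = False
        for i in range(1, len(frontier) - 1):
            lo, mid, hi = frontier[i - 1], frontier[i], frontier[i + 1]
            icer_in = (mid[1] - lo[1]) / (mid[2] - lo[2])
            icer_out = (hi[1] - mid[1]) / (hi[2] - mid[2])
            if icer_in >= icer_out:
                del frontier[i]
                changed = True
                break
    return frontier

# Hypothetical strategies: (name, cost, QALYs).
strategies = [
    ("no screening", 0, 10.00),
    ("strategy A", 100, 10.50),
    ("strategy B", 150, 10.60),
    ("strategy C", 300, 10.55),  # costlier and less effective than B: dominated
]
frontier = efficiency_frontier(strategies)
```

Re-running this while perturbing one input (say, screening cost) shows how strategies enter and leave the frontier, which is the kind of comparative exercise the abstract describes.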
Affiliation(s)
- Yi-Shu Lin
- Centre for Health Policy and Management, Trinity College Dublin, 2-4 Foster Place, Dublin, D02 T253, Ireland
- James F O'Mahony
- Centre for Health Policy and Management, Trinity College Dublin, 2-4 Foster Place, Dublin, D02 T253, Ireland
- Joost van Rosmalen
- Department of Biostatistics, Erasmus Medical Centre, Rotterdam, The Netherlands
- Department of Epidemiology, Erasmus Medical Centre, Rotterdam, The Netherlands
2
Microsimulation Model Calibration with Approximate Bayesian Computation in R: A Tutorial. Med Decis Making 2022; 42:557-570. [DOI: 10.1177/0272989x221085569]
Abstract
Mathematical health policy models, including microsimulation models (MSMs), are widely used to simulate complex processes and predict outcomes consistent with available data. Calibration is a method to estimate parameter values such that model predictions are similar to observed outcomes of interest. Bayesian calibration methods are popular among the available calibration techniques because of their strong theoretical basis, their flexibility to incorporate prior beliefs, and their ability to draw values from the posterior distribution of model parameters and hence to characterize and evaluate parameter uncertainty in model outcomes. Approximate Bayesian computation (ABC) is an approach to calibrate complex models in which the likelihood is intractable, focusing on measuring the difference between the simulated model predictions and outcomes of interest in observed data. Although ABC methods are increasingly being used, there is limited practical guidance in the medical decision-making literature on approaches to implement ABC to calibrate MSMs. In this tutorial, we describe the Bayesian calibration framework, introduce the ABC approach, and provide step-by-step guidance for implementing an ABC algorithm to calibrate MSMs, using 2 case examples based on a microsimulation model for dementia. We also provide the R code for applying these methods.
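A minimal sketch of the ABC idea this tutorial teaches, written as plain rejection sampling in Python rather than the authors' R code: draw a parameter from the prior, run the simulator, and keep the draw when the simulated summary falls within a tolerance of the observed target. The toy simulator, prior, and tolerance are all assumptions for illustration.

```python
import random

def abc_rejection(simulate, observed, prior_sample, n_draws=20000, tol=0.5):
    """Basic ABC rejection sampler for a model with an intractable likelihood.

    Keeps prior draws theta whose simulated summary statistic lands within
    `tol` of the observed summary; the accepted draws approximate the
    posterior distribution of theta.
    """
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(simulate(theta) - observed) <= tol:
            accepted.append(theta)
    return accepted

# Toy example: calibrate the location parameter of a noisy simulator.
random.seed(1)
observed = 3.0
post = abc_rejection(
    simulate=lambda th: th + random.gauss(0, 1),   # stand-in "microsimulation"
    observed=observed,
    prior_sample=lambda: random.uniform(-10, 10),  # vague prior
)
```

In a real MSM the `simulate` call would run the full event-history simulation and return a summary such as an incidence rate, and the distance would typically combine several calibration targets.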
3
DeYoreo M, Rutter CM, Ozik J, Collier N. Sequentially calibrating a Bayesian microsimulation model to incorporate new information and assumptions. BMC Med Inform Decis Mak 2022; 22:12. [PMID: 35022005] [PMCID: PMC8756687] [DOI: 10.1186/s12911-021-01726-0]
Abstract
BACKGROUND Microsimulation models are mathematical models that simulate event histories for individual members of a population. They are useful for policy decisions because they simulate a large number of individuals from an idealized population, with features that change over time, and the resulting event histories can be summarized to describe key population-level outcomes. Model calibration is the process of incorporating evidence into the model. Calibrated models can be used to make predictions about population trends in disease outcomes and effectiveness of interventions, but calibration can be challenging and computationally expensive. METHODS This paper develops a technique for sequentially updating models to take full advantage of earlier calibration results, to ultimately speed up the calibration process. A Bayesian approach to calibration is used because it combines different sources of evidence and enables uncertainty quantification, which is appealing for decision-making. We develop this method in order to re-calibrate a microsimulation model for the natural history of colorectal cancer to include new targets that better inform the time from initiation of preclinical cancer to presentation with clinical cancer (sojourn time), because model exploration and validation revealed that more information was needed on sojourn time, and that the predicted percentage of patients with cancers detected via colonoscopy screening was too low. RESULTS The sequential approach to calibration was more efficient than recalibrating the model from scratch. Incorporating new information on the percentage of patients with cancers detected upon screening changed the estimated sojourn time parameters significantly, increasing the estimated mean sojourn time for cancers in the colon and rectum and yielding more valid results.
CONCLUSIONS A sequential approach to recalibration can be used to efficiently recalibrate a microsimulation model when new information becomes available that requires the original targets to be supplemented with additional targets.
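The sequential idea, reusing an earlier calibration result instead of restarting from a vague prior when a new target arrives, can be sketched in a rejection-sampling style. Everything here is schematic: the simulators, targets, and tolerances are made up, not the colorectal cancer model.

```python
import random

def calibrate(prior_sample, simulate, observed, tol, n=20000):
    """One rejection-style calibration round: keep prior draws whose
    simulated output falls within `tol` of the observed target."""
    return [th for th in (prior_sample() for _ in range(n))
            if abs(simulate(th) - observed) <= tol]

random.seed(7)

# Round 1: calibrate a parameter to the original target from a vague prior.
stage1 = calibrate(prior_sample=lambda: random.uniform(0, 10),
                   simulate=lambda th: th + random.gauss(0, 1),
                   observed=6.0, tol=1.0)

# Round 2 (sequential update): when a new target becomes available, resample
# the stage-1 posterior rather than the vague prior, and calibrate to the
# new target. This concentrates effort where the parameter is plausible.
stage2 = calibrate(prior_sample=lambda: random.choice(stage1),
                   simulate=lambda th: 2 * th + random.gauss(0, 1),
                   observed=12.0, tol=1.0)
```

The efficiency gain the paper reports comes from exactly this reuse: round 2 proposes from a distribution already consistent with the earlier targets, so far fewer simulator runs are wasted.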
Affiliation(s)
- Maria DeYoreo
- RAND Corporation, 1776 Main St., Santa Monica, CA, 90401, USA
- Jonathan Ozik
- Argonne National Laboratory, Building 221, 9700 South Cass Avenue, Argonne, IL, 60439, USA
- Nicholson Collier
- Argonne National Laboratory, Building 221, 9700 South Cass Avenue, Argonne, IL, 60439, USA
4
Piera-Jiménez J, Winters M, Broers E, Valero-Bover D, Habibovic M, Widdershoven JWMG, Folkvord F, Lupiáñez-Villanueva F. Changing the Health Behavior of Patients With Cardiovascular Disease Through an Electronic Health Intervention in Three Different Countries: Cost-Effectiveness Study in the Do Cardiac Health: Advanced New Generation Ecosystem (Do CHANGE) 2 Randomized Controlled Trial. J Med Internet Res 2020; 22:e17351. [PMID: 32720908] [PMCID: PMC7420510] [DOI: 10.2196/17351]
Abstract
BACKGROUND During the last few decades, preventing the development of cardiovascular disease has become a mainstay for reducing cardiovascular morbidity and mortality. It has been suggested that interventions should focus more on committed approaches of self-care, such as electronic health techniques. OBJECTIVE This study aimed to provide evidence to understand the financial consequences of implementing the "Do Cardiac Health: Advanced New Generation Ecosystem" (Do CHANGE 2) intervention, which was evaluated in a multisite randomized controlled trial to change the health behavior of patients with cardiovascular disease. METHODS The cost-effectiveness analysis of the Do CHANGE 2 intervention was performed with the Monitoring and Assessment Framework for the European Innovation Partnership on Active and Healthy Ageing tool, based on a Markov model of five health states. The following two types of costs were considered for both study groups: (1) health care costs (ie, costs associated with the time spent by health care professionals on service provision, including consultations, and associated unplanned hospitalizations, etc) and (2) societal costs (ie, costs attributed to the time spent by patients and informal caregivers on care activities). RESULTS The Do CHANGE 2 intervention was less costly in Spain (incremental cost was -€2514.90) and more costly in the Netherlands and Taiwan (incremental costs were €1373.59 and €1062.54, respectively). Compared with treatment as usual, the effectiveness of the Do CHANGE 2 program in terms of an increase in quality-adjusted life-year gains was slightly higher in the Netherlands and lower in Spain and Taiwan. CONCLUSIONS In general, we found that the incremental cost-effectiveness ratio strongly varied depending on the country where the intervention was applied. 
The Do CHANGE 2 intervention showed a positive cost-effectiveness ratio only when implemented in Spain, indicating that it saved financial costs in relation to the effect of the intervention. TRIAL REGISTRATION ClinicalTrials.gov NCT03178305; https://clinicaltrials.gov/ct2/show/NCT03178305.
Affiliation(s)
- Jordi Piera-Jiménez
- Open Evidence Research Group, Universitat Oberta de Catalunya, Barcelona, Spain
- Department of R&D, Badalona Serveis Assistencials, Badalona, Spain
- Eva Broers
- Department of Medical and Clinical Psychology, Tilburg University, Tilburg, Netherlands
- Department of Cardiology, Elisabeth-TweeSteden Hospital, Tilburg, Netherlands
- Mirela Habibovic
- Department of Medical and Clinical Psychology, Tilburg University, Tilburg, Netherlands
- Department of Cardiology, Elisabeth-TweeSteden Hospital, Tilburg, Netherlands
- Jos W M G Widdershoven
- Department of Medical and Clinical Psychology, Tilburg University, Tilburg, Netherlands
- Department of Cardiology, Elisabeth-TweeSteden Hospital, Tilburg, Netherlands
- Frans Folkvord
- Open Evidence Research Group, Universitat Oberta de Catalunya, Barcelona, Spain
- Department of Communication and Cognition, Tilburg School of Humanities and Digital Sciences, Tilburg University, Tilburg, Netherlands
- Francisco Lupiáñez-Villanueva
- Open Evidence Research Group, Universitat Oberta de Catalunya, Barcelona, Spain
- Department of Information and Communication Sciences, Universitat Oberta de Catalunya, Barcelona, Spain
5
González-Mariño MA. [Cost-effectiveness of risk-reducing salpingo-oophorectomy in cases of BRCA1 gene mutation in Colombia]. Rev Salud Publica (Bogota) 2018; 20:232-236. [PMID: 30570007] [DOI: 10.15446/rsap.v20n2.64866]
Abstract
OBJECTIVE To assess the usefulness of risk-reducing salpingo-oophorectomy in cases with mutation of the BRCA1 gene in Colombia. MATERIAL AND METHODS Cost-effectiveness analysis incorporating three strategies: (a) screening tests for breast and ovarian cancer; (b) risk-reducing surgery in the fallopian tubes and ovaries; and (c) risk-reducing surgery in the fallopian tubes and ovaries with bilateral mastectomy. The outcome is evaluated as the gain in years of survival. RESULTS The cohort with risk-reducing surgery in the fallopian tubes and ovaries plus bilateral mastectomy has the highest gain, 13 years, while risk-reducing surgery in the fallopian tubes and ovaries alone gains 4.95 years with respect to the follow-up group. CONCLUSIONS The three options evaluated are acceptable, but the one with the greatest gain in survival is the combination of risk-reducing surgery in the fallopian tubes and ovaries with bilateral mastectomy.
Affiliation(s)
- Mario A González-Mariño
- MD, MSc in Senology and Breast Pathology, PhD in Preventive Medicine and Public Health, Facultad de Medicina, Universidad Nacional de Colombia, Bogotá, Colombia
6
Custer B, Johnson ES, Sullivan SD, Hazlet TK, Ramsey SD, Murphy EL, Busch MP. Community Blood Supply Model: Development of a New Model to Assess the Safety, Sufficiency, and Cost of the Blood Supply. Med Decis Making 2005; 25:571-82. [PMID: 16160212] [DOI: 10.1177/0272989x05280557]
Abstract
Background. Through a combination of predonation donor screening and donated unit testing, the blood supply is safer than ever. However, as a result of increasingly stringent screening measures, one of the greatest threats may be an insufficient supply. The balance between safety and adequacy of the blood supply has not received enough attention. Study Design and Methods. The authors developed a model to allow for empirical investigation of the determinants of a safe and sufficient supply. The model is a cohort simulation of allogeneic whole-blood donation, with the population of presenting donors stratified into 8 age and gender groups because the probability of donor and donation deferral varies by these characteristics. Parameters are estimated from year 2000 Blood Centers of the Pacific (BCP) data. The model includes cost parameters, which were estimated using BCP expenditure data. The main outcomes are the number of transfusable units of blood and the unit cost of procurement. Results. The model tracks the production of a supply of blood, highlighting the influence of demographic characteristics, predonation deferral, underweight collection of blood units, and associated costs. The authors sought to establish model validity by showing that modeled results closely mimic the outcomes and costs observed by blood bank administrators. Conclusion. The model was developed to evaluate blood safety and policy decisions; it can be used to assess the impact of predonation deferrals, such as expanded European travel deferral for variant Creutzfeldt-Jakob disease, or the impact of new testing strategies, such as nucleic acid testing for West Nile virus.
Affiliation(s)
- Brian Custer
- Pharmaceutical Outcomes Research and Policy Program, University of Washington, Seattle, USA
7
Applying evidence from economic evaluations to translate cancer survivorship research into care. J Cancer Surviv 2015; 9:560-6. [DOI: 10.1007/s11764-015-0433-3]
8
Prolonged postoperative venous thrombo-embolism prophylaxis is cost-effective in advanced ovarian cancer patients. Gynecol Oncol 2012; 127:631-7. [DOI: 10.1016/j.ygyno.2012.08.032]
9
Ye W, Isaman DJ, Barhak J. Use of Secondary Data to Estimate Instantaneous Model Parameters of Diabetic Heart Disease: Lemonade Method. Information Fusion 2012; 13:137-145. [PMID: 22563307] [PMCID: PMC3341173] [DOI: 10.1016/j.inffus.2010.08.003]
Abstract
With the increasing burden of chronic diseases on the health care system, Markov-type models are becoming popular to predict the long-term outcomes of early intervention and to guide disease management. However, statisticians have not been actively involved in the development of these models. Typically, the models are developed by using secondary data analysis to find a single "best" study to estimate each transition in the model. However, due to the nature of secondary data analysis, there frequently are discrepancies between the theoretical model and the design of the studies being used. This paper illustrates a likelihood approach to correctly model the design of clinical studies under the conditions where 1) the theoretical model may include an instantaneous state of distinct interest to the researchers, and 2) the study design may be such that study data cannot be used to estimate a single parameter in the theoretical model of interest. For example, a study may ignore intermediary stages of disease. Using our approach, not only can we accommodate the two conditions above, but more than one study may be used to estimate model parameters. In the spirit of "If life gives you lemons, make lemonade", we call this method the "Lemonade Method". Simulation studies are carried out to evaluate the finite sample property of this method. In addition, the method is demonstrated through application to a model of heart disease in diabetes.
Affiliation(s)
- Wen Ye
- Department of Biostatistics, School of Public Health, University of Michigan, 1415 Washington Heights, Ann Arbor, Michigan 48109-2029
10
Abstract
BACKGROUND Microsimulation models (MSMs) for health outcomes simulate individual event histories associated with key components of a disease process; these simulated life histories can be aggregated to estimate population-level effects of treatment on disease outcomes and the comparative effectiveness of treatments. Although MSMs are used to address a wide range of research questions, methodological improvements in MSM approaches have been slowed by the lack of communication among modelers. In addition, there are few resources to guide individuals who may wish to use MSM projections to inform decisions. METHODS This article presents an overview of microsimulation modeling, focusing on the development and application of MSMs for health policy questions. The authors discuss MSM goals, overall components of MSMs, methods for selecting MSM parameters to reproduce observed or expected results (calibration), methods for MSM checking (validation), and issues related to reporting and interpreting MSM findings (sensitivity analyses, reporting of variability, and model transparency). CONCLUSIONS MSMs are increasingly being used to provide information to guide health policy decisions. This increased use brings with it the need for both better understanding of MSMs by policy researchers, and continued improvement in methods for developing and applying MSMs.
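The core MSM mechanic described here, simulating individual event histories and aggregating them into population-level outcomes, can be sketched with a toy two-state disease model. All state names and transition probabilities are hypothetical, chosen only to show the pattern.

```python
import random

def simulate_history(p_onset, p_death_sick, p_death_well, max_years=50):
    """Simulate one individual's yearly event history: Well -> Sick -> Dead.

    Returns (years_lived, ever_sick).  Each year, death is checked first,
    then (if still Well) disease onset.
    """
    state, ever_sick = "Well", False
    for year in range(max_years):
        if state == "Well":
            if random.random() < p_death_well:
                return year, ever_sick
            if random.random() < p_onset:
                state, ever_sick = "Sick", True
        else:  # Sick: higher mortality
            if random.random() < p_death_sick:
                return year, ever_sick
    return max_years, ever_sick

# Aggregate many simulated life histories into population-level outcomes.
random.seed(42)
histories = [simulate_history(p_onset=0.02, p_death_sick=0.10, p_death_well=0.01)
             for _ in range(10000)]
mean_life = sum(y for y, _ in histories) / len(histories)
disease_prev = sum(sick for _, sick in histories) / len(histories)
```

A policy question is then answered by rerunning the same population under an intervention scenario (for example, a lower `p_death_sick` to represent treatment) and comparing the aggregated outcomes.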
Affiliation(s)
- Carolyn M Rutter
- Biostatistics Unit, Group Health Research Institute, Seattle, WA, USA, and Department of Biostatistics, University of Washington School of Public Health and Community Medicine, Seattle, WA, USA
- Alan M Zaslavsky
- Department of Health Care Policy, Harvard Medical School, Boston, MA, USA
- Eric J Feuer
- Statistical Research and Applications Branch, Division of Cancer Control and Population Sciences, National Cancer Institute, Bethesda, MD, USA
11
Goldhaber-Fiebert JD, Stout NK, Goldie SJ. Empirically evaluating decision-analytic models. Value Health 2010; 13:667-674. [PMID: 20230547] [DOI: 10.1111/j.1524-4733.2010.00698.x]
Abstract
OBJECTIVES Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. METHODS We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5 to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. RESULTS The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles with 67 meeting inclusion criteria and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). CONCLUSIONS To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.
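The consistency metric this study uses, model uncertainty ranges overlapping study confidence intervals, is simple to state in code. The two checks below reuse the 30-year cumulative-risk figures quoted in the abstract; the function name is illustrative.

```python
def consistent(model_range, study_ci):
    """Model output is 'consistent' with a study outcome when the model's
    uncertainty range overlaps the study's confidence interval."""
    (m_lo, m_hi), (s_lo, s_hi) = model_range, study_ci
    return m_lo <= s_hi and s_lo <= m_hi

# 30-year cumulative risk of invasive cancer (%), model range vs. study CI:
assert consistent((30.9, 49.7), (28.4, 48.3))  # inadequately treated women
assert consistent((0.7, 1.3), (0.4, 3.3))      # appropriately treated women
```

In the reporting format the authors propose, one such check would be stated per matched outcome, alongside the baseline characteristics and follow-up used for the comparison.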
Affiliation(s)
- Jeremy D Goldhaber-Fiebert
- Stanford Health Policy, Centers for Health Policy and Primary Care and Outcomes Research, Department of Medicine, Stanford University School of Medicine, Stanford, CA, USA
12
Abstract
Microsimulation models that describe disease processes synthesize information from multiple sources and can be used to estimate the effects of screening and treatment on cancer incidence and mortality at a population level. These models are characterized by simulation of individual event histories for an idealized population of interest. Microsimulation models are complex and invariably include parameters that are not well informed by existing data. Therefore, a key component of model development is the choice of parameter values. Microsimulation model parameter values are selected to reproduce expected or known results through the process of model calibration. Calibration may be done by perturbing model parameters one at a time or by using a search algorithm. As an alternative, we propose a Bayesian method to calibrate microsimulation models that uses Markov chain Monte Carlo. We show that this approach converges to the target distribution and use a simulation study to demonstrate its finite-sample performance. Although computationally intensive, this approach has several advantages over previously proposed methods, including the use of statistical criteria to select parameter values, simultaneous calibration of multiple parameters to multiple data sources, incorporation of information via prior distributions, description of parameter identifiability, and the ability to obtain interval estimates of model parameters. We develop a microsimulation model for colorectal cancer and use our proposed method to calibrate model parameters. The microsimulation model provides a good fit to the calibration data. We find evidence that some parameters are identified primarily through prior distributions. Our results underscore the need to incorporate multiple sources of variability (i.e., due to calibration data, unknown parameters, and estimated parameters and predicted values) when calibrating and applying microsimulation models.
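A minimal sketch of calibration by Markov chain Monte Carlo in the spirit this abstract describes: a random-walk Metropolis sampler targeting prior x likelihood for a single parameter. The one-parameter "model", the target value, and the tuning constants are all illustrative assumptions, not the paper's colorectal cancer model.

```python
import math
import random

def metropolis_calibrate(loglik, logprior, theta0, n_iter=5000, step=0.5):
    """Random-walk Metropolis: sample parameter values in proportion to
    prior x likelihood of the calibration data."""
    theta, samples = theta0, []
    lp = loglik(theta) + logprior(theta)
    for _ in range(n_iter):
        prop = theta + random.gauss(0, step)
        lp_prop = loglik(prop) + logprior(prop)
        if math.log(random.random()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

random.seed(0)
# Toy calibration target: an observed rate of 0.30 with standard error 0.05,
# under a flat prior restricting the parameter to (0, 1).
obs, se = 0.30, 0.05
samples = metropolis_calibrate(
    loglik=lambda th: -0.5 * ((th - obs) / se) ** 2,
    logprior=lambda th: 0.0 if 0.0 < th < 1.0 else -math.inf,
    theta0=0.5,
)
post_mean = sum(samples[1000:]) / len(samples[1000:])  # discard burn-in
```

In the full method, `loglik` would compare microsimulation output against several calibration data sources at once, which is where the advantages the abstract lists (joint calibration, identifiability checks, interval estimates) come from.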
Affiliation(s)
- Carolyn M. Rutter
- Senior Investigator, Group Health Center for Health Studies, Seattle, WA 98101, and Affiliate Professor, Departments of Biostatistics and Health Services, University of Washington, WA 98195
- Diana L. Miglioretti
- Senior Investigator, Group Health Center for Health Studies, Seattle, WA 98101, and Affiliate Associate Professor, Department of Biostatistics, University of Washington, WA 98195
- James E. Savarino
- Programmer, Group Health Center for Health Studies, Seattle, WA 98101
13
Perkins JD, Halldorson JB, Bakthavatsalam R, Fix OK, Carithers RL, Reyes JD. Should liver transplantation in patients with model for end-stage liver disease scores ≤14 be avoided? A decision analysis approach. Liver Transpl 2009; 15:242-54. [PMID: 19177441] [DOI: 10.1002/lt.21703]
Abstract
Studies have shown that liver transplantation offers no survival benefits to patients with Model for End-Stage Liver Disease (MELD) scores ≤14 in comparison with remaining on the waitlist. The consensus of a 2003 transplant community national conference was that a minimum MELD score should be required for placement on the liver waitlist, but no minimum listing national policy was enacted at that time. We developed a Markov microsimulation model to compare results under the present US liver allocation policy with outcomes under a "Rule 14" policy of barring patients with a MELD score of ≤14 from the waitlist or transplantation. For probabilities in the microsimulation model, we used data on all adult patients (≥18 years) listed for or undergoing primary liver transplantation in the United States for chronic liver disease from 1/1/2003 through 12/31/2007 with follow-up until 2/1/2008. The "Rule 14" policy gave a 3% improvement in overall patient survival over the present system at 1, 2, 3, and 4 years and predicted a 13% decrease in overall waitlist time for patients with MELD scores of 15 to 40. Patients with the greatest benefit from a "Rule 14" policy were those with MELD scores of 6 to 10, for whom a 17% survival advantage was predicted from waiting on the list versus undergoing transplantation. Our analysis supports changing the national liver allocation policy to not allow liver transplantation for patients with MELD ≤14.
Affiliation(s)
- James D Perkins
- Division of Transplantation, Department of Surgery, University of Washington, Seattle, WA 98195, USA
14
Stout NK, Knudsen AB, Kong CY, McMahon PM, Gazelle GS. Calibration methods used in cancer simulation models and suggested reporting guidelines. PharmacoEconomics 2009; 27:533-45. [PMID: 19663525] [PMCID: PMC2787446] [DOI: 10.2165/11314830-000000000-00000]
Abstract
Increasingly, computer simulation models are used for economic and policy evaluation in cancer prevention and control. A model's predictions of key outcomes, such as screening effectiveness, depend on the values of unobservable natural history parameters. Calibration is the process of determining the values of unobservable parameters by constraining model output to replicate observed data. Because there are many approaches for model calibration and little consensus on best practices, we surveyed the literature to catalogue the use and reporting of these methods in cancer simulation models. We conducted a MEDLINE search (1980 through 2006) for articles on cancer-screening models and supplemented search results with articles from our personal reference databases. For each article, two authors independently abstracted pre-determined items using a standard form. Data items included cancer site, model type, methods used for determination of unobservable parameter values and description of any calibration protocol. All authors reached consensus on items of disagreement. Reviews and non-cancer models were excluded. Articles describing analytical models, which estimate parameters with statistical approaches (e.g. maximum likelihood) were catalogued separately. Models that included unobservable parameters were analysed and classified by whether calibration methods were reported and if so, the methods used. The review process yielded 154 articles that met our inclusion criteria and, of these, we concluded that 131 may have used calibration methods to determine model parameters. Although the term 'calibration' was not always used, descriptions of calibration or 'model fitting' were found in 50% (n = 66) of the articles, with an additional 16% (n = 21) providing a reference to methods. Calibration target data were identified in nearly all of these articles. 
Other methodological details, such as the goodness-of-fit metric, were discussed in 54% (n = 47 of 87) of the articles reporting calibration methods, while few details were provided on the algorithms used to search the parameter space. Our review shows that the use of cancer simulation modelling is increasing, although thorough descriptions of calibration procedures are rare in the published literature for these models. Calibration is a key component of model development and is central to the validity and credibility of subsequent analyses and inferences drawn from model predictions. To aid peer-review and facilitate discussion of modelling methods, we propose a standardized Calibration Reporting Checklist for model documentation.
Collapse
Affiliation(s)
- Natasha K Stout
- Department of Ambulatory Care and Prevention, Harvard Medical School/Harvard Pilgrim Health Care, Boston, Massachusetts 02215, USA.
15
McMahon PM, Kong CY, Weinstein MC, Tramontano AC, Cipriano LE, Johnson BE, Weeks JC, Gazelle GS. Adopting helical CT screening for lung cancer: potential health consequences during a 15-year period. Cancer 2008; 113:3440-9. [PMID: 18988293 PMCID: PMC2782879 DOI: 10.1002/cncr.23962] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/20/2023]
Abstract
BACKGROUND Simulation modeling can synthesize data from single-arm studies of lung cancer screening and tumor registries to investigate computed tomography (CT) screening. This study estimated changes in lung cancer outcomes through 2005, had chest CT screening been introduced in 1990. METHODS Hypothetical individuals with smoking histories representative of 6 US cohorts (white males and females aged 50, 60, and 70 years in 1990) were simulated in the Lung Cancer Policy Model, a comprehensive patient-level simulation model of lung cancer development, screening, and treatment. A no screening scenario corresponded to observed outcomes. We simulated 3 screening scenarios in current or former smokers with ≥20 pack-years as follows: a 1-time screen in 1990; and annual and twice-annual screening beginning in 1990 and ending in 2005. Main outcomes were days of life between 1990 and 2005 and life expectancy in 1990 (estimated by simulating life histories past 2005). RESULTS All screening scenarios yielded reductions (compared with no screening) in lung cancer-specific mortality by 2005, with larger reductions predicted for more frequent screening. Compared with no screening, annual screening of ever-smokers with at least 20 pack-years of cigarette exposure provided ever-smokers with an additional 11 to 33 days of life by 2005, or an additional 3-10 weeks of (undiscounted) life expectancy. In sensitivity analyses, the largest effects on gains from annual screening were due to reductions in screening adherence and increased smoking cessation. CONCLUSIONS The adoption of CT screening, had it been available in 1990, might have resulted in a modest gain in life expectancy.
Affiliation(s)
- Pamela M McMahon
- Department of Radiology, Massachusetts General Hospital, Boston, MA, USA.
16
McMahon PM, Kong CY, Johnson BE, Weinstein MC, Weeks JC, Kuntz KM, Shepard JAO, Swensen SJ, Gazelle GS. Estimating long-term effectiveness of lung cancer screening in the Mayo CT screening study. Radiology 2008; 248:278-87. [PMID: 18458247 DOI: 10.1148/radiol.2481071446] [Citation(s) in RCA: 78] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/18/2023]
Abstract
PURPOSE To use individual-level data provided from the single-arm study of helical computed tomographic (CT) screening at the Mayo Clinic (Rochester, Minn) to estimate the long-term effectiveness of screening in Mayo study participants and to compare estimates from an existing lung cancer simulation model with estimates from a different modeling approach that used the same data. MATERIALS AND METHODS The study was approved by institutional review boards and was HIPAA compliant. Deidentified individual-level data from participants (1520 current or former smokers aged 50-85 years) in the Mayo Clinic helical CT screening study were used to populate the Lung Cancer Policy Model, a comprehensive microsimulation model of lung cancer development, screening findings, treatment results, and long-term outcomes. The model predicted diagnosed cases of lung cancer and deaths per simulated study arm (five annual screening examinations vs no screening). Main outcome measures were predicted changes in lung cancer-specific and all-cause mortality as functions of follow-up time after simulated enrollment and randomization. RESULTS At 6-year follow-up, the screening arm had an estimated 37% relative increase in lung cancer detection, compared with the control arm. At 15-year follow-up, five annual screening examinations yielded a 9% relative increase in lung cancer detection. The relative reduction in cumulative lung cancer-specific mortality from five annual screening examinations was 28% at 6-year follow-up (15% at 15 years). The relative reduction in cumulative all-cause mortality from five annual screening examinations was 4% at 6-year follow-up (2% at 15 years). CONCLUSION Screening may reduce lung cancer-specific mortality but may offer a smaller reduction in overall mortality because of increased competing mortality risks associated with smoking.
Affiliation(s)
- Pamela M McMahon
- Institute for Technology Assessment, Massachusetts General Hospital, 101 Merrimac St, 10th Floor, Boston, MA 02114, USA.
17
Akushevich I, Kravchenko JS, Manton KG. Health-based population forecasting: effects of smoking on mortality and fertility. RISK ANALYSIS : AN OFFICIAL PUBLICATION OF THE SOCIETY FOR RISK ANALYSIS 2007; 27:467-82. [PMID: 17511712 DOI: 10.1111/j.1539-6924.2007.00898.x] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/15/2023]
Abstract
A microsimulation model, allowing one to forecast short- and long-term population changes conditional on the prevalence of a risk factor in a population, is presented. In this model, population changes result from the aggregation of changes in individual event histories, which, in turn, result from mortality and infertility rates recalculated in accordance with their known relative risks in population groups exposed to a risk factor. Smoking, being the most widespread and influential preventable public health risk factor, is chosen to demonstrate the abilities of the model to forecast the population effects of different hypothetical smoking prevalences. The demographic and population health effects on 20-, 50-, and 100-year projections under the current, hypothetically doubled, and hypothetically halved smoking prevalence are analyzed in detail. The model predicts an increase in life expectancy (0.99 years for males and 0.64 years for females) and an increase in population size (2.2-7.5%, depending on the age group) if smoking prevalence is reduced by half. Sensitivity analyses of all findings are performed. The generalization of the model to account for multiple risk factors (e.g., the simultaneous effects of alcohol consumption, obesity, and smoking) and effects on medical expenditures are discussed.
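The core mechanism described here — recalculating mortality rates by a relative risk for an exposed subpopulation and observing the effect on life expectancy — can be illustrated with a minimal period life table. The hazard schedule and the relative risk below are hypothetical, not the study's inputs.

```python
# Period life expectancy at age 0 from annual death probabilities qx,
# assuming deaths occur mid-year on average.
def life_expectancy(qx):
    survivors, total_years = 1.0, 0.0
    for q in qx:
        deaths = survivors * q
        total_years += survivors - deaths / 2
        survivors -= deaths
    return total_years

# Toy baseline hazard rising exponentially with age (Gompertz-like),
# truncated at age 110. Illustrative only; not real vital statistics.
baseline = [min(1.0, 0.0005 * 1.09 ** age) for age in range(110)]

# Recalculate mortality under a relative risk, as the microsimulation does
# for a risk-factor-exposed group (RR > 1) or after cessation (RR < 1).
def with_rr(qx, rr):
    return [min(1.0, q * rr) for q in qx]

le_base = life_expectancy(baseline)
le_exposed = life_expectancy(with_rr(baseline, 1.5))
print(round(le_base - le_exposed, 2))  # life-years lost to exposure
```

A full microsimulation would sample individual event histories (including fertility) rather than aggregate a deterministic life table, but the rate-recalculation step is the same.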
Affiliation(s)
- Igor Akushevich
- Center for Demographic Studies, Duke University, Durham, NC 27708-0408, USA.
18
Hall WD, Lucke J. Assessing the impact of prescribed medicines on health outcomes. AUSTRALIA AND NEW ZEALAND HEALTH POLICY 2007; 4:1. [PMID: 17300734 PMCID: PMC1810306 DOI: 10.1186/1743-8462-4-1] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 08/25/2006] [Accepted: 02/15/2007] [Indexed: 11/22/2022]
Abstract
This paper reviews methods that can be used to assess the impact of medicine use on population health outcomes. In the absence of a gold standard, we argue that a convergence of evidence from different types of studies using multiple methods with independent imperfections provides the best basis for attributing improvements in health outcomes to the use of medicines. The major requirements are: good evidence that a safe and effective medicine is being appropriately prescribed; covariation between medicine use and improved health outcomes; and being able to discount alternative explanations of the covariation (via covariate adjustment, propensity analyses and sensitivity analyses), so that medicine use is the most plausible explanation of the improved health outcomes. The strongest possible evidence would be provided by the coherence of the following types of evidence: (1) individual linked data showing that patients are prescribed the medicine, there are reasonable levels of patient compliance, and there is a relationship between medicine use and health improvements that is not explained by other factors; and (2) ecological evidence of improvements in these health outcomes in the population in which the medicine is used. Confidence in these inferences would be increased by: the replication of these results in comparable countries; consistent trends in population vital statistics in countries that have introduced the medicine; and epidemiological modelling indicating that changes observed in population health outcomes are plausible given the epidemiology of the condition being treated.
Affiliation(s)
- Wayne D Hall
- School of Population Health, University of Queensland, Herston QLD, 4006, Australia
- Population Health and Uses of Medicines Unit, University of New South Wales, Sydney, Australia
- Jayne Lucke
- School of Population Health, University of Queensland, Herston QLD, 4006, Australia
19
Mueller E, Maxion-Bergemann S, Gultyaev D, Walzer S, Freemantle N, Mathieu C, Bolinder B, Gerber R, Kvasz M, Bergemann R. Development and validation of the Economic Assessment of Glycemic Control and Long-Term Effects of diabetes (EAGLE) model. Diabetes Technol Ther 2006; 8:219-36. [PMID: 16734551 DOI: 10.1089/dia.2006.8.219] [Citation(s) in RCA: 33] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
BACKGROUND The Economic Assessment of Glycemic control and Long-term Effects of diabetes (EAGLE) model was developed to provide a flexible and comprehensive tool for the simulation of the long-term effects of diabetes treatment and related costs in type 1 and type 2 diabetes. METHODS EAGLE simulations are based on risk equations, which were developed using published data from several large studies including the Diabetes Control and Complications Trial, the United Kingdom Prospective Diabetes Study, and the Wisconsin Epidemiological Study of Diabetic Retinopathy. Risk equations for the probability of complications (including hypoglycemia, retinopathy, macular edema, end-stage renal disease, neuropathy, diabetic foot syndrome, myocardial infarction, and stroke) were based on regression analyses, using linear, exponential, and quadratic regression formulae. Subsequent cost calculations are made from the simulated event rates. Internal validation of the EAGLE model was completed by comparing simulated event rates with the published event rates used as the basis for the model. RESULTS EAGLE provides microsimulations of virtual patient cohorts for type 1 and type 2 diabetes over n years in 1-year cycles. Complications include microvascular and macrovascular events and death, which are calculated over time as cumulative incidences. Glycosylated hemoglobin levels over time are simulated in relation to treatment regimen. Internal validation demonstrated that each mean event rate simulated by EAGLE overlapped with the published mean event rate (within a range of ±10%). CONCLUSIONS The EAGLE model is an evidence-based, internally valid tool for the assessment of the long-term effects of diabetes treatment and related costs.
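The simulation pattern EAGLE uses — a patient-level loop in 1-year cycles, with an annual complication probability supplied by a risk equation and outcomes accumulated as cumulative incidence — can be sketched as follows. The risk equation, its coefficients, and the cohort parameters are hypothetical stand-ins, not EAGLE's published equations.

```python
import random

# Toy annual complication probability: quadratic in HbA1c, linear in diabetes
# duration. Coefficients are invented for illustration only.
def complication_risk(hba1c, duration_years):
    p = 0.0005 * (hba1c - 6.0) ** 2 + 0.001 * duration_years
    return min(max(p, 0.0), 1.0)

def simulate_cohort(n_patients, n_years, hba1c, seed=0):
    """Microsimulate a virtual cohort in 1-year cycles; return cumulative incidence."""
    rng = random.Random(seed)
    events = 0
    for _ in range(n_patients):
        for year in range(n_years):
            if rng.random() < complication_risk(hba1c, year):
                events += 1
                break  # first event only; patient leaves the at-risk pool
    return events / n_patients

# Better glycemic control should lower the simulated cumulative incidence.
poor = simulate_cohort(10_000, 10, hba1c=9.0)
good = simulate_cohort(10_000, 10, hba1c=7.0)
print(poor, good)
```

Cost calculations would then be applied to the simulated event counts, and internal validation would compare the simulated rates against the source data, as the abstract describes.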
20
Ramsey SD, Burke W, Pinsky L, Clarke L, Newcomb P, Khoury MJ. Family history assessment to detect increased risk for colorectal cancer: conceptual considerations and a preliminary economic analysis. Cancer Epidemiol Biomarkers Prev 2005; 14:2494-500. [PMID: 16284369 PMCID: PMC2692569 DOI: 10.1158/1055-9965.epi-05-0418] [Citation(s) in RCA: 18] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
Abstract
BACKGROUND Although the rationale for earlier screening of persons with a family history of colorectal cancer is plausible, there is no direct evidence that earlier assessment is either effective or cost-effective. OBJECTIVE To estimate the clinical and economic effect of using family history assessment to identify persons for colorectal cancer screening before age 50. METHODS We developed a decision model to compare costs and outcomes for two scenarios: (a) standard population screening starting at age 50; and (b) family history assessment at age 40, followed by screening colonoscopy at age 40 for those with a suggestive family history of colorectal cancer. The analysis was conducted using the health insurer perspective. RESULTS Using U.S. population estimates, 22 million would be eligible for family history assessment, and one million would be eligible for early colonoscopy; 2,834 invasive cancers would be detected, and 29,331 life years would be gained. The initial program cost would be USD 900 million. The discounted cost per life year gained of family history assessment versus no assessment equals USD 58,228. The results were most sensitive to the life expectancy benefit from earlier screening, the cost of colonoscopy, and the relative risk of colon cancer in those with a family history. CONCLUSIONS The cost-effectiveness of family history assessment for colorectal cancer approaches that of other widely accepted technologies; yet, the results are sensitive to several assumptions where better data are needed. Because of the relatively high prevalence of family history in the population, careful analysis and empirical data are needed.
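The "cost per life year gained" figure reported above is an incremental cost-effectiveness ratio (ICER): the difference in discounted costs between two strategies divided by the difference in discounted effectiveness. The numbers in this sketch are hypothetical, not the study's inputs.

```python
# Incremental cost-effectiveness ratio: incremental cost per unit of
# incremental effect (here, per life-year gained).
def icer(cost_new, effect_new, cost_old, effect_old):
    d_effect = effect_new - effect_old
    if d_effect <= 0:
        raise ValueError("new strategy is not more effective; ICER undefined here")
    return (cost_new - cost_old) / d_effect

# Hypothetical per-person discounted values: "old" = population screening
# from age 50; "new" = family-history assessment at 40 plus early colonoscopy
# for those at elevated risk.
cost_per_ly = icer(cost_new=5_200.0, effect_new=19.00,
                   cost_old=3_900.0, effect_old=18.95)
print(round(cost_per_ly))
```

Sensitivity analysis, as in the abstract, amounts to recomputing this ratio while varying the inputs (colonoscopy cost, relative risk, life expectancy benefit) over plausible ranges.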
Affiliation(s)
- Scott D Ramsey
- Fred Hutchinson Cancer Research Center, 1100 Fairview Avenue North M2-B230, P.O. Box 19024, Seattle, WA 98109, USA.
21
Plevritis SK. Decision Analysis and Simulation Modeling for Evaluating Diagnostic Tests on the Basis of Patient Outcomes. AJR Am J Roentgenol 2005; 185:581-90. [PMID: 16120903 DOI: 10.2214/ajr.185.3.01850581] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
22
Mortenson MM, Ho HS, Bold RJ. An analysis of cost and clinical outcome in palliation for advanced pancreatic cancer. Am J Surg 2005; 190:406-11. [PMID: 16105527 DOI: 10.1016/j.amjsurg.2005.03.014] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2004] [Revised: 01/08/2005] [Indexed: 12/11/2022]
Abstract
BACKGROUND The optimal palliative method for patients with unresectable pancreatic cancer remains controversial. METHODS A retrospective chart review evaluated patients who underwent exploration for presumed resectable pancreatic cancer. Cost-based analysis was performed using relative value units (RVUs) that included the initial surgical procedure and any additional procedure required to achieve satisfactory palliation. RESULTS Of 96 patients (1993-2002), 6% had biliary bypass, 42% had duodenal bypass, 40% had double bypass, and 13% had no procedure, with equivalent clinical outcomes. If biliary bypass was not initially performed, there was a significant incidence of biliary complications before definitive endoscopic stenting (P=.01). If duodenal bypass was not initially performed, 11% developed duodenal obstruction (P=.04). Total RVUs were highest for a double bypass and lowest for no initial surgical palliative procedure. CONCLUSIONS Although surgical bypass procedures at initial exploration provide durable palliation, these procedures are associated with greater costs.
Affiliation(s)
- Melinda M Mortenson
- Department of Surgery, University of California, Davis Medical Center, Sacramento, CA 95817, USA
23
Affiliation(s)
- David J Vanness
- Division of Health Care Policy & Research, Mayo Clinic, Rochester, MN, USA