1
Welton NJ, Ades AE. Estimation of Markov Chain Transition Probabilities and Rates from Fully and Partially Observed Data: Uncertainty Propagation, Evidence Synthesis, and Model Calibration. Med Decis Making 2005; 25:633-45. [PMID: 16282214 DOI: 10.1177/0272989x05282637]
Abstract
Markov transition models are frequently used to model disease progression. The authors show how the solution to Kolmogorov’s forward equations can be exploited to map between transition rates and probabilities from probability data in multistate models. They provide a uniform, Bayesian treatment of estimation and propagation of uncertainty of transition rates and probabilities when 1) observations are available on all transitions and exact time at risk in each state (fully observed data) and 2) observations are on initial state and final state after a fixed interval of time but not on the sequence of transitions (partially observed data). The authors show how underlying transition rates can be recovered from partially observed data using Markov chain Monte Carlo methods in WinBUGS, and they suggest diagnostics to investigate inconsistencies between evidence from different starting states. An illustrative example for a 3-state model is given, which shows how the methods extend to more complex Markov models using the software WBDiff to compute solutions. Finally, the authors illustrate how to statistically combine data from multiple sources, including partially observed data at several follow-up times and also how to calibrate a Markov model to be consistent with data from one specific study.
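For a single state with constant competing exit rates, the mapping between rates and interval transition probabilities that the paper builds on has a closed form (the solution of Kolmogorov's forward equations for exits from one state). A stdlib-only sketch with illustrative rates, not the authors' WinBUGS/WBDiff code:

```python
import math

def rates_to_probs(rates, t):
    """Convert competing exponential transition rates out of one state
    into transition probabilities over an interval of length t.
    The probability of any exit is 1 - exp(-R*t) with R the total rate,
    split among destinations in proportion to their rates."""
    total = sum(rates)
    p_leave = 1.0 - math.exp(-total * t)          # P(any transition by t)
    return [r / total * p_leave for r in rates]   # split by relative rate

# Illustrative rates (per year) out of a 'well' state: to 'ill' and to 'dead'.
p_ill, p_dead = rates_to_probs([0.10, 0.05], t=1.0)
p_stay = 1.0 - p_ill - p_dead
```

Recovering rates from partially observed probability data, as in the paper, is the inverse of this mapping.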
Affiliation(s)
- Nicky J Welton
- MRC Health Services Research Collaboration, Bristol, United Kingdom.
2
Hazen GB, Huang M. Large-Sample Bayesian Posterior Distributions for Probabilistic Sensitivity Analysis. Med Decis Making 2006; 26:512-34. [PMID: 16997928 DOI: 10.1177/0272989x06290487]
Abstract
In probabilistic sensitivity analyses, analysts assign probability distributions to uncertain model parameters and use Monte Carlo simulation to estimate the sensitivity of model results to parameter uncertainty. The authors present Bayesian methods for constructing large-sample approximate posterior distributions for probabilities, rates, and relative effect parameters, for both controlled and uncontrolled studies, and discuss how to use these posterior distributions in a probabilistic sensitivity analysis. These results draw on and extend procedures from the literature on large-sample Bayesian posterior distributions and Bayesian random effects meta-analysis. They improve on standard approaches to probabilistic sensitivity analysis by allowing a proper accounting for heterogeneity across studies as well as dependence between control and treatment parameters, while still being simple enough to be carried out on a spreadsheet. The authors apply these methods to conduct a probabilistic sensitivity analysis for a recently published analysis of zidovudine prophylaxis following rapid HIV testing in labor to prevent vertical HIV transmission in pregnant women.
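The flavour of approximation described can be sketched for a single probability parameter: a large-sample normal posterior on the log-odds scale, sampled and back-transformed for use in a PSA. This is a generic sketch under that assumption, not the authors' exact procedure, and the event counts are illustrative:

```python
import math
import random

def psa_prob_samples(events, n, n_draws=10000, seed=1):
    """Approximate posterior draws for a probability parameter:
    normal on the log-odds scale with mean = observed log-odds and
    variance = 1/events + 1/(n - events), mapped back by the inverse logit."""
    rng = random.Random(seed)
    mu = math.log(events / (n - events))
    sd = math.sqrt(1.0 / events + 1.0 / (n - events))
    draws = []
    for _ in range(n_draws):
        logit = rng.gauss(mu, sd)                 # draw on the log-odds scale
        draws.append(1.0 / (1.0 + math.exp(-logit)))  # back to a probability
    return draws

draws = psa_prob_samples(events=12, n=80)   # e.g. 12 events in 80 patients
mean_p = sum(draws) / len(draws)
```

Each Monte Carlo iteration of the PSA would consume one draw per parameter.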
Affiliation(s)
- Gordon B Hazen
- IEMS Department, Northwestern University, Evanston, IL 60208-3119, USA.
3
Bachman D, Nyland J, Krupp R. Reverse-total shoulder arthroplasty cost-effectiveness: A quality-adjusted life years comparison with total hip arthroplasty. World J Orthop 2016; 7:123-127. [PMID: 26925384 PMCID: PMC4757657 DOI: 10.5312/wjo.v7.i2.123]
Abstract
AIM: To compare reverse-total shoulder arthroplasty (RSA) cost-effectiveness with total hip arthroplasty cost-effectiveness.
METHODS: This study used a stochastic model and decision-making algorithm to compare the cost-effectiveness of RSA and total hip arthroplasty. Fifteen patients underwent pre-operative, and 3, 6, and 12 mo post-operative clinical examinations and Short Form-36 Health Survey completion. Short form-36 Health Survey subscale scores were converted to EuroQual Group Five Dimension Health Outcome scores and compared with historical data from age-matched patients who had undergone total hip arthroplasty. Quality-adjusted life year (QALY) improvements based on life expectancies were calculated.
RESULTS: The cost/QALY was $3,900 for total hip arthroplasty and $11,100 for RSA. After adjusting the model to only include shoulder-specific physical function subscale items, the RSA QALY improved to 2.8 years, and its cost/QALY decreased to $8,100.
CONCLUSION: Based on industry accepted standards, cost/QALY estimates supported both RSA and total hip arthroplasty cost-effectiveness. Although total hip arthroplasty remains the quality of life improvement “gold standard” among arthroplasty procedures, cost/QALY estimates identified in this study support the growing use of RSA to improve patient quality of life.
4
Soares MO, Canto E Castro L. Continuous time simulation and discretized models for cost-effectiveness analysis. Pharmacoeconomics 2012; 30:1101-1117. [PMID: 23116289 DOI: 10.2165/11599380-000000000-00000]
Abstract
The design of decision-analytic models for cost-effectiveness analysis has been the subject of discussion. The current work addresses this issue by noting that, when time is to be explicitly modelled, we need to represent phenomena occurring in continuous time. Models evaluated in continuous time may not have closed-form solutions, and in this case, two approximations can be used: simulation models in continuous time and discretized models at the aggregate level. Stylized examples were set up where both approximations could be implemented. These aimed to illustrate determinants of the use of the two approximations: cycle length and precision, the use of continuity corrections in discretized models and the discretization of rates into probabilities. The examples were also used to explore the impact of the approximations not only in terms of absolute survival but also cost effectiveness and incremental comparisons. Discretized models better approximate continuous time results if lower cycle lengths are used. Continuous time simulation models are inherently stochastic, and the precision of the results is determined by the simulation sample size. The use of continuity corrections in discretized models allows the use of greater cycle lengths, producing no significant bias from the discretization. How the process is discretized (the conversion of rates into probabilities) is key. Results show that appropriate discretization coupled with the use of a continuity correction produces results unbiased for higher cycle lengths. Alternative methods of discretization are less efficient, i.e. lower cycle lengths are needed to obtain unbiased results. The developed work showed the importance of acknowledging bias in estimating cost effectiveness. When the alternative approximations can be applied, we argue that it is preferable to implement a cohort discretized model rather than a simulation model in continuous time. 
In practice, however, it may not be possible to represent the decision problem by any conventionally defined discretized model, in which case other model designs need to be applied, e.g. a simulation model.
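The discretization point is easy to demonstrate: converting a constant rate r to a per-cycle probability as p = 1 − exp(−rΔt) reproduces continuous-time survival exactly, whereas the naive p = rΔt is biased for longer cycles. A sketch with illustrative numbers, not the paper's examples:

```python
import math

def survival_discretized(rate, t_end, cycle, naive=False):
    """Cohort survival at t_end from a discretized model.
    Proper discretization uses p = 1 - exp(-rate*cycle) per cycle;
    the 'naive' option uses p = rate*cycle, which overstates the
    per-cycle risk and biases survival downward."""
    p = rate * cycle if naive else 1.0 - math.exp(-rate * cycle)
    alive = 1.0
    for _ in range(int(round(t_end / cycle))):
        alive *= (1.0 - p)   # cohort fraction surviving each cycle
    return alive

exact = math.exp(-0.2 * 10)                                   # continuous time
good  = survival_discretized(0.2, 10, cycle=1.0)              # matches exactly
bad   = survival_discretized(0.2, 10, cycle=1.0, naive=True)  # biased low
```

With time-varying rates the match is no longer exact, which is where cycle length and continuity corrections matter.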
5
Abstract
Cohort analysis is a widespread tool for computing expected costs and quality-adjusted life years (QALYs) in Markov models for medical cost-effectiveness analyses. Although not always explicitly identified, such models commonly have multiple simple factors, or components. In these, a health state consists of a multiple component vector, one component for each factor, and arbitrary combinations of components are possible. The authors show here that when the model does not assume any probabilistic dependence among these factors, then a standard cohort analysis may be decomposed into several independent cohort analyses, one for each factor, and the results may be combined to produce desired expected costs and QALYs. These single-factor cohort analyses are not only simpler but also computationally more efficient. The authors derive the appropriate formulas for this cohort decomposition in discrete time and give several examples of their use based on published cost-effectiveness analyses. Explicitly identifying the simple factors of which a model is composed allows these factors to be portrayed graphically. Graphical depiction of the simple factors that comprise a model reduces model complexity, makes model formulation easier and more transparent, and thereby facilitates peer inspection and critique.
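The decomposition result can be checked numerically: for probabilistically independent factors, the joint chain's transition matrix is the Kronecker product of the factor matrices, so running each factor's cohort analysis separately and combining gives the same distribution as running the joint chain. A stdlib-only sketch with illustrative two-state factors:

```python
def step(dist, P):
    """One cohort step: new_dist[j] = sum_i dist[i] * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(len(P[0]))]

def kron_vec(u, v):
    """Joint distribution over (factor-1 state, factor-2 state) pairs."""
    return [x * y for x in u for y in v]

def kron_mat(A, B):
    """Kronecker product: joint transition matrix of two independent factors."""
    m = len(B)
    return [[A[i // m][k // m] * B[i % m][k % m]
             for k in range(len(A[0]) * len(B[0]))]
            for i in range(len(A) * m)]

# Illustrative independent factors, each a 2-state chain.
A = [[0.9, 0.1],
     [0.0, 1.0]]
B = [[0.7, 0.3],
     [0.2, 0.8]]
a, b = [1.0, 0.0], [1.0, 0.0]

d_joint = kron_vec(a, b)
J = kron_mat(A, B)
for _ in range(5):
    d_joint = step(d_joint, J)   # one big 4-state cohort analysis
    a = step(a, A)               # two small single-factor analyses
    b = step(b, B)
d_combined = kron_vec(a, b)      # combine the factor results
```

The two single-factor analyses touch 2 + 2 states per step instead of 4, which is the source of the computational saving for larger models.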
Affiliation(s)
- Gordon Hazen
- Department of Industrial Engineering and Management Sciences, Northwestern University, Evanston, Illinois
- Zhe Li
- Department of Industrial Engineering and Management Sciences, Northwestern University, Evanston, Illinois
6
Hazen GB, Schwartz A. Incorporating extrinsic goals into decision and cost-effectiveness analyses. Med Decis Making 2009; 29:580-9. [PMID: 19329774 DOI: 10.1177/0272989x09333121]
Abstract
It has not been widely recognized that medical patients as individuals may have goals that are not easily expressed in terms of quality-adjusted life years (QALYs). The QALY model deals with ongoing goals such as reducing pain or maintaining mobility, but goals such as completing an important project or seeing a child graduate from college occur at unique points in time and do not lend themselves to easy expression in terms of QALYs. Such extrinsic goals have been posited as explanations for preferences inconsistent with the QALY model, such as unwillingness to trade away time or accept gambles. In this article, the authors examine methods for including extrinsic goals in medical decision and cost-effectiveness analyses. As illustrations, they revisit 2 previously published analyses, the management of unruptured intracranial arteriovenous malformations (AVMs) and the evaluation of preventive strategies for BRCA + women.
Affiliation(s)
- Gordon B Hazen
- Department of Industrial Engineering and Management Sciences, Northwestern University, Evanston, Illinois 60208, USA.
7
8
Hazen GB, Huang M. Parametric Sensitivity Analysis Using Large-Sample Approximate Bayesian Posterior Distributions. Decision Analysis 2006. [DOI: 10.1287/deca.1060.0078]
9
Abstract
To date, decision trees and Markov models have been the most common methods used in pharmacoeconomic evaluations. Both of these techniques lack the flexibility required to appropriately represent clinical reality. In this paper an alternative, more natural, way to model clinical reality--discrete event simulation--is presented and its application is illustrated with a real-world example. A discrete event simulation represents the course of disease very naturally, with few restrictions. Neither mutually exclusive branches nor states are required, nor is a fixed cycle. All relevant aspects can be incorporated explicitly and efficiently. Flexibility in handling perspectives and carrying out sensitivity analyses, including structural variations, is incorporated and the entire model can be presented very transparently. The main limitations are imposed by lack of data to fit realistic models. Discrete event simulation, though rarely employed in pharmacoeconomics today, should be strongly considered when carrying out economic evaluations, particularly those aimed at informing policy makers and at estimating the budget impact of a pharmaceutical intervention.
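A minimal event-queue sketch shows the mechanics being described: no fixed cycle, event times drawn in continuous time, and patient attributes updated as events fire. All rates, costs, and utilities below are illustrative, and this is not the model from the paper's example:

```python
import heapq
import random

def simulate_patient(rng, horizon=10.0):
    """One patient run of a minimal discrete event simulation: a priority
    queue holds (time, event) pairs, and QALYs/costs accrue between events."""
    events = []
    heapq.heappush(events, (rng.expovariate(0.05), "death"))    # mortality
    heapq.heappush(events, (rng.expovariate(0.30), "relapse"))  # first relapse
    t, qaly, cost, utility = 0.0, 0.0, 0.0, 0.8
    while events:
        when, event = heapq.heappop(events)
        if when >= horizon:
            qaly += utility * (horizon - t)   # accrue QALYs to the horizon
            break
        qaly += utility * (when - t)          # accrue QALYs since last event
        t = when
        if event == "death":
            break
        cost += 5000.0                        # cost of treating the relapse
        utility = 0.6                         # lower quality of life afterwards
        heapq.heappush(events, (t + rng.expovariate(0.30), "relapse"))
    return qaly, cost

rng = random.Random(42)
results = [simulate_patient(rng) for _ in range(1000)]
mean_qaly = sum(q for q, _ in results) / len(results)
mean_cost = sum(c for _, c in results) / len(results)
```

Averaging over many simulated patients gives the expected costs and QALYs that a cohort model would compute analytically.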
Affiliation(s)
- J Jaime Caro
- Caro Research Institute, 336 Baker Avenue, Concord, MA 01742, USA.
10
Bravata DM, McDonald KM, Szeto H, Smith WM, Rydzak C, Owens DK. A conceptual framework for evaluating information technologies and decision support systems for bioterrorism preparedness and response. Med Decis Making 2004; 24:192-206. [PMID: 15090105 DOI: 10.1177/0272989x04263254]
Abstract
OBJECTIVES: The authors sought to develop a conceptual framework for evaluating whether existing information technologies and decision support systems (IT/DSSs) would assist the key decisions faced by clinicians and public health officials preparing for and responding to bioterrorism.
METHODS: They reviewed reports of natural and bioterrorism-related infectious outbreaks, bioterrorism preparedness exercises, and advice from experts to identify the key decisions, tasks, and information needs of clinicians and public health officials during a bioterrorism response. The authors used task decomposition to identify the subtasks and data requirements of IT/DSSs designed to facilitate a bioterrorism response. They used the results of the task decomposition to develop evaluation criteria for IT/DSSs for bioterrorism preparedness. They then applied these evaluation criteria to 341 reports of 217 existing IT/DSSs that could be used to support a bioterrorism response.
MAIN RESULTS: In response to bioterrorism, clinicians must make decisions in 4 critical domains (diagnosis, management, prevention, and reporting to public health), and public health officials must make decisions in 4 other domains (interpretation of bioterrorism surveillance data, outbreak investigation, outbreak control, and communication). The time horizons and utility functions for these decisions differ. From the task decomposition, the authors identified critical subtasks for each of the 8 decisions. For example, interpretation of diagnostic tests is an important subtask of diagnostic decision making that requires an understanding of the tests' sensitivity and specificity. Therefore, an evaluation criterion applied to reports of diagnostic IT/DSSs for bioterrorism asked whether the reports described the systems' sensitivity and specificity. Of the 217 existing IT/DSSs that could be used to respond to bioterrorism, 79 studies evaluated 58 systems for at least 1 performance metric.
CONCLUSIONS: The authors identified 8 key decisions that clinicians and public health officials must make in response to bioterrorism. When applying the evaluation system to 217 currently available IT/DSSs that could potentially support the decisions of clinicians and public health officials, the authors found that the literature provides little information about the accuracy of these systems.
Affiliation(s)
- Dena M Bravata
- Center for Primary Care and Outcomes Research, Stanford University, Stanford, California 94305-6019, USA.
11
Hazen GB. Stochastic trees and the StoTree modeling environment: models and software for medical decision analysis. J Med Syst 2002; 26:399-413. [PMID: 12182205 DOI: 10.1023/a:1016401115823]
Abstract
In this paper we present a review of stochastic trees, a convenient modeling approach for medical treatment decision analyses. Stochastic trees are a generalization of decision trees that incorporate useful features from continuous-time Markov chains. We also discuss StoTree, a freely available software tool for the formulation and solution of stochastic trees, implemented in the Excel spreadsheet environment.
Affiliation(s)
- Gordon B Hazen
- IE/MS Department, Northwestern University, Evanston, Illinois 60208, USA
12
Patten SB, Lee RC. Modeling methods for facilitating decisions in pharmaceutical policy and population therapeutics. Pharmacoepidemiol Drug Saf 2002; 11:165-8. [PMID: 11998542 DOI: 10.1002/pds.706]
Affiliation(s)
- Scott B Patten
- Departments of Community Health Sciences and Psychiatry, University of Calgary, 3330 Hospital Drive NW, Calgary, AB, Canada T2N 4N1.
13
Abstract
Many real-world medical applications require timely actions to be taken in time-pressured situations. Existing approaches to dynamic decision modeling have provided relatively efficient methods for representation and reasoning, but computing the optimal solution has remained intractable. A major reason for this difficulty is the lack of models capable of modeling temporal processes and dealing with time-critical situations. This paper presents a formalism called the time-critical dynamic influence diagram, which provides the capability for both temporal and space abstraction. To deal with time criticality, we exploit the concepts of space and temporal abstraction to reduce the computational complexity and propose an anytime algorithm for the solution process. Throughout the paper, we illustrate the various approaches with a medical problem on the treatment of cardiac arrest.
Affiliation(s)
- Yanping Xiang
- Department of Industrial and Systems Engineering, National University of Singapore, 10 Kent Ridge Crescent, 119260, Singapore
14
15
Cao C, Leong TY, Leong AP, Seow FC. Dynamic decision analysis in medicine: a data-driven approach. Int J Med Inform 1998; 51:13-28. [PMID: 9749896 DOI: 10.1016/s1386-5056(98)00085-9]
Abstract
Dynamic decision analysis concerns decision problems in which both time and uncertainty are explicitly considered. Two major challenges in dynamic decision analysis are the proper formulation of a model for the problem and the effective elicitation of the numerous time-dependent conditional probabilities for the model. Based on a new, general dynamic decision modeling framework called DynaMoL (Dynamic decision Modeling Language), we propose a data-driven approach to addressing these issues. Our approach uses available problem data from large medical databases, guides the decision modeling at a proper level of abstraction and establishes a Bayesian learning method for automatic extraction of the probabilistic parameters. We demonstrate the theoretical implications and practical promises of this new approach to dynamic decision analysis in medicine through a comprehensive case study in the optimal follow-up of patients after curative colorectal cancer surgery.
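One standard way to realize the kind of Bayesian learning described, though not necessarily DynaMoL's own procedure, is a conjugate Dirichlet-multinomial update of each transition-matrix row from observed transition counts. A stdlib-only sketch with illustrative counts:

```python
import random

def posterior_transition_draws(counts, prior=1.0, n_draws=5000, seed=0):
    """Conjugate update for one row of a transition matrix:
    multinomial counts + symmetric Dirichlet(prior) -> Dirichlet posterior,
    sampled here via normalized Gamma draws (a standard Dirichlet sampler)."""
    rng = random.Random(seed)
    alphas = [c + prior for c in counts]
    draws = []
    for _ in range(n_draws):
        g = [rng.gammavariate(a, 1.0) for a in alphas]
        s = sum(g)
        draws.append([x / s for x in g])   # each draw is a probability vector
    return draws

# Illustrative follow-up counts: 60 stayed disease-free, 30 recurred, 10 died.
draws = posterior_transition_draws([60, 30, 10])
post_mean = [sum(d[j] for d in draws) / len(draws) for j in range(3)]
```

The posterior mean approaches the empirical transition frequencies as the counts grow, while the draws themselves carry the parameter uncertainty.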
Affiliation(s)
- C Cao
- Department of Information Systems and Computer Science, National University of Singapore, Singapore
16
Boiney LG, Winkler RL, Sarin RK, Matchar DB. Combining Patient Utility with Health Status Assessment to Improve Medical Decision Making. Journal of Multi-Criteria Decision Analysis 1996. [DOI: 10.1002/(sici)1099-1360(199612)5:4<248::aid-mcda107>3.0.co;2-q]
17
Hazen GB. Factored stochastic trees: a tool for solving complex temporal medical decision models. Med Decis Making 1993; 13:227-36. [PMID: 8412552 DOI: 10.1177/0272989x9301300309]
Abstract
The stochastic tree is a continuous-time version of a Markov-cycle tree, useful for constructing and solving medical decision models in which risks of mortality and morbidity may extend over time. Stochastic trees have advantages over Markov-cycle trees in graphic display and computational solution. Like the decision tree or Markov-cycle tree, stochastic tree models of complex medical decision problems can be too large for convenient graphic formulation and display. This paper introduces the notion of factoring a large stochastic tree into simpler components, each of which may be easily displayed. It also shows how the rollback solution procedure for unfactored stochastic trees may be conveniently adapted to solve factored trees. These concepts are illustrated using published examples from the medical literature.
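The rollback step for a stochastic tree takes a simple form without discounting: a node with utility rate u and competing exit rates r_i accrues u for a mean dwell time of 1/R (R = sum of the r_i), then follows branch i with probability r_i/R. A sketch on an illustrative two-state tree (the node encoding here is mine, not StoTree's):

```python
def rollback(node):
    """Rollback for a stochastic tree (no discounting).
    Nodes are ('state', utility_rate, [(rate, child), ...]) or a
    terminal ('dead', value). Expected QALYs at a state node:
    u/R plus the rate-weighted average of the children's values."""
    if node[0] == "dead":
        return node[1]
    _, u, branches = node
    R = sum(rate for rate, _ in branches)
    return u / R + sum(rate / R * rollback(child) for rate, child in branches)

dead = ("dead", 0.0)
# Illustrative: 'sick' exits only to dead at 0.5/yr; 'well' can fall sick
# (0.1/yr) or die (0.02/yr), with utilities 1.0 while well, 0.6 while sick.
sick = ("state", 0.6, [(0.5, dead)])
well = ("state", 1.0, [(0.1, sick), (0.02, dead)])
expected_qalys = rollback(well)
```

Factoring, as the paper describes, lets each simple subtree be rolled back separately before the results are recombined.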
Affiliation(s)
- G B Hazen
- Department of Industrial Engineering and Management Sciences, Northwestern University, Evanston, Illinois 60208-3119