1
Goldstein BA, Mohottige D, Bessias S, Cary MP. Enhancing Clinical Decision Support in Nephrology: Addressing Algorithmic Bias Through Artificial Intelligence Governance. Am J Kidney Dis 2024:S0272-6386(24)00791-1. [PMID: 38851444] [DOI: 10.1053/j.ajkd.2024.04.008]
Abstract
There has been a steady rise in the use of clinical decision support (CDS) tools to guide nephrology as well as general clinical care. Through guidance set by federal agencies and concerns raised by clinical investigators, there has been an equal rise in understanding whether such tools exhibit algorithmic bias, leading to unfairness. This has spurred the more fundamental question of whether sensitive variables such as race should be included in CDS tools. To answer this question properly, it is necessary to understand how algorithmic bias arises. We break down three sources of bias encountered when using electronic health record data to develop CDS tools: (1) use of proxy variables, (2) observability concerns, and (3) underlying heterogeneity. We discuss how the question of whether to include sensitive variables such as race often hinges more on qualitative considerations than on quantitative analysis, depending on the function the sensitive variable serves. Based on our experience with our own institution's CDS governance group, we show how health system-based governance committees play a central role in guiding these difficult and important considerations. Ultimately, our goal is to foster a community practice of model development and governance teams that emphasizes consciousness about sensitive variables and prioritizes equity.
Affiliation(s)
- Benjamin A Goldstein
- Department of Biostatistics and Bioinformatics, School of Medicine, Duke University, Durham, NC; AI Health, School of Medicine, Duke University, Durham, NC
- Dinushika Mohottige
- Institute for Health Equity Research, Department of Population Health, Icahn School of Medicine at Mount Sinai, New York, NY; Barbara T. Murphy Division of Nephrology, Department of Medicine, Icahn School of Medicine at Mount Sinai, New York, NY
- Sophia Bessias
- AI Health, School of Medicine, Duke University, Durham, NC
- Michael P Cary
- AI Health, School of Medicine, Duke University, Durham, NC; School of Nursing, Duke University, Durham, NC
2
Cary MP, Zink A, Wei S, Olson A, Yan M, Senior R, Bessias S, Gadhoumi K, Jean-Pierre G, Wang D, Ledbetter LS, Economou-Zavlanos NJ, Obermeyer Z, Pencina MJ. Mitigating Racial And Ethnic Bias And Advancing Health Equity In Clinical Algorithms: A Scoping Review. Health Aff (Millwood) 2023; 42:1359-1368. [PMID: 37782868] [PMCID: PMC10668606] [DOI: 10.1377/hlthaff.2023.00553]
Abstract
In August 2022, the Department of Health and Human Services (HHS) issued a notice of proposed rulemaking prohibiting covered entities, which include health care providers and health plans, from discriminating against individuals when using clinical algorithms in decision making. However, HHS did not provide specific guidelines on how covered entities should prevent discrimination. We conducted a scoping review of literature published during the period 2011-22 to identify health care applications, frameworks, reviews and perspectives, and assessment tools that identify and mitigate bias in clinical algorithms, with a specific focus on racial and ethnic bias. Our scoping review encompassed 109 articles comprising 45 empirical health care applications that included tools tested in health care settings, 16 frameworks, and 48 reviews and perspectives. We identified a wide range of technical, operational, and systemwide bias mitigation strategies for clinical algorithms, but there was no consensus in the literature on a single best practice that covered entities could employ to meet the HHS requirements. Future research should identify optimal bias mitigation methods for various scenarios, depending on factors such as patient population, clinical setting, algorithm design, and the types of bias to be addressed.
Affiliation(s)
- Michael P Cary
- Michael P. Cary Jr., Duke University, Durham, North Carolina
- Anna Zink
- Anna Zink, University of Chicago, Chicago, Illinois
- Sijia Wei
- Sijia Wei, Northwestern University, Chicago, Illinois
- Ziad Obermeyer
- Ziad Obermeyer, University of California Berkeley, Berkeley, California
3
Abstract
OBJECTIVES Through a scoping review, we examine the ways in which health equity has been promoted in clinical research informatics with patient implications, focusing on work published in 2021 (with some from 2022). METHODS A scoping review was conducted using the methods described in the Joanna Briggs Institute Manual. The review process consisted of five stages: 1) development of the aim and research question, 2) literature search, 3) literature screening and selection, 4) data extraction, and 5) accumulation and reporting of results. RESULTS Of the 478 papers identified in 2021 on the topic of clinical research informatics with a focus on health equity as a patient implication, 8 met our inclusion criteria. All included papers focused on artificial intelligence (AI) technology. The papers addressed health equity in clinical research informatics either by exposing inequity in AI-based solutions or by using AI as a tool for promoting health equity in the delivery of healthcare services. While algorithmic bias poses a risk to health equity within AI-based solutions, AI has also uncovered inequity in traditional treatment and demonstrated effective complements and alternatives that promote health equity. CONCLUSIONS Clinical research informatics with implications for patients still faces challenges of an ethical nature and of clinical value. However, used prudently, for the right purpose in the right context, clinical research informatics could provide powerful tools for advancing health equity in patient care.
4
Bedoya AD, Economou-Zavlanos NJ, Goldstein BA, Young A, Jelovsek JE, O'Brien C, Parrish AB, Elengold S, Lytle K, Balu S, Huang E, Poon EG, Pencina MJ. A framework for the oversight and local deployment of safe and high-quality prediction models. J Am Med Inform Assoc 2022; 29:1631-1636. [PMID: 35641123] [DOI: 10.1093/jamia/ocac078]
Abstract
Artificial intelligence/machine learning models are being rapidly developed and used in clinical practice. However, many models are deployed without a clear understanding of clinical or operational impact and frequently lack monitoring plans that can detect potential safety signals. There is a lack of consensus in establishing governance to deploy, pilot, and monitor algorithms within operational healthcare delivery workflows. Here, we describe a governance framework that combines current regulatory best practices and lifecycle management of predictive models being used for clinical care. Since January 2021, we have successfully added models to our governance portfolio and are currently managing 52 models.
Affiliation(s)
- Armando D Bedoya
- Department of Medicine, Duke University, Durham, North Carolina, USA; Duke University Health System, Durham, North Carolina, USA
- Benjamin A Goldstein
- Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, North Carolina, USA
- Allison Young
- Duke University School of Medicine, Durham, North Carolina, USA
- J Eric Jelovsek
- Department of Obstetrics and Gynecology, Duke University, Durham, North Carolina, USA
- Cara O'Brien
- Department of Medicine, Duke University, Durham, North Carolina, USA; Duke University Health System, Durham, North Carolina, USA
- Scott Elengold
- Office of Counsel, Duke University, Durham, North Carolina, USA
- Kay Lytle
- Duke University Health System, Durham, North Carolina, USA
- Suresh Balu
- Duke Institute for Health Innovation, Durham, North Carolina, USA
- Erich Huang
- Department of Medicine, Duke University, Durham, North Carolina, USA; Duke University Health System, Durham, North Carolina, USA
- Eric G Poon
- Department of Medicine, Duke University, Durham, North Carolina, USA; Duke University Health System, Durham, North Carolina, USA; Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, North Carolina, USA
- Michael J Pencina
- Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, North Carolina, USA; Duke AI Health, Duke University School of Medicine, Durham, North Carolina, USA