1
Purysko AS, Zacharias-Andrews K, Tomkins KG, Turkbey IB, Giganti F, Bhargavan-Chatfield M, Larson DB. Improving Prostate MR Image Quality in Practice - Initial results from the ACR Prostate MR Image Quality Improvement Collaborative. J Am Coll Radiol 2024:S1546-1440(24)00416-2. PMID: 38729590. DOI: 10.1016/j.jacr.2024.04.008.
Abstract
OBJECTIVE: Variability in prostate MRI quality is an increasingly recognized problem that negatively affects patient care. This report describes the results and key learnings of the first cohort of the ACR Learning Network Prostate MR Image Quality Improvement Collaborative.
METHODS: Teams from five organizations in the U.S. were trained on a structured improvement method. After reaching a consensus on image quality and auditing their images using the Prostate Imaging Quality (PI-QUAL) system, teams conducted a current-state analysis to identify barriers to obtaining high-quality images. Through plan-do-study-act cycles involving frontline staff, each site designed and tested interventions targeting key drivers of image quality. The percentage of exams meeting quality criteria (i.e., PI-QUAL score ≥ 4) was plotted on a run chart, and project progress was reviewed in weekly meetings. At the collaborative level, the goal was to increase the percentage of exams with PI-QUAL ≥ 4 to at least 85%.
RESULTS: Across 2,380 exams audited, the mean weekly rate of prostate MR exams meeting image quality criteria increased from 67% (range: 60-74%) at baseline to 87% (range: 80-97%) upon program completion. The most commonly employed interventions were MR protocol adjustments, development and implementation of patient preparation instructions, personnel training, and development of an auditing mechanism.
CONCLUSION: A Learning Network model, in which organizations share knowledge and work together toward a common goal, can improve prostate MR image quality at multiple sites simultaneously. The inaugural cohort's key learnings provide a roadmap for improvement on a broader scale.
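The auditing step described in the methods reduces to simple arithmetic: each exam's PI-QUAL score is compared with the ≥ 4 threshold and aggregated by week to produce the points on the run chart. A minimal sketch follows; the data layout and function name are illustrative, not the collaborative's actual tooling.

```python
from collections import defaultdict

def weekly_quality_rates(exams, threshold=4):
    """exams: iterable of (week, pi_qual_score) pairs.
    Returns {week: percentage of exams with score >= threshold},
    i.e., the value plotted for each point on the run chart."""
    totals = defaultdict(int)
    passing = defaultdict(int)
    for week, score in exams:
        totals[week] += 1
        if score >= threshold:
            passing[week] += 1
    return {w: 100.0 * passing[w] / totals[w] for w in sorted(totals)}

# Illustrative audit data: (week number, PI-QUAL score)
audit = [(1, 3), (1, 4), (1, 5), (2, 4), (2, 4), (2, 2), (2, 5)]
rates = weekly_quality_rates(audit)
```

Each weekly rate would then be plotted against the collaborative's 85% goal line.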
Affiliations
- Andrei S Purysko
- Section of Abdominal Imaging, Imaging Institute, Cleveland Clinic, Cleveland, OH, USA.
- Ismail Baris Turkbey
- Molecular Imaging Program, National Cancer Institute, Bethesda, MD, USA.
- Francesco Giganti
- Department of Radiology, University College London Hospital NHS Foundation Trust, London, UK; Division of Surgery & Interventional Science, University College London, London, UK.
- David B Larson
- Department of Radiology, Stanford University School of Medicine, Stanford, CA, USA.
2
Larson DB, Doo FX, Allen B, Mongan J, Flanders AE, Wald C. Proceedings from the 2022 ACR-RSNA Workshop on Safety, Effectiveness, Reliability, and Transparency in AI. J Am Coll Radiol 2024:S1546-1440(24)00137-6. PMID: 38354844. DOI: 10.1016/j.jacr.2024.01.024.
Abstract
Despite the surge in AI development for healthcare applications, particularly medical imaging, there has been limited adoption of such AI tools into clinical practice. During a one-day workshop in November 2022, co-organized by the American College of Radiology (ACR) and the Radiological Society of North America (RSNA), participants outlined experiences and problems with implementing AI in clinical practice, defined the needs of various stakeholders in the AI ecosystem, and elicited potential solutions and strategies related to the safety, effectiveness, reliability, and transparency of AI algorithms. Participants included radiologists from academic and community radiology practices, informatics leaders responsible for AI implementation, regulatory agency employees, and specialty society representatives. The major themes that emerged fell into two categories: 1) AI product development and 2) implementation of AI-based applications in clinical practice. In particular, participants highlighted key aspects of AI product development, including clear clinical task definitions; well-curated data from diverse geographic, economic, and healthcare settings; standards and mechanisms to monitor model reliability; and transparency regarding model performance in both controlled and real-world settings. For implementation, participants emphasized the need for strong institutional governance; systematic evaluation, selection, and validation methods carried out by local teams; seamless integration into the clinical workflow; performance monitoring and support by local teams; performance monitoring by external entities; and alignment of incentives through credentialing and reimbursement. Participants predicted that clinical implementation of AI in radiology will remain limited until the safety, effectiveness, reliability, and transparency of such tools are more fully addressed.
Affiliations
- David B Larson
- Department of Radiology, Stanford University Medical Center, Stanford, CA.
- Florence X Doo
- University of Maryland Medical Intelligent Imaging (UM2ii) Center, Baltimore, MD.
- Bibb Allen
- Department of Radiology, Grandview Medical Center, Birmingham, AL.
- John Mongan
- Department of Radiology and Biomedical Imaging, University of California San Francisco, San Francisco, CA.
- Adam E Flanders
- Department of Radiology, Thomas Jefferson University, Philadelphia, PA.
- Christoph Wald
- Department of Radiology, Lahey Hospital and Medical Center, Boston, MA.
3
Woolen SA, Larson DB, Lewis GM, Malik K, Foster CA, Martin MF, Maturen KE. Leveraging Quality Improvement Principles for Radiology Sustainability: Bridging Advocacy and Action. J Am Coll Radiol 2024;21:234-238. PMID: 37956883. DOI: 10.1016/j.jacr.2023.11.008.
Affiliations
- Sean A Woolen
- Department of Radiology and Biomedical Imaging, University of California, San Francisco, San Francisco, California
- David B Larson
- Senior Vice Chair for Strategy and Clinical Operations, Department of Radiology, Stanford University, Stanford, California; Chair of the ACR Commission on Quality and Safety; and ABR Trustee for Quality and Safety
- Geoffrey M Lewis
- School for Environment and Sustainability, University of Michigan, Ann Arbor, Michigan
- Konrad Malik
- Department of Radiology, Michigan Medicine, Ann Arbor, Michigan
- Colby A Foster
- Department of Radiology, Michigan Medicine, Ann Arbor, Michigan
- Marisa F Martin
- Department of Radiology, Michigan Medicine, Ann Arbor, Michigan
- Katherine E Maturen
- Department of Radiology, Michigan Medicine, Ann Arbor, Michigan; Department of Obstetrics & Gynecology, Michigan Medicine, Ann Arbor, Michigan; Associate Chair for Ambulatory Care and Strategy, Michigan Radiology; and ABR Trustee for Abdominal Imaging.
4
Larson DB. A Vision for Global CT Radiation Dose Optimization. J Am Coll Radiol 2024:S1546-1440(24)00120-0. PMID: 38302037. DOI: 10.1016/j.jacr.2024.01.014.
Abstract
The topic of CT radiation dose management is receiving renewed attention since the recent approval by CMS of new CT dose measures. Widespread variation in CT dose persists in practices across the world, suggesting that current dose optimization techniques are lacking. The author outlines a proposed strategy for facilitating global CT radiation dose optimization. CT radiation dose optimization can be defined as the routine use of CT scan parameters that consistently produce images just above the minimum threshold of acceptable image quality for a given clinical indication, accounting for relevant patient characteristics, using the most dose-efficient techniques available on the scanner. To accomplish this, an image quality-based target dose must be established for every protocol; for nonhead CT applications, these target dose values must be expressed as a function of patient size. As variation in outcomes is reduced, the dose targets can be decreased to more closely approximate the minimum image quality threshold. Maintaining CT radiation dose optimization requires a process control program, including measurement, evaluation, feedback, and control. This is best accomplished by local teams made up of radiologists, medical physicists, and technologists, supported with protected time and needed tools, including analytics and protocol management applications. Other stakeholders critical to facilitating CT radiation dose management include researchers, funding agencies, industry, regulators, accreditors, payers, and the ACR. Analogous coordinated approaches have transformed quality in other industries and can be the mechanism for achieving the universal goal of CT radiation dose optimization.
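The measurement-and-evaluation step of the process control loop described above can be sketched as a size-adjusted dose check. This is a hypothetical illustration only: the exponential target function, its coefficients, and the function names are placeholders, not published dose targets or the author's method.

```python
import math

def target_ctdivol(water_eq_diameter_cm, a=0.9, b=0.072):
    """Hypothetical size-based target: the CTDIvol target grows with
    patient size. Coefficients a and b are illustrative placeholders."""
    return a * math.exp(b * water_eq_diameter_cm)

def evaluate_exam(ctdivol_mgy, water_eq_diameter_cm, tolerance=0.25):
    """Compare a measured CTDIvol against its size-specific target band."""
    target = target_ctdivol(water_eq_diameter_cm)
    if ctdivol_mgy > target * (1 + tolerance):
        return "above target"   # candidate for protocol review (control step)
    if ctdivol_mgy < target * (1 - tolerance):
        return "below target"   # verify image quality is still acceptable
    return "within target"
```

In a real program, exams flagged "above target" would feed the evaluation and feedback steps handled by the local radiologist-physicist-technologist team.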
Affiliations
- David B Larson
- Executive Vice Chair, Department of Radiology, Stanford University School of Medicine, Stanford, California; and Chair, ACR Commission on Quality and Safety.
5
Ng MY, Youssef A, Miner AS, Sarellano D, Long J, Larson DB, Hernandez-Boussard T, Langlotz CP. Perceptions of Data Set Experts on Important Characteristics of Health Data Sets Ready for Machine Learning: A Qualitative Study. JAMA Netw Open 2023;6:e2345892. PMID: 38039004. PMCID: PMC10692863. DOI: 10.1001/jamanetworkopen.2023.45892.
Abstract
Importance: The lack of data quality frameworks to guide the development of artificial intelligence (AI)-ready data sets limits their usefulness for machine learning (ML) research in health care and hinders the diagnostic excellence of developed clinical AI applications for patient care.
Objective: To discern what constitutes high-quality and useful data sets for health and biomedical ML research purposes, according to subject matter experts.
Design, Setting, and Participants: This qualitative study interviewed data set experts, particularly data set creators and ML researchers. Semistructured interviews were conducted in English, remotely through a secure video conferencing platform, between August 23, 2022, and January 5, 2023. A total of 93 experts were invited to participate; 20 were enrolled and interviewed. Through purposive sampling, the experts were affiliated with a diverse representation of 16 health data sets/databases across organizational sectors. Content analysis was used to evaluate survey information, and thematic analysis was used to analyze interview data.
Main Outcomes and Measures: Data set experts' perceptions of what makes data sets AI ready.
Results: Participants included 20 data set experts (11 [55%] men; mean [SD] age, 42 [11] years), all of whom were health data set creators; 18 of the 20 were also ML researchers. Themes (3 main and 11 subthemes) were identified and integrated into an AI-readiness framework to show their associations within the health data ecosystem. Participants partially determined the AI readiness of data sets using priority appraisal elements of accuracy, completeness, consistency, and fitness. Ethical acquisition and societal impact emerged as appraisal considerations that had not been described in prior data quality frameworks. Factors that drive the creation of high-quality health data sets and mitigate the risks associated with data reuse in ML research were also relevant to AI readiness. The state of data availability, data quality standards, documentation, team science, and incentivization were associated with elements of AI readiness and the overall perception of data set usefulness.
Conclusions and Relevance: In this qualitative study of data set experts, participants contributed to the development of a grounded framework for AI data set quality. Data set AI readiness required the concerted appraisal of many elements and the balancing of transparency and ethical reflection against pragmatic constraints. The movement toward more reliable, relevant, and ethical AI and ML applications for patient care will inevitably require strategic updates to data set creation practices.
Affiliations
- Madelena Y. Ng
- Department of Medicine (Biomedical Informatics), Stanford University School of Medicine, Stanford, California; Department of Biomedical Data Science, Stanford University School of Medicine, Stanford, California
- Alaa Youssef
- Department of Radiology, Stanford University School of Medicine, Stanford, California
- Adam S. Miner
- Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, Stanford, California
- Daniela Sarellano
- Department of Radiology, Stanford University School of Medicine, Stanford, California
- Jin Long
- Department of Pediatrics, Stanford University School of Medicine, Stanford, California
- David B. Larson
- Department of Radiology, Stanford University School of Medicine, Stanford, California
- Tina Hernandez-Boussard
- Department of Medicine (Biomedical Informatics), Stanford University School of Medicine, Stanford, California; Department of Biomedical Data Science, Stanford University School of Medicine, Stanford, California
- Curtis P. Langlotz
- Department of Medicine (Biomedical Informatics), Stanford University School of Medicine, Stanford, California; Department of Biomedical Data Science, Stanford University School of Medicine, Stanford, California; Department of Radiology, Stanford University School of Medicine, Stanford, California
6
Shen L, Lobo VE, Cordova D, Larson DB, Kamaya A. Establishing a Point-of-Care Ultrasound Program: An Institutional Approach for Developing a Point-of-Care Ultrasound Program Infrastructure. J Am Coll Radiol 2023:S1546-1440(23)00941-9. PMID: 37984768. DOI: 10.1016/j.jacr.2023.10.026.
Abstract
Point-of-care ultrasound (POCUS) is rapidly gaining adoption and applications outside the traditional realm of diagnostic radiology departments. Although the distributed use of this imaging technology has great potential, it poses many challenges. To address these challenges, the authors developed an enterprise-wide POCUS program at their institution (Stanford Health Care). Here, they share their experience, governance organization, and approaches to device and information security, training, and quality assurance, along with the basic principles that guide their approach to managing these challenges. Through this work, the authors have learned that a foundational framework, one that defines POCUS and the different levels of POCUS use and delineates program management elements, is critical. The authors hope their experience will be helpful to others who are interested in POCUS or are in the process of creating POCUS programs at their institutions. With a clearly established framework, patient safety and quality of care are improved for everyone.
Affiliations
- Luyao Shen
- Codirector of Point-of-Care Ultrasound, Department of Radiology, Stanford University School of Medicine, Stanford, California.
- Viveta E Lobo
- Codirector of Point-of-Care Ultrasound, Department of Emergency Medicine, Stanford University School of Medicine, Stanford, California
- Dorothy Cordova
- Program Manager of Point-of-Care Ultrasound, Imaging Services, Stanford Health Care, Stanford, California
- David B Larson
- Senior Vice Chair for Strategy and Clinical Operations, Department of Radiology, Stanford University School of Medicine, Stanford, California
- Aya Kamaya
- Chief of Body Imaging, Director of Ultrasound, Department of Radiology, Stanford University School of Medicine, Stanford, California
7
Larson DB, Flemming DJ, Barr RM, Canon CL, Morgan DE. Redesign of the American Board of Radiology Diagnostic Radiology Certifying Examination. AJR Am J Roentgenol 2023;221:687-693. PMID: 37315014. DOI: 10.2214/ajr.23.29585.
Abstract
On April 13, 2023, the American Board of Radiology (ABR) announced plans to replace the current computer-based diagnostic radiology (DR) certifying examination with a new oral examination to be administered remotely, beginning in 2028. This article describes the planned changes and the process that led to those changes. In keeping with its commitment to continuous improvement, the ABR gathered input regarding the DR initial certification process. Respondents generally agreed that the qualifying (core) examination was satisfactory but expressed concerns regarding the computer-based certifying examination's effectiveness and impact on training. Examination redesign was conducted using input from key groups with a goal of effectively evaluating competence and incentivizing study behaviors that best prepare candidates for radiology practice. Major design elements included examination structure, breadth and depth of content, and timing. The new oral examination will focus on critical findings as well as common and important diagnoses routinely encountered in all diagnostic specialties, including radiology procedures. Candidates will first be eligible for the examination in the calendar year after residency graduation. Additional details will be finalized and announced in coming years. The ABR will continue to engage with interested parties throughout the implementation process.
Affiliations
- David B Larson
- Department of Radiology, Stanford University School of Medicine, 453 Quarry Rd, Mail Code 5659, Stanford, CA 94304
- Donald J Flemming
- Department of Radiology, Penn State Health Milton S. Hershey Medical Center, Hershey, PA
- Cheri L Canon
- Department of Radiology, University of Alabama at Birmingham, Birmingham, AL
- Desiree E Morgan
- Department of Radiology, University of Alabama at Birmingham, Birmingham, AL
8
Purysko AS, Tempany C, Macura KJ, Turkbey B, Rosenkrantz AB, Gupta RT, Attridge L, Hernandez D, Garcia-Tomkins K, Bhargavan-Chatfield M, Weinreb J, Larson DB. American College of Radiology initiatives on prostate magnetic resonance imaging quality. Eur J Radiol 2023;165:110937. PMID: 37352683. PMCID: PMC10461171. DOI: 10.1016/j.ejrad.2023.110937.
Abstract
Magnetic resonance imaging (MRI) has become integral to diagnosing and managing patients with suspected or confirmed prostate cancer. However, the benefits of utilizing MRI can be hindered by quality issues during imaging acquisition, interpretation, and reporting. As the utilization of prostate MRI continues to increase in clinical practice, the variability in MRI quality and how it can negatively impact patient care have become apparent. The American College of Radiology (ACR) has recognized this challenge and developed several initiatives to address the issue of inconsistent MRI quality and ensure that imaging centers deliver high-quality patient care. These initiatives include the Prostate Imaging Reporting and Data System (PI-RADS), developed in collaboration with an international panel of experts and members of the European Society of Urogenital Radiology (ESUR), the Prostate MR Image Quality Improvement Collaborative, which is part of the ACR Learning Network, the ACR Prostate Cancer MRI Center Designation, and the ACR Appropriateness Criteria. In this article, we will discuss the importance of these initiatives in establishing quality assurance and quality control programs for prostate MRI and how they can improve patient outcomes.
Affiliations
- Andrei S Purysko
- Section of Abdominal Imaging and Nuclear Radiology Department, Imaging Institute, Cleveland Clinic, Cleveland, OH, USA.
- Clare Tempany
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA
- Katarzyna J Macura
- The Russell H. Morgan Department of Radiology and Radiological Science, The James Buchanan Brady Urological Institute, Johns Hopkins University, Baltimore, MD, USA
- Baris Turkbey
- Molecular Imaging Branch, NCI, NIH, Bethesda, MD, USA
- Rajan T Gupta
- Departments of Radiology and Surgery and Duke Cancer Institute Center for Prostate and Urologic Cancers, Duke University Medical Center, Durham, NC, USA
- Jeffrey Weinreb
- Department of Radiology, Yale School of Medicine, New Haven, CT, USA
- David B Larson
- Department of Radiology, Stanford University, Stanford, CA, USA
9
Lockhart ME, Larson DB, Thorwarth WT. Response to: American College of Radiology Appropriateness Criteria®: A bibliometric analysis of panel members. Insights Imaging 2023;14:131. PMID: 37466743. DOI: 10.1186/s13244-023-01457-y.
Affiliations
- Mark E Lockhart
- University of Alabama at Birmingham, 1720 University Blvd, Birmingham, AL, 35294, USA.
- David B Larson
- Department of Radiology, Stanford University School of Medicine, 453 Quarry Rd, Mail Code 5659, Stanford, CA, 94305, USA
- William T Thorwarth
- American College of Radiology, 1892 Preston White Dr, Reston, VA, 20191, USA
10
Hwang GL, Vilendrer S, Amano A, Brown-Johnson C, Kling SM, Faust A, Willis MH, Larson DB. From Acceptable to Superlative: Scaling a Technologist Coaching Intervention to Improve Image Quality. J Am Coll Radiol 2023. DOI: 10.1016/j.jacr.2022.10.007.
11
Larson DB, Tomkins KG, Zacharias-Andrews K, Irani N, Pittman SM, Purysko AS, Wandtke B, Bhargavan-Chatfield M. The ACR Learning Network: Facilitating Local Performance Improvement Through Shared Learning. J Am Coll Radiol 2023;20:369-376. PMID: 36922112. DOI: 10.1016/j.jacr.2023.01.004.
Abstract
PURPOSE: The ACR Learning Network was established to test the viability of the learning network model in radiology. In this report, the authors review the learning network concept, introduce the ACR Learning Network and its components, and report progress to date and plans for the future.
METHODS: Patterned after institutional programs developed by the principal investigator, the ACR Learning Network was composed of four distinct improvement collaboratives. Initial participating sites were solicited through broad program advertisement. Candidate programs were selected on the basis of assessments of local leadership support, experience with quality improvement initiatives, intraorganizational relationships, and access to data and analytic support. Participation began with a 27-week formal quality improvement training and project support program, with local teams reporting weekly progress on a common performance measure.
RESULTS: Four improvement collaborative topics were chosen for the initial cohort, with the following numbers of participating sites: mammography positioning (6), prostate MR image quality (6), lung cancer screening (6), and follow-up on recommendations for management of incidental findings (4). To date, all sites have remained actively engaged and have progressed as expected. A detailed report of the results of the improvement phase will be provided in a future publication.
CONCLUSIONS: To date, the ACR Learning Network has achieved the planned milestones outlined in the program's plan, with preparation under way for the second and third cohorts. By providing a shared platform for improvement training and knowledge sharing, the authors are optimistic that the network may facilitate widespread performance improvement in radiology on a number of topics for years to come.
Affiliations
- David B Larson
- Senior Vice Chair, Strategy and Clinical Operations, Department of Radiology, Stanford University School of Medicine, Stanford, California; and Chair, ACR Commission on Quality and Safety.
- Neville Irani
- Healthcare Quality Improvement Platform, Leawood, Kansas
- Sarah M Pittman
- Department of Radiology, Stanford University School of Medicine, Stanford, California
- Andrei S Purysko
- Section of Abdominal Imaging, Imaging Institute, Cleveland Clinic, Cleveland, Ohio
- Ben Wandtke
- Department of Imaging Sciences, University of Rochester Medical Center, Rochester, New York
- Mythreyi Bhargavan-Chatfield
- Executive Vice President for Quality and Safety, American College of Radiology, Reston, Virginia
12
Affiliations
- David B Larson
- From the Department of Radiology, Stanford University School of Medicine, 453 Quarry Rd, Mail Code 5659, Stanford, CA 94304
13
Daye D, Wiggins WF, Lungren MP, Alkasab T, Kottler N, Allen B, Roth CJ, Bizzo BC, Durniak K, Brink JA, Larson DB, Dreyer KJ, Langlotz CP. Implementation of Clinical Artificial Intelligence in Radiology: Who Decides and How? Radiology 2022;305:555-563. PMID: 35916673. PMCID: PMC9713445. DOI: 10.1148/radiol.212151.
Abstract
As the role of artificial intelligence (AI) in clinical practice evolves, governance structures oversee the implementation, maintenance, and monitoring of clinical AI algorithms to enhance quality, manage resources, and ensure patient safety. This article establishes a framework for the infrastructure required for clinical AI implementation and presents a road map for governance. The road map answers four key questions: Who decides which tools to implement? What factors should be considered when assessing an application for implementation? How should applications be implemented in clinical practice? Finally, how should tools be monitored and maintained after clinical implementation? Among the many challenges to implementing AI in clinical practice, devising flexible governance structures that can quickly adapt to a changing environment will be essential to ensuring quality patient care and meeting practice improvement objectives.
Collapse
Affiliation(s)
- Dania Daye
- From the Department of Radiology, Massachusetts General Hospital,
Harvard Medical School, 55 Fruit St, GRB 297, Boston, MA 02155 (D.D., T.A.,
B.C.B., K.D., J.A.B., K.J.D.); Department of Radiology, Duke University, Durham,
NC (W.F.W., C.J.R.); Department of Radiology, Stanford University, Stanford,
Calif (M.P.L., D.B.L., C.P.L.); Radiology Partners, El Segundo, Calif (N.K.);
and Department of Radiology, Grandview Medical Center, Birmingham, Ala
(B.A.)
| | - Walter F. Wiggins
| | - Matthew P. Lungren
| | - Tarik Alkasab
| | - Nina Kottler
| | - Bibb Allen
| | - Christopher J. Roth
| | - Bernardo C. Bizzo
| | - Kimberly Durniak
| | - James A. Brink
| | - David B. Larson
- (Each of the above shares the affiliation block listed for the first author.)
|
14
|
Daye D, Wiggins WF, Lungren MP, Alkasab T, Kottler N, Allen B, Roth CJ, Bizzo BC, Durniak K, Brink JA, Larson DB, Dreyer KJ, Langlotz CP. Implementation of Clinical Artificial Intelligence in Radiology: Who Decides and How? Radiology 2022; 305:E62. [PMID: 36154286; DOI: 10.1148/radiol.229021]
|
15
|
Vilendrer S, Saliba‐Gustafsson EA, Asch SM, Brown‐Johnson CG, Kling SM, Shaw JG, Winget M, Larson DB. Evaluating clinician‐led quality improvement initiatives: A system‐wide embedded research partnership at Stanford Medicine. Learn Health Syst 2022; 6:e10335. [PMID: 36263267; PMCID: PMC9576232; DOI: 10.1002/lrh2.10335] Open access.
Abstract
Introduction Many healthcare delivery systems have developed clinician‐led quality improvement (QI) initiatives, but fewer have also developed in‐house evaluation units. Engagement between the two entities creates unique opportunities. Stanford Medicine funded a collaboration between its Improvement Capability Development Program (ICDP), which coordinates and incentivizes clinician‐led QI efforts, and the Evaluation Sciences Unit (ESU), a multidisciplinary group of embedded researchers with expertise in implementation and evaluation sciences. Aim To describe the ICDP‐ESU partnership and report key learnings from the first 2 years of operation (September 2019 to August 2021). Methods Department‐level physician and operational QI leaders were offered an ESU consultation to workshop the design, methods, and overall scope of their annual QI projects. A steering committee of high‐level stakeholders from operational, clinical, and research perspectives subsequently selected three projects for in‐depth partnered evaluation with the ESU based on evaluability, importance to the health system, and broader relevance. Selected project teams met regularly with the ESU to develop mixed methods evaluations informed by relevant implementation science frameworks, while aligning the evaluation approach with the clinical teams' QI goals. Results Sixty and 62 ICDP projects were initiated during the two cycles, respectively, across 18 departments, of which the ESU consulted with 15 (83%). Within each annual cycle, evaluators made actionable, summative findings rapidly available to partners to inform ongoing improvement. Other reported benefits of the partnership included rapid adaptation to COVID‐19 needs, expanded clinician evaluation skills, external knowledge dissemination through scholarship, and health system‐wide knowledge exchange. Ongoing considerations for improving the collaboration included the need for multi‐year support to enable a nimble response to dynamic health system needs and timely data access. Conclusion Presence of embedded evaluation partners in the enterprise‐wide QI program supported identification of analogous endeavors (eg, telemedicine adoption) and cross‐cutting lessons across QI efforts, clinician capacity building, and knowledge dissemination through scholarship.
Affiliation(s)
- Stacie Vilendrer
- Department of Medicine, Division of Primary Care and Population Health Stanford University School of Medicine California USA
| | - Erika A. Saliba‐Gustafsson
| | - Steven M. Asch
| | - Cati G. Brown‐Johnson
| | - Samantha M.R. Kling
| | - Jonathan G. Shaw
| | - Marcy Winget
- (Each of the above shares the affiliation listed for the first author.)
| | - David B. Larson
- Department of Radiology Stanford University School of Medicine California USA
| |
|
16
|
Larson DB, Krishnaraj A, Mendelson DS, Langlotz CP, Wald C. Moving Toward Seamless Interinstitutional Electronic Image Transfer. J Am Coll Radiol 2022; 19:460-468. [PMID: 35114138; DOI: 10.1016/j.jacr.2021.11.017]
Abstract
The fact that medical images are still predominantly exchanged between institutions via physical media is unacceptable in the era of value-driven health care. Although better solutions are technically possible, problems of coordination and market dynamics may be inhibiting progress more than technical factors. We provide a macrosystem analysis of the problem of interinstitutional medical image exchange and propose a strategy for nudging the market toward a patient-friendly solution. The system can be viewed as a network, with autonomous nodes interconnected by links through which information is exchanged. Potential network configurations include those that depend on individual carriers, peer-to-peer links, one or multiple hubs, or a hybrid of these models. We find the linked multihub model, in which individual institutions are connected to other institutions via image exchange companies, to be the configuration most likely to create a patient-friendly electronic image exchange system. To achieve this configuration, image exchange companies, which operate in a competitive marketplace, must exchange images with each other. We call on these vendors to immediately commit to coordinating in this manner. We call on all other stakeholders, including medical societies, payers, and regulators, to actively encourage and facilitate this behavior. Specifically, we call on institutions to create appropriate market incentives by only contracting with image exchange vendors who are committed to begin vendor-to-vendor image exchange no later than 2024.
Affiliation(s)
- David B Larson
- Chair, Commission on Quality and Safety, ACR; Member, Board of Chancellors, ACR; and Vice Chair, Education and Clinical Operations, Department of Radiology, Stanford University School of Medicine, Stanford, California.
| | - Arun Krishnaraj
- Chair, Commission on Patient- and Family-Centered Care, ACR; Member, Board of Chancellors, ACR; and Chief, Division of Body Imaging, Department of Radiology and Medical Imaging, University of Virginia School of Medicine, Charlottesville, Virginia
| | - David S Mendelson
- Vice Chair, Informatics, Department of Radiology, The Mount Sinai Medical Center, New York, New York
| | - Curtis P Langlotz
- Member, Board of Directors, RSNA, and Associate Chair, Information Systems, Department of Radiology, Stanford University School of Medicine, Stanford, California
| | - Christoph Wald
- Chair, Commission on Informatics, ACR; Member, Board of Chancellors, ACR; and Chair, Department of Radiology, Lahey Hospital and Medical Center, Burlington, Massachusetts; Tufts University Medical School, Boston, Massachusetts
| |
|
17
|
Zucker EJ, Wintch S, Chang Y, Commerford L, Diaz RB, Redfern TH, Wang TN, Lam L, Frush DP, Larson DB. Increasing the Utilization of Moderate Sedation Services for Pediatric Imaging. Radiographics 2021; 41:2127-2135. [PMID: 34723694; DOI: 10.1148/rg.2021210061]
Abstract
Performing motion-free imaging is frequently challenging in children. To bridge the gap between examinations performed in children who are awake and those under general anesthesia, a moderate sedation program was implemented at our institution but was seldom used despite substantial eligibility. In conjunction with a 5-month quality improvement (QI) course, a multidisciplinary team was assembled and, by using an A3 approach, sought to address the most important key drivers of low utilization, namely the need for clear moderate sedation eligibility criteria, a reliable protocol routing order, consistent moderate sedation screening performed by registered nurses (RNs), and enhanced visibility of moderate sedation services to ordering providers. Initial steps focused on developing better-defined criteria and protocoling standard work for technologists and RNs, with coaching and audits. Modality-specific forecasting was then implemented to reroute profiles of patients who were awaiting scheduling or already scheduled for an examination with general anesthesia to the moderate sedation queue to identify more eligible patients. These manual efforts were coupled with higher-reliability but more protracted electronic health record changes, facilitating automated protocol routing on the basis of moderate sedation eligibility and order entry constraints. As a result, scheduled imaging examinations requiring moderate sedation increased from a mean of 1.2 examinations per week to a sustained 6.1 examinations per week (range, 4-8) over the 5-month period, exceeding the team SMART (specific, measurable, achievable, relevant, and time bound) goal to achieve an average of five examinations per week by the QI course end. By targeting the most high-impact yet modifiable process deficiencies through a multifaceted team approach and initially investing in manual efforts to gain cultural buy-in while awaiting higher-reliability interventions, the project achieved success and may serve as a more general model for workflow change when there is organizational resistance. ©RSNA, 2021.
Affiliation(s)
- Evan J Zucker
- From the Department of Radiology, Stanford University School of Medicine, 725 Welch Rd, Palo Alto, CA 94304 (E.J.Z., Y.C., L.C., R.B.D., T.H.R., D.P.F., D.B.L.); and Sedation Program (S.W.), Department of Anesthesia (T.N.W.), and Department of Performance Improvement (L.L.), Stanford Children's Health, Stanford, Calif
| | - Stephanie Wintch
| | - Young Chang
| | - Lindsey Commerford
| | - Rizza-Belen Diaz
| | - Trista H Redfern
| | - Tammy N Wang
| | - Linda Lam
| | - Donald P Frush
| | - David B Larson
- (Each of the above shares the affiliation listed for the first author.)
| |
|
18
|
Eng DK, Khandwala NB, Long J, Fefferman NR, Lala SV, Strubel NA, Milla SS, Filice RW, Sharp SE, Towbin AJ, Francavilla ML, Kaplan SL, Ecklund K, Prabhu SP, Dillon BJ, Everist BM, Anton CG, Bittman ME, Dennis R, Larson DB, Seekins JM, Silva CT, Zandieh AR, Langlotz CP, Lungren MP, Halabi SS. Artificial Intelligence Algorithm Improves Radiologist Performance in Skeletal Age Assessment: A Prospective Multicenter Randomized Controlled Trial. Radiology 2021; 301:692-699. [PMID: 34581608; DOI: 10.1148/radiol.2021204021]
Abstract
Background Previous studies suggest that use of artificial intelligence (AI) algorithms as diagnostic aids may improve the quality of skeletal age assessment, though these studies lack evidence from clinical practice. Purpose To compare the accuracy and interpretation time of skeletal age assessment on hand radiograph examinations with and without the use of an AI algorithm as a diagnostic aid. Materials and Methods In this prospective randomized controlled trial, skeletal age assessment on hand radiograph examinations was performed with (n = 792) and without (n = 739) the AI algorithm as a diagnostic aid. For examinations with the AI algorithm, the radiologist was shown the AI interpretation as part of their routine clinical work and was permitted to accept or modify it. Hand radiographs were interpreted by 93 radiologists from six centers. The primary efficacy outcome was the mean absolute difference between the skeletal age dictated into the radiologists' signed report and the average interpretation of a panel of four radiologists not using a diagnostic aid. The secondary outcome was the interpretation time. A linear mixed-effects regression model with random center- and radiologist-level effects was used to compare the two experimental groups. Results Overall mean absolute difference was lower when radiologists used the AI algorithm compared with when they did not (5.36 months vs 5.95 months; P = .04). The proportions at which the absolute difference exceeded 12 months (9.3% vs 13.0%, P = .02) and 24 months (0.5% vs 1.8%, P = .02) were lower with the AI algorithm than without it. Median radiologist interpretation time was lower with the AI algorithm than without it (102 seconds vs 142 seconds, P = .001). Conclusion Use of an artificial intelligence algorithm improved skeletal age assessment accuracy and reduced interpretation times for radiologists, although differences were observed between centers. Clinical trial registration no. NCT03530098. © RSNA, 2021. Online supplemental material is available for this article. See also the editorial by Rubin in this issue.
Affiliation(s)
- David K Eng
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Nishith B Khandwala
| | - Jin Long
| | - Nancy R Fefferman
| | - Shailee V Lala
| | - Naomi A Strubel
| | - Sarah S Milla
| | - Ross W Filice
| | - Susan E Sharp
| | - Alexander J Towbin
| | - Michael L Francavilla
| | - Summer L Kaplan
| | - Kirsten Ecklund
- (Each of the above shares the affiliation listed for the first author.)
| | - Sanjay P Prabhu
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Brian J Dillon
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Brian M Everist
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Christopher G Anton
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Mark E Bittman
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Rebecca Dennis
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - David B Larson
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Jayne M Seekins
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Cicero T Silva
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Arash R Zandieh
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Curtis P Langlotz
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Matthew P Lungren
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| | - Safwan S Halabi
- From the Department of Computer Science, Stanford University, 300 N Pasteur Dr, Stanford, CA 94305 (D.K.E., N.B.K.); Departments of Pediatrics (J.L.) and Radiology (D.B.L., J.M.S., C.P.L., M.P.L., S.S.H.), Stanford University School of Medicine, Stanford, Calif; Department of Radiology, New York University School of Medicine, New York, NY (N.R.F., S.V.L., N.A.S., M.E.B.); Department of Radiology, Emory School of Medicine and Children's Healthcare of Atlanta, Atlanta, Ga (S.S.M.); Department of Radiology, MedStar Health and Georgetown University School of Medicine, Washington, DC (R.W.F., A.R.Z.); Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio (S.E.S., A.J.T., C.G.A.); Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, Pa (M.L.F., S.L.K., R.D.); Department of Radiology, Harvard Medical School and Boston Children's Hospital, Boston, Mass (K.E., S.P.P.); Department of Radiology, Yale School of Medicine, New Haven, Conn (B.J.D., C.T.S.); and Department of Radiology, Kansas University School of Medicine, Kansas City, Kan (B.M.E.)
| |
Collapse
19
Chaudhari AS, Sandino CM, Cole EK, Larson DB, Gold GE, Vasanawala SS, Lungren MP, Hargreaves BA, Langlotz CP. Prospective Deployment of Deep Learning in MRI: A Framework for Important Considerations, Challenges, and Recommendations for Best Practices. J Magn Reson Imaging 2021; 54:357-371. PMID: 32830874. PMCID: PMC8639049. DOI: 10.1002/jmri.27331.
Abstract
Artificial intelligence algorithms based on principles of deep learning (DL) have made a large impact on the acquisition, reconstruction, and interpretation of MRI data. Despite the large number of retrospective studies using DL, there are fewer applications of DL in the clinic on a routine basis. To address this large translational gap, we review recent publications to determine three major use cases for DL in MRI: model-free image synthesis, model-based image reconstruction, and image- or pixel-level classification. For each of these three areas, we provide a framework for important considerations that consists of appropriate model training paradigms, evaluation of model robustness, downstream clinical utility, opportunities for future advances, as well as recommendations for best current practices. We draw inspiration for this framework from advances in computer vision in natural imaging as well as additional healthcare fields. We further emphasize the need for reproducibility of research studies through the sharing of datasets and software. LEVEL OF EVIDENCE: 5. TECHNICAL EFFICACY STAGE: 2.
Affiliation(s)
- Christopher M Sandino
  - Department of Radiology, Stanford University, Stanford, California, USA
  - Department of Electrical Engineering, Stanford University, Stanford, California, USA
- Elizabeth K Cole
  - Department of Radiology, Stanford University, Stanford, California, USA
  - Department of Electrical Engineering, Stanford University, Stanford, California, USA
- David B Larson
  - Department of Radiology, Stanford University, Stanford, California, USA
- Garry E Gold
  - Department of Radiology, Stanford University, Stanford, California, USA
  - Department of Orthopaedic Surgery, Stanford University, Stanford, California, USA
  - Department of Bioengineering, Stanford University, Stanford, California, USA
- Matthew P Lungren
  - Department of Radiology, Stanford University, Stanford, California, USA
- Brian A Hargreaves
  - Department of Radiology, Stanford University, Stanford, California, USA
  - Department of Electrical Engineering, Stanford University, Stanford, California, USA
  - Department of Biomedical Informatics, Stanford University, Stanford, California, USA
- Curtis P Langlotz
  - Department of Radiology, Stanford University, Stanford, California, USA
  - Department of Biomedical Informatics, Stanford University, Stanford, California, USA
20
Larson DB, Broder JC, Bhargavan-Chatfield M, Donnelly LF, Kadom N, Khorasani R, Sharpe RE, Pahade JK, Moriarity AK, Tan N, Siewert B, Kruskal JB. Transitioning From Peer Review to Peer Learning: Report of the 2020 Peer Learning Summit. J Am Coll Radiol 2020; 17:1499-1508. DOI: 10.1016/j.jacr.2020.07.016.
21
Larson DB, Harvey H, Rubin DL, Irani N, Tse JR, Langlotz CP. Regulatory Frameworks for Development and Evaluation of Artificial Intelligence-Based Diagnostic Imaging Algorithms: Summary and Recommendations. J Am Coll Radiol 2020; 18:413-424. PMID: 33096088. PMCID: PMC7574690. DOI: 10.1016/j.jacr.2020.09.060.
Abstract
Although artificial intelligence (AI)-based algorithms for diagnosis hold promise for improving care, their safety and effectiveness must be ensured to facilitate wide adoption. Several recently proposed regulatory frameworks provide a solid foundation but do not address a number of issues that may prevent algorithms from being fully trusted. In this article, we review the major regulatory frameworks for software as a medical device applications, identify major gaps, and propose additional strategies to improve the development and evaluation of diagnostic AI algorithms. We identify the following major shortcomings of the current regulatory frameworks: (1) conflation of the diagnostic task with the diagnostic algorithm, (2) superficial treatment of the diagnostic task definition, (3) no mechanism to directly compare similar algorithms, (4) insufficient characterization of safety and performance elements, (5) lack of resources to assess performance at each installed site, and (6) inherent conflicts of interest. We recommend the following additional measures: (1) separate the diagnostic task from the algorithm, (2) define performance elements beyond accuracy, (3) divide the evaluation process into discrete steps, (4) encourage assessment by a third-party evaluator, (5) incorporate these elements into the manufacturers’ development process. Specifically, we recommend four phases of development and evaluation, analogous to those that have been applied to pharmaceuticals and proposed for software applications, to help ensure world-class performance of all algorithms at all installed sites. In the coming years, we anticipate the emergence of a substantial body of research dedicated to ensuring the accuracy, reliability, and safety of the algorithms.
Affiliation(s)
- David B Larson
  - Vice Chair, Education and Clinical Operations, Department of Radiology, Stanford University School of Medicine, Stanford, California
- Hugh Harvey
  - Institute for Cognitive Neuroscience, University College London, London, UK
- Daniel L Rubin
  - Director of Biomedical Informatics at Stanford Cancer Institute, Departments of Biomedical Data Science, Radiology, and Medicine, Stanford University School of Medicine, Stanford, California
- Neville Irani
  - Department of Radiology, University of Kansas Medical Center, Kansas City, Kansas
- Justin R Tse
  - Department of Radiological Sciences, David Geffen School of Medicine, University of California, Los Angeles, California
- Curtis P Langlotz
  - Associate Chair, Information Systems, Department of Radiology, Stanford University School of Medicine, Stanford, California
22
Larson DB, Willis MH, Hwang GL. Recognizing and Avoiding the Most Common Mistakes in Quality Improvement. J Am Coll Radiol 2020; 18:511-513. PMID: 33069677. DOI: 10.1016/j.jacr.2020.09.053.
Affiliation(s)
- David B Larson
  - Vice Chair, Education and Clinical Operations, Department of Radiology, Stanford University School of Medicine, Stanford, California
- Marc H Willis
  - Associate Chair, Quality Improvement, Department of Radiology, Stanford University School of Medicine, Stanford, California
- Gloria L Hwang
  - Associate Chair, Performance Improvement, Department of Radiology, Stanford University School of Medicine, Stanford, California
23
Kuhn KJ, Larson DB. Critical Results in Radiology: Defined by Clinical Judgment or by a List? J Am Coll Radiol 2020; 18:294-297. PMID: 32783896. DOI: 10.1016/j.jacr.2020.07.009.
Affiliation(s)
- Karin J Kuhn
  - Department of Radiology, Stanford University Medical Center, Stanford, California
- David B Larson
  - Vice Chair for Education and Clinical Operations, Associate Chief Quality Officer for Improvement for Stanford Health Care, and physician co-leader of the Stanford Medicine Center for Improvement, Stanford University Medical Center, Stanford, California
24
Madhuripan N, Cheung HMC, Alicia Cheong LH, Jawahar A, Willis MH, Larson DB. Variables Influencing Radiology Volume Recovery During the Next Phase of the Coronavirus Disease 2019 (COVID-19) Pandemic. J Am Coll Radiol 2020; 17:855-864. PMID: 32505562. PMCID: PMC7262523. DOI: 10.1016/j.jacr.2020.05.026.
Abstract
The coronavirus disease 2019 (COVID-19) pandemic has reduced radiology volumes across the country as providers have decreased elective care to minimize the spread of infection and free up health care delivery system capacity. After the stay-at-home order was issued in our county, imaging volumes at our institution decreased to approximately 46% of baseline volumes, similar to the experience of other radiology practices. Given the substantial differences in severity and timing of the disease in different geographic regions, estimating resumption of radiology volumes will be one of the next major challenges for radiology practices. We hypothesize that there are six major variables that will likely predict radiology volumes: (1) severity of disease in the local region, including potential subsequent "waves" of infection; (2) lifting of government social distancing restrictions; (3) patient concern regarding risk of leaving home and entering imaging facilities; (4) management of pent-up demand for imaging delayed during the acute phase of the pandemic, including institutional capacity; (5) impact of the economic downturn on health insurance and ability to pay for imaging; and (6) radiology practice profile reflecting amount of elective imaging performed, including type of patients seen by the radiology practice such as emergency, inpatient, outpatient mix and subspecialty types. We encourage radiology practice leaders to use these and other relevant variables to plan for the coming weeks and to work collaboratively with local health system and governmental leaders to help ensure that needed patient care is restored as quickly as the environment will safely permit.
Affiliation(s)
- Nikhil Madhuripan
  - Department of Radiology, Stanford University School of Medicine, Stanford, California
- Helen M C Cheung
  - Department of Radiology, Stanford University School of Medicine, Stanford, California
- Li Hsia Alicia Cheong
  - Department of Radiology, Stanford University School of Medicine, Stanford, California
- Anugayathri Jawahar
  - Department of Radiology, Stanford University School of Medicine, Stanford, California
- Marc H Willis
  - Associate Chair of Quality Improvement, Department of Radiology, Stanford University School of Medicine, Stanford, California
- David B Larson
  - Vice Chair of Education and Clinical Operations, Department of Radiology, Stanford University School of Medicine, Stanford, California
25
Larson DB, Magnus DC, Lungren MP, Shah NH, Langlotz CP. Ethics of Using and Sharing Clinical Imaging Data for Artificial Intelligence: A Proposed Framework. Radiology 2020; 295:675-682. PMID: 32208097. DOI: 10.1148/radiol.2020192536.
Abstract
In this article, the authors propose an ethical framework for using and sharing clinical data for the development of artificial intelligence (AI) applications. The philosophical premise is as follows: when clinical data are used to provide care, the primary purpose for acquiring the data is fulfilled. At that point, clinical data should be treated as a form of public good, to be used for the benefit of future patients. In their 2013 article, Faden et al argued that all who participate in the health care system, including patients, have a moral obligation to contribute to improving that system. The authors extend that framework to questions surrounding the secondary use of clinical data for AI applications. Specifically, the authors propose that all individuals and entities with access to clinical data become data stewards, with fiduciary (or trust) responsibilities to patients to carefully safeguard patient privacy, and to the public to ensure that the data are made widely available for the development of knowledge and tools to benefit future patients. According to this framework, the authors maintain that it is unethical for providers to "sell" clinical data to other parties by granting access to clinical data, especially under exclusive arrangements, in exchange for monetary or in-kind payments that exceed costs. The authors also propose that patient consent is not required before the data are used for secondary purposes when obtaining such consent is prohibitively costly or burdensome, as long as mechanisms are in place to ensure that ethical standards are strictly followed. Rather than debate whether patients or provider organizations "own" the data, the authors propose that clinical data are not owned at all in the traditional sense, but rather that all who interact with or control the data have an obligation to ensure that the data are used for the benefit of future patients and society.
Collapse
Affiliation(s)
- David B Larson, David C Magnus, Matthew P Lungren, Nigam H Shah, Curtis P Langlotz
- From the Department of Radiology, Stanford University School of Medicine, 300 Pasteur Dr, Stanford, CA 94305-5105
|
26
|
Pan I, Thodberg HH, Halabi SS, Kalpathy-Cramer J, Larson DB. Improving Automated Pediatric Bone Age Estimation Using Ensembles of Models from the 2017 RSNA Machine Learning Challenge. Radiol Artif Intell 2019; 1:e190053. [PMID: 32090207] [PMCID: PMC6884060] [DOI: 10.1148/ryai.2019190053] [Received: 04/16/2019] [Revised: 07/12/2019] [Accepted: 08/23/2019] [Indexed: 11/11/2022]
Abstract
PURPOSE To investigate improvements in performance for automatic bone age estimation that can be gained through model ensembling. MATERIALS AND METHODS A total of 48 submissions from the 2017 RSNA Pediatric Bone Age Machine Learning Challenge were used. Participants were provided with 12 611 pediatric hand radiographs with bone ages determined by a pediatric radiologist to develop models for bone age determination. The final results were determined using a test set of 200 radiographs labeled with the weighted average of six ratings. The mean pairwise model correlation and performance of all possible model combinations for ensembles of up to 10 models using the mean absolute deviation (MAD) were evaluated. A bootstrap analysis using the 200 test radiographs was conducted to estimate the true generalization MAD. RESULTS The estimated generalization MAD of a single model was 4.55 months. The best-performing ensemble consisted of four models with an MAD of 3.79 months. The mean pairwise correlation of models within this ensemble was 0.47. In comparison, the lowest achievable MAD by combining the highest-ranking models based on individual scores was 3.93 months using eight models with a mean pairwise model correlation of 0.67. CONCLUSION Combining less-correlated, high-performing models resulted in better performance than naively combining the top-performing models. Machine learning competitions within radiology should be encouraged to spur development of heterogeneous models whose predictions can be combined to achieve optimal performance.© RSNA, 2019 Supplemental material is available for this article. See also the commentary by Siegel in this issue.
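As an illustration of the ensembling idea in this abstract — averaging per-model predictions and scoring with mean absolute deviation (MAD) — the following sketch uses synthetic predictions; all names and numbers here are illustrative, not taken from the challenge data:

```python
import numpy as np

def ensemble_mad(preds, truth):
    """MAD (in months) of an ensemble that averages per-model predictions."""
    return float(np.mean(np.abs(preds.mean(axis=0) - truth)))

def mean_pairwise_correlation(preds):
    """Mean Pearson correlation over all pairs of models' predictions."""
    corr = np.corrcoef(preds)
    return float(corr[np.triu_indices(corr.shape[0], k=1)].mean())

rng = np.random.default_rng(0)
truth = rng.uniform(12, 216, size=200)   # 200 test cases, bone age in months
# Three synthetic "models": truth plus independent error, so averaging
# their predictions cancels much of the noise.
preds = np.stack([truth + rng.normal(0, 6, size=200) for _ in range(3)])

single_mad = float(np.mean(np.abs(preds[0] - truth)))
combined_mad = ensemble_mad(preds, truth)
```

With independent errors of standard deviation σ, the ensemble mean has error roughly σ/√3, so `combined_mad` lands well below `single_mad`; strongly correlated models would gain much less, which is the abstract's point about preferring less-correlated ensemble members.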
Affiliation(s)
- Ian Pan, Hans Henrik Thodberg, Safwan S. Halabi, Jayashree Kalpathy-Cramer, David B. Larson
- From the Department of Radiology, Warren Alpert Medical School, Brown University, 593 Eddy St, Providence, RI 02903 (I.P.); Department of Diagnostic Imaging, Rhode Island Hospital, Providence, RI (I.P.); Visiana, Hørsholm, Denmark (H.H.T.); Department of Radiology, Stanford University, Palo Alto, Calif (S.S.H., D.B.L.); and Department of Radiology, Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Boston, Mass (J.K.C.)
|
27
|
Abstract
Radiology practices are increasingly implementing standardized report templates to overcome the drawbacks of individual templates. However, implementing a standardized structured reporting program is not necessarily straightforward. This article provides practical guidance for radiologists who wish to implement standardized structured reporting in their practice. Challenges that radiology groups encounter tend to fall into two categories: technical and organizational. Defining and carrying out technical work can be tedious but tends to be relatively straightforward, whereas overcoming organizational challenges often requires changes in individuals' strongly held values, beliefs, roles, and relationships. Established organizational change models can help frame the organizational strategy to implement a standardized structured reporting program. Once leadership support is secured, a standardized structured reporting committee can be convened to establish report priorities, standards, design principles, and guidelines. Report standards help to establish the common framework upon which all report templates are constructed, helping to ensure report consistency. By using these standards, committee members can create reports relevant to their subspecialties, which can then be edited for formatting and content. Once report templates have been developed, edited, and published, an abbreviated form of the same process can be used to maintain the reports, which can be accomplished with much less effort than that initially required to create the templates. After standardized structured report templates are implemented and become embedded in practice, most radiologists eventually appreciate the merits of the program. ©RSNA, 2018.
Affiliation(s)
- David B Larson
- From the Department of Radiology, Stanford University School of Medicine, 300 Pasteur Dr, Stanford, CA 94305-5105
|
28
|
Larson DB, Boland GW. Imaging Quality Control in the Era of Artificial Intelligence. J Am Coll Radiol 2019; 16:1259-1266. [DOI: 10.1016/j.jacr.2019.05.048] [Received: 05/20/2019] [Revised: 05/27/2019] [Accepted: 05/29/2019] [Indexed: 12/13/2022]
|
29
|
Davenport MS, Larson DB. Measuring Diagnostic Radiologists: What Measurements Should We Use? J Am Coll Radiol 2019; 16:333-335. [PMID: 30718210] [DOI: 10.1016/j.jacr.2018.12.011] [Received: 12/03/2018] [Accepted: 12/12/2018] [Indexed: 11/27/2022]
Affiliation(s)
- Matthew S Davenport
- Department of Radiology and the Department of Urology, Michigan Medicine, Ann Arbor, Michigan; Michigan Radiology Quality Collaborative, Ann Arbor, Michigan.
- David B Larson
- Department of Radiology, Stanford University School of Medicine, Stanford, California
|
30
|
Larson DB. Re: "Reducing Variability of Radiation Dose in CT". J Am Coll Radiol 2018; 15:1669-1670. [PMID: 30522642] [DOI: 10.1016/j.jacr.2018.07.008] [Received: 06/29/2018] [Accepted: 07/05/2018] [Indexed: 11/27/2022]
Affiliation(s)
- David B Larson
- Associate Professor of Pediatric Radiology, Stanford University School of Medicine, 300 Pasteur Drive, Stanford CA 94305-5105.
|
31
|
Bien N, Rajpurkar P, Ball RL, Irvin J, Park A, Jones E, Bereket M, Patel BN, Yeom KW, Shpanskaya K, Halabi S, Zucker E, Fanton G, Amanatullah DF, Beaulieu CF, Riley GM, Stewart RJ, Blankenberg FG, Larson DB, Jones RH, Langlotz CP, Ng AY, Lungren MP. Deep-learning-assisted diagnosis for knee magnetic resonance imaging: Development and retrospective validation of MRNet. PLoS Med 2018; 15:e1002699. [PMID: 30481176] [PMCID: PMC6258509] [DOI: 10.1371/journal.pmed.1002699] [Received: 06/02/2018] [Accepted: 10/23/2018] [Indexed: 12/22/2022]
Abstract
BACKGROUND Magnetic resonance imaging (MRI) of the knee is the preferred method for diagnosing knee injuries. However, interpretation of knee MRI is time-intensive and subject to diagnostic error and variability. An automated system for interpreting knee MRI could prioritize high-risk patients and assist clinicians in making diagnoses. Deep learning methods, in being able to automatically learn layers of features, are well suited for modeling the complex relationships between medical images and their interpretations. In this study we developed a deep learning model for detecting general abnormalities and specific diagnoses (anterior cruciate ligament [ACL] tears and meniscal tears) on knee MRI exams. We then measured the effect of providing the model's predictions to clinical experts during interpretation. METHODS AND FINDINGS Our dataset consisted of 1,370 knee MRI exams performed at Stanford University Medical Center between January 1, 2001, and December 31, 2012 (mean age 38.0 years; 569 [41.5%] female patients). The majority vote of 3 musculoskeletal radiologists established reference standard labels on an internal validation set of 120 exams. We developed MRNet, a convolutional neural network for classifying MRI series and combined predictions from 3 series per exam using logistic regression. In detecting abnormalities, ACL tears, and meniscal tears, this model achieved area under the receiver operating characteristic curve (AUC) values of 0.937 (95% CI 0.895, 0.980), 0.965 (95% CI 0.938, 0.993), and 0.847 (95% CI 0.780, 0.914), respectively, on the internal validation set. We also obtained a public dataset of 917 exams with sagittal T1-weighted series and labels for ACL injury from Clinical Hospital Centre Rijeka, Croatia. 
On the external validation set of 183 exams, the MRNet trained on Stanford sagittal T2-weighted series achieved an AUC of 0.824 (95% CI 0.757, 0.892) in the detection of ACL injuries with no additional training, while an MRNet trained on the rest of the external data achieved an AUC of 0.911 (95% CI 0.864, 0.958). We additionally measured the specificity, sensitivity, and accuracy of 9 clinical experts (7 board-certified general radiologists and 2 orthopedic surgeons) on the internal validation set both with and without model assistance. Using a 2-sided Pearson's chi-squared test with adjustment for multiple comparisons, we found no significant differences between the performance of the model and that of unassisted general radiologists in detecting abnormalities. General radiologists achieved significantly higher sensitivity in detecting ACL tears (p-value = 0.002; q-value = 0.019) and significantly higher specificity in detecting meniscal tears (p-value = 0.003; q-value = 0.019). Using a 1-tailed t test on the change in performance metrics, we found that providing model predictions significantly increased clinical experts' specificity in identifying ACL tears (p-value < 0.001; q-value = 0.006). The primary limitations of our study include lack of surgical ground truth and the small size of the panel of clinical experts. CONCLUSIONS Our deep learning model can rapidly generate accurate clinical pathology classifications of knee MRI exams from both internal and external datasets. Moreover, our results support the assertion that deep learning models can improve the performance of clinical experts during medical imaging interpretation. Further research is needed to validate the model prospectively and to determine its utility in the clinical setting.
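The fusion step described above — combining three series-level CNN probabilities into a single exam-level prediction with logistic regression — can be sketched as follows. This is a minimal stand-in, not the MRNet code: the per-series probabilities are synthetic, and the training loop is a bare-bones gradient descent rather than a library fit.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
y = rng.integers(0, 2, size=n).astype(float)   # 1 = abnormality present

# Synthetic stand-ins for three per-series CNN probabilities
# (e.g. sagittal, coronal, axial): informative but noisy.
X = np.clip(y[:, None] * 0.6 + 0.2 + rng.normal(0, 0.15, size=(n, 3)), 0.0, 1.0)

# Logistic regression, trained by gradient descent, fuses the three
# series-level probabilities into one exam-level probability.
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # sigmoid
    w -= 0.5 * (X.T @ (p - y)) / n
    b -= 0.5 * (p - y).mean()

exam_prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))
accuracy = float(((exam_prob > 0.5) == (y == 1)).mean())
```

Because the regression only learns three weights and a bias, it needs far less data than the CNNs it sits on top of, which is what makes this a practical way to combine per-series models.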
Affiliation(s)
- Nicholas Bien, Pranav Rajpurkar, Jeremy Irvin, Allison Park, Erik Jones, Michael Bereket, Andrew Y. Ng
- Department of Computer Science, Stanford University, Stanford, California, United States of America
- Robyn L. Ball
- Quantitative Sciences Unit, Department of Medicine, Stanford University, Stanford, California, United States of America
- Bhavik N. Patel, Kristen W. Yeom, Katie Shpanskaya, Safwan Halabi, Evan Zucker, Christopher F. Beaulieu, Geoffrey M. Riley, Russell J. Stewart, Francis G. Blankenberg, David B. Larson, Ricky H. Jones, Curtis P. Langlotz, Matthew P. Lungren
- Department of Radiology, Stanford University, Stanford, California, United States of America
- Gary Fanton, Derek F. Amanatullah
- Department of Orthopedic Surgery, Stanford University, Stanford, California, United States of America
|
32
|
Affiliation(s)
- Jonathan B. Kruskal, David B. Larson
- From the Department of Radiology, Beth Israel Deaconess Medical Center, Harvard Medical School, One Deaconess Rd, Boston, MA 02215 (J.B.K.); and Department of Radiology, Stanford University School of Medicine, Stanford, Calif (D.B.L.)
|
33
|
Chen MC, Ball RL, Yang L, Moradzadeh N, Chapman BE, Larson DB, Langlotz CP, Amrhein TJ, Lungren MP. Deep Learning to Classify Radiology Free-Text Reports. Radiology 2017; 286:845-852. [PMID: 29135365] [DOI: 10.1148/radiol.2017171115] [Indexed: 12/12/2022]
Abstract
Purpose To evaluate the performance of a deep learning convolutional neural network (CNN) model compared with a traditional natural language processing (NLP) model in extracting pulmonary embolism (PE) findings from thoracic computed tomography (CT) reports from two institutions. Materials and Methods Contrast material-enhanced CT examinations of the chest performed between January 1, 1998, and January 1, 2016, were selected. Annotations by two human radiologists were made for three categories: the presence, chronicity, and location of PE. Classification of performance of a CNN model with an unsupervised learning algorithm for obtaining vector representations of words was compared with the open-source application PeFinder. Sensitivity, specificity, accuracy, and F1 scores for both the CNN model and PeFinder in the internal and external validation sets were determined. Results The CNN model demonstrated an accuracy of 99% and an area under the curve value of 0.97. For internal validation report data, the CNN model had a statistically significant larger F1 score (0.938) than did PeFinder (0.867) when classifying findings as either PE positive or PE negative, but no significant difference in sensitivity, specificity, or accuracy was found. For external validation report data, no statistical difference between the performance of the CNN model and PeFinder was found. Conclusion A deep learning CNN model can classify radiology free-text reports with accuracy equivalent to or beyond that of an existing traditional NLP model. © RSNA, 2017 Online supplemental material is available for this article.
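For reference, the comparison metrics used in this abstract (sensitivity, specificity, accuracy, and F1 score) all follow directly from a binary confusion matrix; the counts below are hypothetical, chosen only to make the arithmetic visible:

```python
def binary_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, accuracy, and F1 from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)                 # recall on PE-positive reports
    specificity = tn / (tn + fp)                 # recall on PE-negative reports
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return sensitivity, specificity, accuracy, f1

# Hypothetical counts for a PE-positive vs PE-negative report classifier.
sens, spec, acc, f1 = binary_metrics(tp=90, fp=10, fn=10, tn=890)
```

Note that with a heavy class imbalance, as in most report corpora, accuracy alone can look high even when F1 is poor, which is why the study compares F1 as well.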
Affiliation(s)
- Matthew C Chen, Robyn L Ball, Lingyao Yang, Nathaniel Moradzadeh, Brian E Chapman, David B Larson, Curtis P Langlotz, Timothy J Amrhein, Matthew P Lungren
- From the Department of Radiology, Stanford University School of Medicine, Stanford University Medical Center, 725 Welch Rd, Room 1675, Stanford, Calif 94305-5913 (M.C.C., N.M., D.B.L., C.P.L., M.P.L.); Stanford Center for Biomedical Informatics Research, Stanford University, Stanford, Calif (R.L.B., L.Y.); Department of Bioinformatics, University of Utah Medical Center, Salt Lake City, Utah (B.E.C.); and Department of Radiology, Duke University Medical Center, Durham, NC (T.J.A.)
|
34
|
Larson DB, Chen MC, Lungren MP, Halabi SS, Stence NV, Langlotz CP. Performance of a Deep-Learning Neural Network Model in Assessing Skeletal Maturity on Pediatric Hand Radiographs. Radiology 2017; 287:313-322. [PMID: 29095675] [DOI: 10.1148/radiol.2017170236] [Indexed: 01/29/2023]
Abstract
Purpose To compare the performance of a deep-learning bone age assessment model based on hand radiographs with that of expert radiologists and that of existing automated models. Materials and Methods The institutional review board approved the study. A total of 14 036 clinical hand radiographs and corresponding reports were obtained from two children's hospitals to train and validate the model. For the first test set, composed of 200 examinations, the mean of bone age estimates from the clinical report and three additional human reviewers was used as the reference standard. Overall model performance was assessed by comparing the root mean square (RMS) and mean absolute difference (MAD) between the model estimates and the reference standard bone ages. Ninety-five percent limits of agreement were calculated in a pairwise fashion for all reviewers and the model. The RMS of a second test set composed of 913 examinations from the publicly available Digital Hand Atlas was compared with published reports of an existing automated model. Results The mean difference between bone age estimates of the model and of the reviewers was 0 years, with a mean RMS and MAD of 0.63 and 0.50 years, respectively. The estimates of the model, the clinical report, and the three reviewers were within the 95% limits of agreement. RMS for the Digital Hand Atlas data set was 0.73 years, compared with 0.61 years of a previously reported model. Conclusion A deep-learning convolutional neural network model can estimate skeletal maturity with accuracy similar to that of an expert radiologist and to that of existing automated models. © RSNA, 2017 An earlier incorrect version of this article appeared online. This article was corrected on January 19, 2018.
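The agreement statistics in this abstract — root mean square (RMS) difference, mean absolute difference (MAD), and 95% limits of agreement between estimates and reference bone ages — can be computed as below; the data are synthetic, for illustration only:

```python
import numpy as np

def agreement_stats(est, ref):
    """RMS difference, mean absolute difference (MAD), and Bland-Altman
    95% limits of agreement between two sets of bone age estimates."""
    diff = np.asarray(est) - np.asarray(ref)
    rms = float(np.sqrt(np.mean(diff ** 2)))
    mad = float(np.mean(np.abs(diff)))
    lo = float(diff.mean() - 1.96 * diff.std(ddof=1))
    hi = float(diff.mean() + 1.96 * diff.std(ddof=1))
    return rms, mad, (lo, hi)

rng = np.random.default_rng(2)
ref = rng.uniform(1, 18, size=200)          # reference bone ages, in years
est = ref + rng.normal(0, 0.6, size=200)    # model estimates with ~0.6-year error
rms, mad, limits = agreement_stats(est, ref)
```

By Jensen's inequality MAD never exceeds RMS, consistent with the 0.50- versus 0.63-year figures reported above.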
Affiliation(s)
- David B Larson, Matthew C Chen, Matthew P Lungren, Safwan S Halabi, Nicholas V Stence, Curtis P Langlotz
- From the Departments of Radiology (D.B.L., M.P.L., S.S.H., C.P.L.), Computer Science (M.C.C.), and Biomedical Informatics (C.P.L.), Stanford University School of Medicine, 300 Pasteur Dr, Stanford, CA 94305-5105; and Department of Radiology, Children's Hospital Colorado, Aurora, Colo (N.V.S.)
|
35
|
Chwang WB, Iv M, Smith J, Kalnins A, Mickelsen J, Bammer R, Fleischmann D, Larson DB, Wintermark M, Zeineh M. Reducing Functional MR Imaging Acquisition Times by Optimizing Workflow. Radiographics 2017; 37:316-322. [PMID: 28076003] [DOI: 10.1148/rg.2017160035] [Indexed: 11/11/2022]
Abstract
Functional magnetic resonance (MR) imaging is a complex, specialized examination that is able to noninvasively measure information critical to patient care such as hemispheric language lateralization (1). Diagnostic functional MR imaging requires extensive patient interaction as well as the coordinated efforts of the entire health care team. We observed in our practice at an academic center that the times to perform functional MR imaging examinations were excessively lengthy, making scheduling of the examination difficult. The purpose of our project was to reduce functional MR imaging acquisition times by increasing the efficiency of our workflow, using specific quality tools to drive improvement of functional MR imaging. We assembled a multidisciplinary team and retrospectively reviewed all functional MR imaging examinations performed at our institution from January 2013 to August 2015. We identified five key drivers: (a) streamlined protocols, (b) consistent patient monitoring, (c) clear visual slides and audio, (d) improved patient understanding, and (e) minimized patient motion. We then implemented four specific interventions over a period of 10 months: (a) eliminating intravenous contrast medium, (b) reducing repeated language paradigms, (c) updating technologist and physician checklists, and (d) updating visual slides and audio. Our mean functional MR imaging acquisition time was reduced from 76.3 to 53.2 minutes, while our functional MR imaging examinations remained of diagnostic quality. As a result, we reduced our routine scheduling time for functional MR imaging from 2 hours to 1 hour, improving patient comfort and satisfaction as well as saving time for additional potential MR imaging acquisitions. Our efforts to optimize functional MR imaging workflow constitute a practice quality improvement project that is beneficial for patient care and can be applied broadly to other functional MR imaging practices. ©RSNA, 2017.
Affiliation(s)
- Wilson B Chwang, Michael Iv, Jason Smith, Aleksandrs Kalnins, Jake Mickelsen, Roland Bammer, Dominik Fleischmann, David B Larson, Max Wintermark, Michael Zeineh
- From the Department of Radiology, Stanford Health Care, Lucas Center for Imaging, 1201 Welch Rd, Room P271, Stanford, CA 94305
|
36
|
Kalnins A, Mickelsen LJ, Marsh D, Zorich C, Casal S, Tai WA, Vora N, Olalia G, Wintermark M, Larson DB. Decreasing Stroke Code to CT Time in Patients Presenting with Stroke Symptoms. Radiographics 2017; 37:1559-1568. [PMID: 28820652] [DOI: 10.1148/rg.2017160190] [Indexed: 11/11/2022]
Abstract
Guided quality improvement (QI) programs present an effective means to streamline stroke code to computed tomography (CT) times in a comprehensive stroke center. Applying QI methods and a multidisciplinary team approach may decrease the stroke code to CT time in non-prenotified emergency department (ED) patients presenting with symptoms of stroke. The aim of this project was to decrease this time for non-prenotified stroke code patients from a baseline mean of 20 minutes to less than 15 minutes during an 18-week period by applying QI methods in the context of a structured QI program. By reducing this time, it was expected that the door-to-CT time guideline of 25 minutes could be met more consistently. Through the structured QI program, we gained an understanding of the process that enabled us to effectively identify key drivers of performance to guide project interventions. As a result of these interventions, the stroke code to CT time for non-prenotified stroke code patients decreased to a mean of less than 14 minutes. This article reports these methods and results so that others can similarly improve the time it takes to perform nonenhanced CT studies in non-prenotified stroke code patients in the ED. ©RSNA, 2017.
Affiliation(s)
- Aleksandrs Kalnins, L Jake Mickelsen, Daisha Marsh, Christoph Zorich, Stephanie Casal, Waimei Amy Tai, Nirali Vora, Gennette Olalia, Max Wintermark, David B Larson
- From the Departments of Radiology (A.K., L.J.M., D.M., C.Z., M.W., D.B.L.), Neurology and Neurological Sciences (S.C., N.V.), and Emergency Medicine (G.O.), Stanford University School of Medicine, Stanford Hospital and Clinics, Stanford, Calif; and Neuroscience Service Line, Department of Medicine, Christiana Care Health System, Newark, Del (W.A.T.)
|
37
|
Abstract
The modern radiology department is built around the flow of information. Ordering providers request imaging studies to be performed, technologists complete the work required to perform the imaging studies, and radiologists interpret and report on the imaging findings. As each of these steps is performed, data flow between multiple information systems, most notably the radiology information system (RIS), the picture archiving and communication system (PACS) and the voice dictation system. Even though data flow relatively seamlessly, the majority of our systems and processes are inefficient. The purpose of this article is to describe the radiology value stream and describe how radiology informaticists in one department have worked to improve the efficiency of the value stream at each step. Through these examples, we identify and describe several themes that we believe have been crucial to our success.
Affiliation(s)
- Alexander J Towbin
- Department of Radiology, Cincinnati Children's Hospital Medical Center, 3333 Burnet Ave., MLC 5031, Cincinnati, OH 45229, USA
- Laurie A Perry
- Department of Radiology, Cincinnati Children's Hospital Medical Center, 3333 Burnet Ave., MLC 5031, Cincinnati, OH 45229, USA
- David B Larson
- Department of Radiology, Stanford University School of Medicine, Stanford, CA, USA
|
38
|
Larson DB, Donnelly LF, Podberesky DJ, Merrow AC, Sharpe RE, Kruskal JB. Peer Feedback, Learning, and Improvement: Answering the Call of the Institute of Medicine Report on Diagnostic Error. Radiology 2017;283:231-241. [DOI: 10.1148/radiol.2016161254]
Affiliation(s)
- David B. Larson, Lane F. Donnelly, Daniel J. Podberesky, Arnold C. Merrow, Richard E. Sharpe, Jonathan B. Kruskal
- From the Department of Radiology, Stanford University School of Medicine, 300 Pasteur Dr, Stanford, CA 94305-5105 (D.B.L.); Texas Children’s Hospital, Houston, Tex (L.F.D.); Nemours Children’s Health System, Orlando, Fla (D.J.P.); Cincinnati Children’s Hospital Medical Center, Cincinnati, Ohio (A.C.M.); Kaiser Permanente, Denver, Colo (R.E.S.); and Beth Israel Deaconess Medical Center, Boston, Mass (J.B.K.)
|
39
|
Larson DB, Mickelsen LJ, Garcia K. Realizing Improvement through Team Empowerment (RITE): A Team-based, Project-based Multidisciplinary Improvement Program. Radiographics 2016;36:2170-2183. [DOI: 10.1148/rg.2016160136]
|
40
|
Affiliation(s)
- Stephen G. Bryant
- Center for Medication Monitoring, University of Texas Medical Branch, Galveston, Texas
- David B. Larson
- Primary Care Research Section, Biometric & Clinical Applications Branch, Division of Biometry & Applied Science, National Institute of Mental Health, Rockville, Maryland
- Seymour Fisher
- Center for Medication Monitoring, University of Texas Medical Branch, Galveston, Texas
|
41
|
Gartner J, Larson DB, Vachar-Mayberry CD. A Systematic Review of the Quantity and Quality of Empirical Research Published in Four Pastoral Counseling Journals: 1975–1984. [DOI: 10.1177/002234099004400205]
Abstract
Compares the quantity and quality of research in pastoral counseling journals to that found in similar journals in psychiatry, geriatrics, and nursing. Concludes from the data that pastoral counseling research is less likely to state hypotheses, to use control groups, to state a sampling method, to report a response rate, to evaluate at more than a simple point in time, or to discuss limitations of the findings. Infers that, given these factors, pastoral counseling has failed to develop adequately as a behavioral science. Critical responses follow the article.
Affiliation(s)
- John Gartner
- Director of Research, Department of Pastoral Counseling, Loyola College, 7135 Minstrel Way, Columbia, MD 21045
- David B. Larson
- Research Psychiatrist, Department of Psychiatry, Duke University Medical Center, c/o Room 18C-14, 5600 Fishers Lane, Rockville, MD 20857
- Carole D. Vachar-Mayberry
- Clinical Psychologist, Assistant Professor of Family Medicine, University of Virginia School of Medicine; Family Practice Behavioral Medical Coordinator, Roanoke Memorial Hospital, Bellview at Jefferson Streets, PO Box 13367, Roanoke, VA 24033
|
42
|
Abstract
Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts.
Affiliation(s)
- Olga R Brook, Jonathan B Kruskal, Ronald L Eisenberg, David B Larson
- From the Department of Radiology, Beth Israel Deaconess Medical Center, 330 Brookline Ave, Boston, MA 02215 (O.R.B., J.B.K., R.L.E.); and Department of Radiology, Stanford University, Stanford, Calif (D.B.L.)
|
43
|
Abstract
Controlled intervention studies offer considerable promise to better understand relationships and possible mechanisms between spiritual and religious factors and health. Studies examining spiritually augmented cognitive–behavioral therapies, forgiveness interventions, different meditation approaches, 12-step fellowships, and prayer have provided some evidence, albeit modest, of efficacy in improving health under specific conditions. Researchers need to describe spiritual and religious factors more clearly and precisely, as well as demonstrate that such factors independently influence treatment efficacy. Inclusion of potential moderating and mediating variables (e.g. extent of religious commitment, intrinsic religiousness, specific religious coping strategy) in intervention designs could help explain relationships and outcomes. Using a variety of research designs (e.g. randomized clinical trials, single-subject experimental designs) and assessment methods (e.g. daily self-monitoring, ambulatory physiological measures, in-depth structured interviews) would avoid current limitations of short-term studies using only questionnaires.
|
44
|
Abstract
Data were obtained on patients' attribution of symptoms using two different post-marketing surveillance methods. In the patient-initiated method, outpatients were randomly assigned to have a printed notice conspicuously attached to the outside of their medication bags; the "outsert" requested the patients to monitor themselves during the next 2 weeks and to report via a toll-free telephone number any new or unusual symptoms. In the staff-initiated method, other patients filling the same target drug prescriptions did not receive any outsert but were identified at the pharmacy for a telephone interview 10 to 14 days later. The target drugs were chosen from two classes for which the major adverse drug reactions (ADRs) are well identified: oral antibiotics and tricyclic antidepressants. During the interview, for each reported symptom the patient was asked whether it might have been caused by the target drug. Results indicated that older patients, irrespective of the surveillance method or the drug class, appear to be capable of discriminating probable ADRs at least as well as or possibly better than younger patients.
|
45
|
|
46
|
Lee CS, Wadhwa V, Kruskal JB, Larson DB. Conducting a Successful Practice Quality Improvement Project for American Board of Radiology Certification. Radiographics 2015;35:1643-51. [PMID: 26334572] [DOI: 10.1148/rg.2015150024]
Abstract
Practice quality improvement (PQI) is a required component of the American Board of Radiology (ABR) Maintenance of Certification (MOC) cycle, with the goal to "improve the quality of health care through diplomate-initiated learning and quality improvement." The essential requirements of PQI projects include relevance to one's practice, achievability in one's clinical setting, results suited for repeat measurements during an ABR MOC cycle, and reasonable expectation to result in quality improvement (QI). PQI projects can be performed by a group or an individual or as part of a participating institution. Given the interdisciplinary nature of radiology, teamwork is critical to ensure patient safety and the success of PQI projects. Additionally, successful QI requires considerable investment of time and resources, coordination, organizational support, and individual engagement. Group PQI projects offer many advantages, especially in larger practices and for processes that cross organizational boundaries, whereas individual projects may be preferred in small practices or for focused projects. In addition to the "plan, do, study, act" model advocated by the ABR, there are several other improvement models based on continuous data collection and rapid simultaneous testing of multiple interventions. When properly planned, supported, and executed, group PQI projects can improve the value and viability of a radiology practice.
Affiliation(s)
- Cindy S Lee, Vibhor Wadhwa, Jonathan B Kruskal, David B Larson
- From the Department of Radiology and Biomedical Imaging, University of California-San Francisco, 505 Parnassus Ave, Room L374, San Francisco, CA 94143 (C.S.L.); the Russell H. Morgan Department of Radiology and Radiological Sciences, Johns Hopkins University School of Medicine, Baltimore, Md (V.W.); Department of Radiology, Beth Israel Deaconess Medical Center, Boston, Mass (J.B.K.); and Department of Radiology, Stanford University School of Medicine, Stanford, Calif (D.B.L.)
|
47
|
Affiliation(s)
- David B Larson
- Department of Radiology, Stanford University School of Medicine, 300 Pasteur Drive, Stanford, CA 94305-5105, USA
|
48
|
Larson DB. Optimizing CT radiation dose based on patient size and image quality: the size-specific dose estimate method. Pediatr Radiol 2014;44 Suppl 3:501-5. [PMID: 25304711] [DOI: 10.1007/s00247-014-3077-y]
Abstract
The principle of ALARA (dose as low as reasonably achievable) calls for dose optimization rather than dose reduction, per se. Optimization of CT radiation dose is accomplished by producing images of acceptable diagnostic image quality using the lowest dose method available. Because it is image quality that constrains the dose, CT dose optimization is primarily a problem of image quality rather than radiation dose. Therefore, the primary focus in CT radiation dose optimization should be on image quality. However, no reliable direct measure of image quality has been developed for routine clinical practice. Until such measures become available, size-specific dose estimates (SSDE) can be used as a reasonable image-quality estimate. The SSDE method of radiation dose optimization for CT abdomen and pelvis consists of plotting SSDE for a sample of examinations as a function of patient size, establishing an SSDE threshold curve based on radiologists' assessment of image quality, and modifying protocols to consistently produce doses that are slightly above the threshold SSDE curve. Challenges in operationalizing CT radiation dose optimization include data gathering and monitoring, managing the complexities of the numerous protocols, scanners and operators, and understanding the relationship of the automated tube current modulation (ATCM) parameters to image quality. Because CT manufacturers currently maintain their ATCM algorithms as secret for proprietary reasons, prospective modeling of SSDE for patient populations is not possible without reverse engineering the ATCM algorithm and, hence, optimization by this method requires a trial-and-error approach.
Affiliation(s)
- David B Larson
- Department of Radiology, Stanford University School of Medicine, 300 Pasteur Drive, Stanford, CA 94305-5105, USA
|
49
|
Piña IL, Cohen PD, Larson DB, Marion LN, Sills MR, Solberg LI, Zerzan J. A framework for describing health care delivery organizations and systems. Am J Public Health 2014;105:670-9. [PMID: 24922130] [DOI: 10.2105/ajph.2014.301926]
Abstract
It has become imperative to describe, evaluate, and conduct research on the questions raised by comparative effectiveness research, and to characterize care delivery organizations of all kinds, from independent individual provider units to large integrated health systems. Recognizing this challenge, the Delivery Systems Committee, a subgroup of the Agency for Healthcare Research and Quality's Effective Health Care Stakeholders Group, which represents a wide diversity of perspectives on health care, created a draft framework with domains and elements that may be useful in characterizing various sizes and types of care delivery organizations and may contribute to key outcomes of interest. The framework may serve as the door to further studies in areas in which clear definitions and descriptions are lacking.
Affiliation(s)
- Ileana L Piña
- Ileana L. Piña is with Albert Einstein College of Medicine and Montefiore-Einstein Medical Center, Bronx, NY. Perry D. Cohen is with the Parkinson Pipeline Project, Washington, DC. David B. Larson is with the Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH. Lucy N. Marion is with the Medical College of Georgia School of Nursing, Macon. Marion R. Sills is with the University of Colorado School of Medicine, Denver. Leif I. Solberg is with HealthPartners Medical Group and Clinics, Minneapolis, MN. Judy Zerzan is with the Colorado Department of Health Care Policy and Financing, Denver
|
50
|
|