1
Helminski D, Sussman JB, Pfeiffer PN, Kokaly AN, Ranusch A, Renji AD, Damschroder LJ, Landis-Lewis Z, Kurlander JE. Development, Implementation, and Evaluation Methods for Dashboards in Health Care: Scoping Review. JMIR Med Inform 2024;12:e59828. [PMID: 39656991] [PMCID: PMC11651422] [DOI: 10.2196/59828]
Abstract
Background: Dashboards have become ubiquitous in health care settings, but to achieve their goals, they must be developed, implemented, and evaluated using methods that help ensure they meet the needs of end users and are suited to the barriers and facilitators of the local context.

Objective: This scoping review aimed to explore published literature on health care dashboards to characterize the methods used to identify factors affecting uptake, the strategies used to increase dashboard uptake, and the evaluation methods, as well as dashboard characteristics and context.

Methods: MEDLINE, Embase, Web of Science, and the Cochrane Library were searched from inception through July 2020. Studies were included if they described the development or evaluation of a health care dashboard and were published from 2018 to 2020. Clinical setting, purpose (categorized as clinical, administrative, or both), end user, design characteristics, methods used to identify factors affecting uptake, strategies to increase uptake, and evaluation methods were extracted.

Results: From 116 publications, we extracted data for 118 dashboards. Inpatient (45/118, 38.1%) and outpatient (42/118, 35.6%) settings were most common. Most dashboards had ≥2 stated purposes (84/118, 71.2%); 54/118 (45.8%) were administrative, 43/118 (36.4%) were clinical, and 20/118 (16.9%) served both purposes. Most dashboards included frontline clinical staff as end users (97/118, 82.2%). To identify factors affecting dashboard uptake, half involved end users in the design process (59/118, 50%); fewer described formative usability testing (26/118, 22%) or use of any theory or framework to guide development, implementation, or evaluation (24/118, 20.3%). The most common strategies used to increase uptake were education (60/118, 50.8%), audit and feedback (59/118, 50%), and advisory boards (54/118, 45.8%). Evaluations of dashboards (84/118, 71.2%) were mostly quantitative (60/118, 50.8%); fewer used only qualitative methods (6/118, 5.1%) or a combination of quantitative and qualitative methods (18/118, 15.2%).

Conclusions: Most dashboards forego development steps that would help ensure they suit the needs of end users and the clinical context, and qualitative evaluation, which can provide insight into ways to improve dashboard effectiveness, is uncommon. Education and audit and feedback are frequently used to increase uptake. These findings illustrate the need for promulgation of best practices in dashboard development and will be useful to dashboard planners.
Affiliation(s)
- Danielle Helminski
- Department of Internal Medicine, University of Michigan, 2800 Plymouth Road, NCRC Building 14, Ann Arbor, MI, 48109, United States, 1 734 430 5359
- Jeremy B Sussman
- Department of Internal Medicine, University of Michigan, 2800 Plymouth Road, NCRC Building 14, Ann Arbor, MI, 48109, United States, 1 734 430 5359
- Institute for Healthcare Policy and Innovation, University of Michigan, Ann Arbor, MI, United States
- Veterans Affairs Ann Arbor Center for Clinical Management Research, Ann Arbor, MI, United States
- Paul N Pfeiffer
- Institute for Healthcare Policy and Innovation, University of Michigan, Ann Arbor, MI, United States
- Veterans Affairs Ann Arbor Center for Clinical Management Research, Ann Arbor, MI, United States
- Department of Psychiatry, University of Michigan, Ann Arbor, MI, United States
- Alex N Kokaly
- Department of Medicine, University of California Los Angeles Health, Los Angeles, CA, United States
- Allison Ranusch
- Veterans Affairs Ann Arbor Center for Clinical Management Research, Ann Arbor, MI, United States
- Anjana Deep Renji
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, United States
- Laura J Damschroder
- Veterans Affairs Ann Arbor Center for Clinical Management Research, Ann Arbor, MI, United States
- Zach Landis-Lewis
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, United States
- Jacob E Kurlander
- Department of Internal Medicine, University of Michigan, 2800 Plymouth Road, NCRC Building 14, Ann Arbor, MI, 48109, United States, 1 734 430 5359
- Institute for Healthcare Policy and Innovation, University of Michigan, Ann Arbor, MI, United States
- Veterans Affairs Ann Arbor Center for Clinical Management Research, Ann Arbor, MI, United States
2
Bucalon B, Shaw T, Brown K, Kay J. State-of-the-art Dashboards on Clinical Indicator Data to Support Reflection on Practice: Scoping Review. JMIR Med Inform 2022;10:e32695. [PMID: 35156928] [PMCID: PMC8887640] [DOI: 10.2196/32695]
Abstract
Background: There is increasing interest in using routinely collected eHealth data to support reflective practice and long-term professional learning. Studies have evaluated the impact of dashboards on clinician decision-making, task completion time, user satisfaction, and adherence to clinical guidelines.

Objective: This scoping review aims to summarize the literature on dashboards for clinicians, based on patient administrative, medical, and surgical data, that support reflective practice.

Methods: A scoping review was conducted using the Arksey and O'Malley framework. Five electronic databases (MEDLINE, Embase, Scopus, ACM Digital Library, and Web of Science) were searched to identify studies that met the inclusion criteria. Study selection and characterization were performed by 2 independent reviewers (BB and CP). One reviewer extracted the data, which were analyzed descriptively to map the available evidence.

Results: A total of 18 dashboards from 8 countries were assessed. Dashboards were designed for performance improvement (10/18, 56%), quality and safety initiatives (6/18, 33%), and management and operations (4/18, 22%). Data visualizations were primarily designed for team use (12/18, 67%) rather than for individual clinicians (4/18, 22%). Evaluation methods included asking clinicians directly (11/18, 61%), observing user behavior through clinical indicators and usage log data (14/18, 78%), and usability testing (4/18, 22%). The studies reported high scores on standard usability questionnaires and favorable survey and interview feedback. Improvements in underlying clinical indicators were observed in 78% (7/9) of the studies, whereas 22% (2/9) reported no significant change in performance.

Conclusions: This scoping review maps the current literature on dashboards based on routinely collected clinical indicator data. Although common data visualization techniques and clinical indicators were used across studies, the dashboards and their evaluations were diverse, and design processes were not documented in enough detail for reproducibility. We also identified a lack of interface features to support clinicians in making sense of, and reflecting on, their personal performance data.
Affiliation(s)
- Bernard Bucalon
- Human Centred Technology Cluster, School of Computer Science, The University of Sydney, Darlington, Australia
- Practice Analytics, Digital Health Cooperative Research Centre, Sydney, Australia
- Tim Shaw
- Practice Analytics, Digital Health Cooperative Research Centre, Sydney, Australia
- Research in Implementation Science and e-Health Group, Faculty of Medicine and Health, The University of Sydney, Sydney, Australia
- Kerri Brown
- Practice Analytics, Digital Health Cooperative Research Centre, Sydney, Australia
- Professional Practice Directorate, The Royal Australasian College of Physicians, Sydney, Australia
- Judy Kay
- Human Centred Technology Cluster, School of Computer Science, The University of Sydney, Darlington, Australia
- Practice Analytics, Digital Health Cooperative Research Centre, Sydney, Australia
3
Khanna N, Klyushnenkova E, Montgomery R. Hypertension and Diabetes Quality Improvement in a Practice Transformation Network. Am J Med Qual 2020;35:486-490. [PMID: 32141300] [DOI: 10.1177/1062860620910200]
Abstract
The Garden Practice Transformation Network in Maryland brought together primary and specialty care practices working toward value-based models and efficient care delivery. Practices received coaching as well as live and multimedia education on practice transformation and quality improvement, and coaches supported practices in multipronged approaches to quality improvement. Practice champions and clinical staff were trained on appropriate documentation of blood pressure (BP) and diabetes measures and on new workflows to optimize care delivery. Quality improvement staff were trained to extract data from electronic health records, provide feedback to practice clinicians, and report to Practice Transformation Network staff. At final measurement, BP control was 66% and blood glucose control was 28%. Quality improvement activities in a practice transformation network supported the delivery of high-quality care.
Affiliation(s)
- Niharika Khanna
- University of Maryland School of Medicine, Baltimore, MD
- Discern Health, Baltimore, MD
4
Phillips RL, Cohen DJ, Kaufman A, Dickinson WP, Cykert S. Facilitating Practice Transformation in Frontline Health Care. Ann Fam Med 2019;17:S2-S5. [PMID: 31405869] [PMCID: PMC6827672] [DOI: 10.1370/afm.2439]
Affiliation(s)
- Robert L Phillips
- Center for Professionalism & Value in Health Care, Lexington, Kentucky
- Deborah J Cohen
- Department of Family Medicine, Oregon Health & Science University, Portland, Oregon
- Arthur Kaufman
- Office for Community Health, University of New Mexico, Albuquerque, New Mexico
- W Perry Dickinson
- Department of Family Medicine, University of Colorado School of Medicine, Aurora, Colorado
- Samuel Cykert
- Division of General Medicine and Clinical Epidemiology, School of Medicine, University of North Carolina, Chapel Hill, North Carolina
- Cecil G. Sheps Center for Health Services Research, University of North Carolina, Chapel Hill, North Carolina