1. Bazzi AR, Roth AM, Akiba CF, Huffaker SL, Patel SV, Smith J, Laurano R, Orme S, Zarkin GA, Morgan-Lopez A, Lambdin BH. A systems analysis and improvement approach to optimizing syringe services programs' delivery of HIV testing and referrals: Study protocol for a parallel-group randomized controlled trial (SAIA-SSP-HIV). PLoS One 2025; 20:e0319340. PMID: 39999129; PMCID: PMC11856318; DOI: 10.1371/journal.pone.0319340.
Abstract
With changing drug supplies and associated drug consumption behaviors, HIV transmission has increased among people who inject drugs in the United States. HIV testing and referrals to effective prevention and treatment services are critical for individual and population health, yet multilevel barriers limit access to HIV testing for this population, even within syringe services programs (SSPs). In this organizational-level interrupted time series randomized controlled trial, we will assess the effectiveness and cost-effectiveness of an implementation strategy, the Systems Analysis and Improvement Approach (SAIA), in optimizing HIV testing and referrals to appropriate clinical services among U.S. SSPs. From 01/12/2023 to 01/07/2025, we will recruit a diverse sample of 32 SSPs nationally that directly provide HIV testing to participants. SSPs will be randomized to the active implementation arm (i.e., SAIA-SSP-HIV) or an implementation-as-usual arm (n = 16 organizations per arm). SAIA-SSP-HIV is a flexible, data-driven implementation strategy designed to help optimize SSPs' delivery of HIV testing and referrals to appropriate clinical services for HIV prevention (e.g., pre-exposure prophylaxis) and treatment. In the active implementation arm, trained SAIA specialists will guide SSPs through three cyclical steps over 12 months: (1) process mapping to identify organization-specific needs, (2) cascade analysis and prioritization of areas for improvement, and (3) testing solutions through continuous quality improvement. In both arms, we will collect outcome data over 21 months (3-month lead-in period, 12-month implementation period, 6-month sustainment period). We will assess the initial and sustained effectiveness of SAIA and calculate its cost and cost-effectiveness. This trial presents a novel opportunity to test the effectiveness of an organization-level implementation strategy for optimizing the delivery of HIV screening and referrals in community settings that are frequented by an at-risk population. If successful, SAIA-SSP-HIV could be adapted for other infectious or chronic disease care cascades within SSPs. Trial registration: ClinicalTrials.gov: NCT06025435.
Affiliation(s)
- Angela R. Bazzi
- Herbert Wertheim School of Public Health, University of California San Diego, La Jolla, California, United States of America
- Boston University School of Public Health, Boston, Massachusetts, United States of America
- Alexis M. Roth
- Dornsife School of Public Health, Drexel University, Philadelphia, Pennsylvania, United States of America
- Christopher F. Akiba
- RTI International, Research Triangle Park, Durham, North Carolina, United States of America
- Shelby L. Huffaker
- Boston University School of Public Health, Boston, Massachusetts, United States of America
- Sheila V. Patel
- RTI International, Research Triangle Park, Durham, North Carolina, United States of America
- Jessica Smith
- RTI International, Research Triangle Park, Durham, North Carolina, United States of America
- Rose Laurano
- Dornsife School of Public Health, Drexel University, Philadelphia, Pennsylvania, United States of America
- Stephen Orme
- RTI International, Research Triangle Park, Durham, North Carolina, United States of America
- Gary A. Zarkin
- RTI International, Research Triangle Park, Durham, North Carolina, United States of America
- Antonio Morgan-Lopez
- RTI International, Research Triangle Park, Durham, North Carolina, United States of America
- Barrot H. Lambdin
- RTI International, Research Triangle Park, Durham, North Carolina, United States of America
2. Albers B, Verweij L, Blum K, Oesch S, Schultes MT, Clack L, Naef R. Firm, yet flexible: a fidelity debate paper with two case examples. Implement Sci 2024; 19:79. PMID: 39639379; PMCID: PMC11619306; DOI: 10.1186/s13012-024-01406-3.
Abstract
BACKGROUND: In healthcare research and practice, intervention and implementation fidelity represent the steadfast adherence to core components of research-supported interventions and the strategies employed for their implementation. Evaluating fidelity involves determining whether these core components were delivered as intended. Without fidelity data, the results of complex interventions cannot be meaningfully interpreted. Increasingly, the necessity for firmness and strict adherence by implementers and their organizations has been questioned, with calls for flexibility to accommodate contextual conditions. This shift makes contemporary fidelity a balancing act, requiring researchers to navigate various tensions. This debate paper explores these tensions, drawing on experiences from developing fidelity assessments in two ongoing effectiveness-implementation hybrid trials.
MAIN BODY: First, given often scarce knowledge about the core components of complex interventions and implementation strategies, decisions about fidelity requirements involve a degree of subjective reasoning. Researchers should make these decisions transparent using theory or logic models. Second, because fidelity is context-dependent and applies to both interventions and implementation strategies, researchers must rethink fidelity concepts with every study while balancing firmness and flexibility. This is particularly crucial for hybrid studies, with their differing emphasis on intervention and implementation fidelity. Third, fidelity concepts typically focus on individual behaviors. However, since organizational and system factors also influence fidelity, there is a growing need to define fidelity criteria at these levels. Finally, as contemporary fidelity concepts prioritize flexible over firm adherence, building, evaluating, and maintaining fidelity in healthcare research has become more complex. This complexity calls for intensified efforts to expand the knowledge base for pragmatic and adaptive fidelity measurement in trial and routine healthcare settings.
CONCLUSION: Contemporary conceptualizations of fidelity place greater demands on how fidelity is examined, necessitating the expansion of fidelity frameworks to include organizational and system levels, the service- and study-specific conceptualizations of intervention and implementation fidelity, and the development of pragmatic approaches for assessing fidelity in research and practice. Continuing to build knowledge on how to balance requirements for firmness and flexibility remains a crucial task within the field of implementation science.
Affiliation(s)
- Bianca Albers
- Institute for Implementation Science in Health Care, Universitätstrasse 84, Zurich, 8006, Switzerland.
- Lotte Verweij
- Institute for Implementation Science in Health Care, Universitätstrasse 84, Zurich, 8006, Switzerland
- Center of Clinical Nursing Science, University Hospital Zurich, Zurich, Switzerland
- Kathrin Blum
- Institute for Implementation Science in Health Care, Universitätstrasse 84, Zurich, 8006, Switzerland
- Saskia Oesch
- Institute for Implementation Science in Health Care, Universitätstrasse 84, Zurich, 8006, Switzerland
- Center of Clinical Nursing Science, University Hospital Zurich, Zurich, Switzerland
- Marie-Therese Schultes
- Institute for Implementation Science in Health Care, Universitätstrasse 84, Zurich, 8006, Switzerland
- Lauren Clack
- Institute for Implementation Science in Health Care, Universitätstrasse 84, Zurich, 8006, Switzerland
- Division of Infectious Diseases and Hospital Epidemiology, University of Zurich and University Hospital Zurich, Zurich, Switzerland
- Rahel Naef
- Institute for Implementation Science in Health Care, Universitätstrasse 84, Zurich, 8006, Switzerland
- Center of Clinical Nursing Science, University Hospital Zurich, Zurich, Switzerland
3. Lewis CC, Frank HE, Cruden G, Kim B, Stahmer AC, Lyon AR, Albers B, Aarons GA, Beidas RS, Mittman BS, Weiner BJ, Williams NJ, Powell BJ. A research agenda to advance the study of implementation mechanisms. Implement Sci Commun 2024; 5:98. PMID: 39285504; PMCID: PMC11403843; DOI: 10.1186/s43058-024-00633-5.
Abstract
BACKGROUND: Implementation science scholars have made significant progress identifying factors that enable or obstruct the implementation of evidence-based interventions, and testing strategies that may modify those factors. However, little research sheds light on how or why strategies work, in what contexts, and for whom. Studying implementation mechanisms (the processes responsible for change) is crucial for advancing the field of implementation science and enhancing its value in facilitating equitable policy and practice change. The Agency for Healthcare Research and Quality funded a conference series to achieve two aims: (1) develop a research agenda on implementation mechanisms, and (2) actively disseminate the research agenda to research, policy, and practice audiences. This article presents the resulting research agenda, including priorities and actions to encourage its execution.
METHOD: Building on prior concept mapping work, in a semi-structured, 3-day, in-person working meeting, 23 US-based researchers used a modified nominal group process to generate priorities and actions for addressing challenges to studying implementation mechanisms. During each of the three 120-min sessions, small groups responded to the prompt: "What actions need to be taken to move this research forward?" The groups brainstormed actions, which were then shared with the full group and discussed with the support of facilitators trained in structured group processes. Facilitators grouped critical and novel ideas into themes. Attendees voted on six themes they prioritized to discuss in a fourth, 120-min session, during which small groups operationalized prioritized actions. Subsequently, all ideas were collated, combined, and revised for clarity by a subset of the authorship team.
RESULTS: From this multistep process, 150 actions emerged across 10 priority areas, which together constitute the research agenda. Actions included discrete activities, projects, or products, and ways to shift how research is conducted to strengthen the study of implementation mechanisms.
CONCLUSIONS: This research agenda elevates actions to guide the selection, design, and evaluation of implementation mechanisms. By delineating recommended actions to address the challenges of studying implementation mechanisms, this research agenda facilitates expanding the field of implementation science, beyond studying what works to how and why strategies work, in what contexts, for whom, and with which interventions.
Affiliation(s)
- Cara C Lewis
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Avenue, Suite 1600, Seattle, WA, 98101, USA.
- Hannah E Frank
- The Warren Alpert Medical School, Brown University, Box G-BH, Providence, RI, 02912, USA
- Gracelyn Cruden
- Chestnut Health System, Lighthouse Institute - OR Group, 1255 Pearl St, Ste 101, Eugene, OR 97401, USA
- Bo Kim
- Center for Healthcare Organization and Implementation Research, VA Boston Healthcare System, 150 South Huntington Avenue, Boston, MA, 02130, USA
- Department of Psychiatry, Harvard Medical School, 25 Shattuck Street, Boston, MA, 02115, USA
- Aubyn C Stahmer
- UC Davis MIND Institute, 2825 50Th St, Sacramento, CA, 95819, USA
- Aaron R Lyon
- Department of Psychiatry and Behavioral Sciences, University of Washington, 1959 NE Pacific Street Box 356560, Seattle, WA, 98195-6560, USA
- Bianca Albers
- Institute for Implementation Science in Health Care, University of Zurich, Zürich, Switzerland
- Gregory A Aarons
- Department of Psychiatry, University of California San Diego, 9500 Gilman Drive La Jolla California, San Diego, 92093, CA, USA
- Rinad S Beidas
- Department of Medical Social Sciences, Feinberg School of Medicine, Northwestern University, 625 N Michigan Avenue, Evanston, IL, 60661, USA
- Brian S Mittman
- Division of Health Services Research & Implementation Science, Department of Research & Evaluation, Kaiser Permanente Southern California, 100 S Los Robles Ave, Pasadena, CA, 91101, USA
- Bryan J Weiner
- Department of Global Health, School of Public Health, Box 357965, Seattle, WA, 98195, USA
- Nate J Williams
- School of Social Work, Boise State University, Boise, ID, 83725, USA
- Byron J Powell
- Center for Mental Health Services Research, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Center for Dissemination & Implementation, Institute for Public Health, Washington University in St. Louis, St. Louis, MO, USA
- Division of Infectious Diseases, John T. Milliken Department of Medicine, School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
4. Ashcraft LE, Goodrich DE, Hero J, Phares A, Bachrach RL, Quinn DA, Qureshi N, Ernecoff NC, Lederer LG, Scheunemann LP, Rogal SS, Chinman MJ. A systematic review of experimentally tested implementation strategies across health and human service settings: evidence from 2010-2022. Implement Sci 2024; 19:43. PMID: 38915102; PMCID: PMC11194895; DOI: 10.1186/s13012-024-01369-5.
Abstract
BACKGROUND: Studies of implementation strategies range in rigor, design, and evaluated outcomes, presenting interpretation challenges for practitioners and researchers. This systematic review aimed to describe the body of research evidence testing implementation strategies across diverse settings and domains, using the Expert Recommendations for Implementing Change (ERIC) taxonomy to classify strategies and the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework to classify outcomes.
METHODS: We conducted a systematic review, registered with PROSPERO (CRD42021235592), of studies examining implementation strategies from 2010 to 2022. We searched databases using terms "implementation strategy", "intervention", "bundle", "support", and their variants. We also solicited study recommendations from implementation science experts and mined existing systematic reviews. We included studies that quantitatively assessed the impact of at least one implementation strategy to improve health or health care using an outcome that could be mapped to the five evaluation dimensions of RE-AIM. Only studies meeting prespecified methodologic standards were included. We described the characteristics of studies and frequency of implementation strategy use across study arms. We also examined common strategy pairings and co-occurrence with significant outcomes.
FINDINGS: Our search resulted in 16,605 studies; 129 met inclusion criteria. Studies tested an average of 6.73 strategies (0-20 range). The most assessed outcomes were Effectiveness (n=82; 64%) and Implementation (n=73; 56%). The implementation strategies most frequently occurring in the experimental arm were Distribute Educational Materials (n=99), Conduct Educational Meetings (n=96), Audit and Provide Feedback (n=76), and External Facilitation (n=59). These strategies were often used in combination. Nineteen implementation strategies were frequently tested and associated with significantly improved outcomes. However, many strategies were not tested sufficiently to draw conclusions.
CONCLUSION: This review of 129 methodologically rigorous studies built upon prior implementation science data syntheses to identify implementation strategies that had been experimentally tested and summarized their impact across diverse outcomes and clinical settings. We present recommendations for improving future similar efforts.
Affiliation(s)
- Laura Ellen Ashcraft
- Center for Health Equity Research and Promotion, Corporal Michael Crescenz VA Medical Center, Philadelphia, PA, USA.
- Department of Biostatistics, Epidemiology, and Informatics, University of Pennsylvania Perelman School of Medicine, Philadelphia, PA, USA.
- David E Goodrich
- Center for Health Equity Research and Promotion, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA
- Division of General Internal Medicine, Department of Medicine, University of Pittsburgh, Pittsburgh, PA, USA
- Clinical & Translational Science Institute, University of Pittsburgh, Pittsburgh, PA, USA
- Angela Phares
- Center for Health Equity Research and Promotion, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA
- Rachel L Bachrach
- Center for Clinical Management Research, VA Ann Arbor Healthcare System, Ann Arbor, Michigan, USA
- Department of Psychiatry, University of Michigan Medical School, Ann Arbor, MI, USA
- Deirdre A Quinn
- Center for Health Equity Research and Promotion, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA
- Division of General Internal Medicine, Department of Medicine, University of Pittsburgh, Pittsburgh, PA, USA
- Lisa G Lederer
- Clinical & Translational Science Institute, University of Pittsburgh, Pittsburgh, PA, USA
- Leslie Page Scheunemann
- Division of Geriatric Medicine, University of Pittsburgh, Department of Medicine, Pittsburgh, PA, USA
- Division of Pulmonary, Allergy, Critical Care, and Sleep Medicine, University of Pittsburgh, Department of Medicine, Pittsburgh, PA, USA
- Shari S Rogal
- Center for Health Equity Research and Promotion, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA
- Departments of Medicine and Surgery, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
- Matthew J Chinman
- Center for Health Equity Research and Promotion, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA
- Division of General Internal Medicine, Department of Medicine, University of Pittsburgh, Pittsburgh, PA, USA
- RAND Corporation, Pittsburgh, PA, USA
5. Lamb D, Milton A, Forsyth R, Lloyd-Evans B, Akther S, Fullarton K, O'Hanlon P, Johnson S, Morant N. Implementation of a crisis resolution team service improvement programme: a qualitative study of the critical ingredients for success. Int J Ment Health Syst 2024; 18:18. PMID: 38704589; PMCID: PMC11069280; DOI: 10.1186/s13033-024-00638-6.
Abstract
BACKGROUND: Crisis Resolution Teams (CRTs) offer home-based care for people in mental health crisis, as an alternative to hospital admission. The success of CRTs in England has been variable. In response to this, the CRT Optimization and RElapse prevention (CORE) study developed and trialled a 12-month Service Improvement Programme (SIP) based on a fidelity model. This paper describes a qualitative evaluation of the perspectives of CRT staff, managers, and programme facilitators. We identify barriers and facilitators to implementation, and mechanisms by which service improvements took place.
METHODS: Managers and staff from six purposively sampled CRTs were interviewed, as well as six facilitators who were employed to support the implementation of service improvement plans. Semi-structured focus groups and individual interviews were conducted and analysed using thematic analysis.
FINDINGS: A majority of participants viewed all components of the SIP as helpful in improving practice, although online resources were under-used. Perceived barriers to implementation centred principally around lack of staff time and ownership. Support from both senior staff and facilitators was essential in enabling teams to undertake the work associated with the SIP. All participating stakeholder groups reported that using the fidelity model to benchmark their CRT work to best practice and feel part of a 'bigger whole' was valuable.
CONCLUSION: CRT staff, managers and programme facilitators thought that a structured service improvement programme helped to increase fidelity to a best practice model. Flexibility (from all stakeholders) was key to enable service improvement actions to be manageable within time- and resource-poor teams.
Affiliation(s)
- Danielle Lamb
- Department of Applied Health Research, UCL, Gower Street, London, WC1E 6BT, UK.
- Alyssa Milton
- Faculty of Medicine and Health, The University of Sydney, Sydney, Australia
6. McGinty EE, Alegria M, Beidas RS, Braithwaite J, Kola L, Leslie DL, Moise N, Mueller B, Pincus HA, Shidhaye R, Simon K, Singer SJ, Stuart EA, Eisenberg MD. The Lancet Psychiatry Commission: transforming mental health implementation research. Lancet Psychiatry 2024; 11:368-396. PMID: 38552663; DOI: 10.1016/s2215-0366(24)00040-3.
Affiliation(s)
- Margarita Alegria
- Massachusetts General Hospital, Boston, MA, USA; Department of Medicine, Harvard Medical School, Boston, MA, USA; Department of Psychiatry, Harvard Medical School, Boston, MA, USA
- Rinad S Beidas
- Feinberg School of Medicine, Northwestern University, Chicago, IL, USA
- Lola Kola
- College of Medicine, University of Ibadan, Ibadan, Nigeria; Kings College London, London, UK
- Rahul Shidhaye
- Pravara Institute of Medical Sciences University, Loni, India; Care and Public Health Research Institute, Maastricht University, Maastricht, Netherlands
- Sara J Singer
- Stanford University School of Medicine, Stanford, CA, USA
7. Meyer AE, Choi SY, Tugendrajch S, Rodriguez-Quintana N, Smith SN, Koschmann E, Abelson JL, Bilek EL. Matters of Fidelity: School Provider Adherence and Competence in a Clustered Study of Adaptive Implementation Strategies. Evidence-Based Practice in Child and Adolescent Mental Health 2024; 9:411-428. PMID: 39498378; PMCID: PMC11534295; DOI: 10.1080/23794925.2024.2324770.
Abstract
Schools are a promising access point for youth with mental health concerns, but school-based mental health professionals (SPs) often need ongoing support to provide high-fidelity cognitive behavioral therapy (CBT). Adherence and competence, two critical elements of fidelity, were examined in a cluster-randomized implementation trial. We evaluated CBT adherence and then triangulated CBT adherence with end-of-study competence. We then evaluated the effects of two implementation supports, Coaching and (for slower-responding schools) Facilitation, on adherence and competence. By the end of the 43-week study period, 27.8% of SPs met adherence criteria. Adherent SPs scored higher on the competence measure, the CBT Competence Scale (t (116.2) = 3.71, p < .001). No significant difference in adherence was found among SPs at schools assigned to Coaching vs. not (Δ = 6.0%, p = .385), however SPs at schools randomized to Coaching scored significantly higher on two of the four competence subscales (Non-Behavioral and Behavioral skills). Among slower-responder schools, SPs at schools assigned to Facilitation were more likely to demonstrate adherence (Δ = 16.3%, p = .022), but there was no effect of Facilitation on competence. Approximately one quarter of SPs met adherence criteria in the trial; adequate delivery of exposure was a primary obstacle to reaching adherence. Facilitation may be especially suited to help SPs overcome barriers to delivery, whereas Coaching may be especially suited to help SPs improve CBT competence. Both are likely needed to build a mental health work force with the competence and ability to deliver EBPs in schools.
Affiliation(s)
- Allison E. Meyer
- Department of Psychiatry, Indiana University School of Medicine, Indianapolis, Indiana, USA
- Seo Youn Choi
- School of Public Health, University of Michigan, Ann Arbor, Michigan, USA
- Siena Tugendrajch
- Michigan Medicine, University of Michigan, Ann Arbor, Michigan, USA
- Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Natalie Rodriguez-Quintana
- Michigan Medicine, University of Michigan, Ann Arbor, Michigan, USA
- Executive Leadership Team, TRAILS, a project of TIDES Center, Ann Arbor, Michigan, USA
- Shawna N. Smith
- School of Public Health, University of Michigan, Ann Arbor, Michigan, USA
- Michigan Medicine, University of Michigan, Ann Arbor, Michigan, USA
- Elizabeth Koschmann
- Michigan Medicine, University of Michigan, Ann Arbor, Michigan, USA
- Executive Leadership Team, TRAILS, a project of TIDES Center, Ann Arbor, Michigan, USA
- James L. Abelson
- Michigan Medicine, University of Michigan, Ann Arbor, Michigan, USA
- Emily L. Bilek
- Michigan Medicine, University of Michigan, Ann Arbor, Michigan, USA
8. Gilmartin H, Jones C, Nunnery M, Leonard C, Connelly B, Wills A, Kelley L, Rabin B, Burke RE. An implementation strategy postmortem method developed in the VA rural Transitions Nurse Program to inform spread and scale-up. PLoS One 2024; 19:e0298552. PMID: 38457367; PMCID: PMC10923440; DOI: 10.1371/journal.pone.0298552.
Abstract
BACKGROUND: High-quality implementation evaluations report on intervention fidelity and adaptations made, but a practical process for evaluating implementation strategies is needed. A retrospective method for evaluating implementation strategies is also required as prospective methods can be resource intensive. This study aimed to establish an implementation strategy postmortem method to identify the implementation strategies used, when, and their perceived importance. We used the rural Transitions Nurse Program (TNP), a national care coordination intervention implemented at 11 hospitals over three years, as a case study.
METHODS: The postmortem used a retrospective, mixed-method, phased approach. Implementation team and front-line staff characterized the implementation strategies used, their timing, frequency, ease of use, and their importance to implementation success. The Expert Recommendations for Implementing Change (ERIC) compilation, the Quality Enhancement Research Initiative phases, and Proctor and colleagues' guidance were used to operationalize the strategies. Survey data were analyzed descriptively, and qualitative data were analyzed using matrix content analysis.
RESULTS: The postmortem method identified 45 of 73 ERIC strategies introduced, including 41 during pre-implementation, 37 during implementation, and 27 during sustainment. External facilitation, centralized technical assistance, and clinical supervision were ranked as the most important and frequently used strategies. Implementation strategies were more intensively applied in the beginning of the study and tapered over time.
CONCLUSIONS: The postmortem method identified that more strategies were used in TNP than planned and identified the most important strategies from the perspective of the implementation team and front-line staff. The findings can inform other implementation studies as well as dissemination of the TNP intervention.
Affiliation(s)
- Heather Gilmartin
- Denver/Seattle Center of Innovation for Veteran-Centered and Value Driven Care, VA Eastern Colorado Healthcare System, Aurora, Colorado, United States of America
- Department of Health Systems, Management and Policy, Colorado School of Public Health, University of Colorado, Anschutz Medical Campus, Aurora, Colorado, United States of America
- Christine Jones
- Denver/Seattle Center of Innovation for Veteran-Centered and Value Driven Care, VA Eastern Colorado Healthcare System, Aurora, Colorado, United States of America
- Division of Geriatric Medicine and Division of Hospital Medicine, Department of Medicine, University of Colorado, Anschutz Medical Campus, Aurora, Colorado, United States of America
- Mary Nunnery
- Denver/Seattle Center of Innovation for Veteran-Centered and Value Driven Care, VA Eastern Colorado Healthcare System, Aurora, Colorado, United States of America
- Chelsea Leonard
- Denver/Seattle Center of Innovation for Veteran-Centered and Value Driven Care, VA Eastern Colorado Healthcare System, Aurora, Colorado, United States of America
- Brigid Connelly
- Denver/Seattle Center of Innovation for Veteran-Centered and Value Driven Care, VA Eastern Colorado Healthcare System, Aurora, Colorado, United States of America
- Ashlea Wills
- Denver/Seattle Center of Innovation for Veteran-Centered and Value Driven Care, VA Eastern Colorado Healthcare System, Aurora, Colorado, United States of America
- Lynette Kelley
- Denver/Seattle Center of Innovation for Veteran-Centered and Value Driven Care, VA Eastern Colorado Healthcare System, Aurora, Colorado, United States of America
- Borsika Rabin
- Denver/Seattle Center of Innovation for Veteran-Centered and Value Driven Care, VA Eastern Colorado Healthcare System, Aurora, Colorado, United States of America
- Herbert Wertheim School of Public Health and Human Longevity Science, University of California San Diego, San Diego, California, United States of America
- Altman Clinical and Translational Research Institute, Dissemination and Implementation Science Center, University of California San Diego, San Diego, California, United States of America
- Robert E. Burke
- Center for Health Equity Research and Promotion, Corporal Crescenz VA Medical Center, Philadelphia, Pennsylvania, United States of America
- Hospital Medicine Section – Division of General Internal Medicine, University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania, United States of America
- Leonard Davis Institute of Health Economics, University of Pennsylvania, Philadelphia, Pennsylvania, United States of America
9. Kopelovich SL, Buck BE, Tauscher J, Lyon AR, Ben-Zeev D. Developing the Workforce of the Digital Future: mHealth Competency and Fidelity Measurement in Community-Based Care. Journal of Technology in Behavioral Science 2024; 9:35-45. PMID: 38571682; PMCID: PMC10984896; DOI: 10.1007/s41347-024-00385-y.
Abstract
Integrating mobile health (mHealth) interventions into settings that serve diverse patient populations requires that prerequisite professional competencies are delineated and that standards for clinical quality assurance can be pragmatically assessed. Heretofore, proposed mHealth competencies have been broad and have lacked a framework to support specific applications. We outline the meta-competencies identified in the literature relevant to mHealth interventions and demonstrate how these meta-competencies can be integrated with population- and intervention-related competencies to help guide a pragmatic approach to competency assessment. We present a use case based on FOCUS-an evidence-based mHealth intervention designed for individuals with serious mental illness and currently being implemented in geographically and demographically diverse community behavioral health settings. Subsequent to identifying the cross-cutting competencies relevant to the target population (outpatients experiencing psychotic symptoms), substratal intervention (Cognitive Behavioral Therapy for psychosis), and treatment modality (mHealth), we detail the development process of an mHealth fidelity monitoring system (mHealth-FMS). We adhered to a published sequential 5-step process to design a fidelity monitoring system that aligns with our integrated mHealth competency framework and that was guided by best practices prescribed by the Treatment Fidelity Workgroup of the National Institutes of Health Behavior Change Consortium. The mHealth-FMS is intended to enhance both clinical and implementation outcomes by grounding the mHealth interventionist and the system of care in which they operate in the core functions, tasks, knowledge, and competencies associated with system-integrated mHealth delivery. Future research will explore acceptability and feasibility of the mHealth-FMS.
Affiliation(s)
- Sarah L. Kopelovich
- Department of Psychiatry & Behavioral Sciences, University of Washington School of Medicine, 1959 NE Pacific Street, Box 356560, Seattle, WA 98195-6560 USA
- Benjamin E. Buck
- Department of Psychiatry & Behavioral Sciences, University of Washington School of Medicine, 1959 NE Pacific Street, Box 356560, Seattle, WA 98195-6560 USA
- Justin Tauscher
- Department of Psychiatry & Behavioral Sciences, University of Washington School of Medicine, 1959 NE Pacific Street, Box 356560, Seattle, WA 98195-6560 USA
- Aaron R. Lyon
- Department of Psychiatry & Behavioral Sciences, University of Washington School of Medicine, 1959 NE Pacific Street, Box 356560, Seattle, WA 98195-6560 USA
- Dror Ben-Zeev
- Department of Psychiatry & Behavioral Sciences, University of Washington School of Medicine, 1959 NE Pacific Street, Box 356560, Seattle, WA 98195-6560 USA
10. Hwang Y, Hodgson NA, Gitlin LN. Implementing Dementia Caregiver Programs in Real-World Settings: Fidelity Considerations. J Am Med Dir Assoc 2024; 25:34-40.e11. PMID: 38036027; PMCID: PMC10872702; DOI: 10.1016/j.jamda.2023.10.019.
Abstract
Testing interventions in real-world settings requires fidelity monitoring to ensure implementation integrity. However, strategies to enhance, monitor, and measure fidelity deployed in efficacy trials may not be feasible in pragmatic trials or sustainable in practice. This paper reviews published translational or pragmatic studies of dementia caregiver support interventions to understand how fidelity has previously been treated and to derive recommendations for future pragmatic-like trials. A search using SCOPUS, EMBASE, and Google Scholar identified 31 translational caregiver intervention studies, of which 20 (64.5%) referenced fidelity. Of these 20, 11 (55.0%) reported fidelity measurement, whereas 9 (45.0%) only recognized its importance. In the 11 studies, fidelity was assessed using investigator-developed scoring forms, audio/video recordings, evaluations from caregivers and interventionists, and comparisons of outcomes with the original efficacy trial. Additionally, 7 (63.6%) of the 11 studies reported fidelity results, representing only 22.5% of the 31 studies overall and demonstrating the inconsistency in the field concerning the reporting of fidelity outcomes. We conclude that fidelity methods used in translational studies to date are neither practical nor sustainable for ongoing monitoring of evidence-based programs in real-world settings and that only 2 aspects of fidelity, intervention and adherence, are considered. New approaches are needed that can ensure fidelity integrity in pragmatic trials and be sustained thereafter.
Affiliation(s)
- Yeji Hwang
- College of Nursing and Health Professions, Drexel University, Philadelphia, PA, USA; College of Nursing and Research Institute of Nursing Science, Seoul National University, Seoul, Korea.
- Nancy A Hodgson
- School of Nursing, University of Pennsylvania, Philadelphia, PA, USA
- Laura N Gitlin
- College of Nursing and Research Institute of Nursing Science, Seoul National University, Seoul, Korea
11. Lovero KL, Kemp CG, Wagenaar BH, Giusto A, Greene MC, Powell BJ, Proctor EK. Application of the Expert Recommendations for Implementing Change (ERIC) compilation of strategies to health intervention implementation in low- and middle-income countries: a systematic review. Implement Sci 2023; 18:56. PMID: 37904218; PMCID: PMC10617067; DOI: 10.1186/s13012-023-01310-2.
Abstract
BACKGROUND: The Expert Recommendations for Implementing Change (ERIC) project developed a compilation of implementation strategies that are intended to standardize reporting and evaluation. Little is known about the application of ERIC in low- and middle-income countries (LMICs). We systematically reviewed the literature on the use and specification of ERIC strategies for health intervention implementation in LMICs to identify gaps and inform future research.
METHODS: We searched peer-reviewed articles published through March 2023 in any language that (1) were conducted in an LMIC and (2) cited seminal ERIC articles or (3) mentioned ERIC in the title or abstract. Two co-authors independently screened all titles, abstracts, and full-text articles, then abstracted study, intervention, and implementation strategy characteristics of included studies.
RESULTS: The final sample included 60 studies describing research from all world regions, with over 30% published in the final year of our review period. Most studies took place in healthcare settings (n = 52, 86.7%), while 11 (18.2%) took place in community settings and four (6.7%) at the policy level. Across studies, 548 distinct implementation strategies were identified with a median of six strategies (range 1-46 strategies) included in each study. Most studies (n = 32, 53.3%) explicitly matched implementation strategies used for the ERIC compilation. Among those that did, 64 (87.3%) of the 73 ERIC strategies were represented. Many of the strategies not cited included those that target systems- or policy-level barriers. Nearly 85% of strategies included some component of strategy specification, though most only included specification of their action (75.2%), actor (57.3%), and action target (60.8%). A minority of studies employed randomized trials or high-quality quasi-experimental designs; only one study evaluated implementation strategy effectiveness.
CONCLUSIONS: While ERIC use in LMICs is rapidly growing, its application has not been consistent nor commonly used to test strategy effectiveness. Research in LMICs must better specify strategies and evaluate their impact on outcomes. Moreover, strategies that are tested need to be better specified, so they may be compared across contexts. Finally, strategies targeting policy-, systems-, and community-level determinants should be further explored.
TRIAL REGISTRATION: PROSPERO, CRD42021268374.
Affiliation(s)
- Kathryn L Lovero
- Department of Sociomedical Sciences, Columbia University Mailman School of Public Health, New York, NY, USA.
- Christopher G Kemp
- Department of International Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- Bradley H Wagenaar
- Department of Global Health, University of Washington, Seattle, WA, USA
- Department of Epidemiology, University of Washington, Seattle, WA, USA
- Ali Giusto
- Department of Psychiatry, Columbia University Irving Medical Center, New York State Psychiatric Institute, New York, NY, USA
- M Claire Greene
- Program On Forced Migration and Health, Heilbrunn Department of Population and Family Health, Columbia University Mailman School of Public Health, New York, NY, USA
- Byron J Powell
- Brown School, Center for Mental Health Services Research, Washington University in St. Louis, St. Louis, MO, USA
- Center for Dissemination & Implementation, Institute for Public Health, Washington University in St. Louis, St. Louis, MO, USA
- Division of Infectious Diseases, John T. Milliken Department of Medicine, School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
- Enola K Proctor
- Brown School, Center for Mental Health Services Research, Washington University in St. Louis, St. Louis, MO, USA
- Center for Dissemination & Implementation, Institute for Public Health, Washington University in St. Louis, St. Louis, MO, USA
12. Tschida JE, Drahota A. Fidelity to the ACT SMART Toolkit: an instrumental case study of implementation strategy fidelity. Implement Sci Commun 2023; 4:52. PMID: 37194052; PMCID: PMC10189967; DOI: 10.1186/s43058-023-00434-2.
Abstract
BACKGROUND: Evidence-based practices (EBPs) are shown to improve a variety of outcomes for autistic children. However, EBPs often are mis-implemented or not implemented in community-based settings where many autistic children receive usual care services. A blended implementation process and capacity-building implementation strategy, developed to facilitate the adoption and implementation of EBPs for autism spectrum disorder (ASD) in community-based settings, is the Autism Community Toolkit: Systems to Measure and Adopt Research-based Treatments (ACT SMART Toolkit). Based on an adapted Exploration, Adoption decision, Preparation, Implementation, Sustainment (EPIS) Framework, the multi-phased ACT SMART Toolkit is comprised of (a) implementation facilitation, (b) agency-based implementation teams, and (c) a web-based interface. In this instrumental case study, we developed and utilized a method to evaluate fidelity to the ACT SMART Toolkit. This study responds to the need for implementation strategy fidelity evaluation methods and may provide evidence supporting the use of the ACT SMART Toolkit.
METHODS: We used an instrumental case study approach to assess fidelity to the ACT SMART Toolkit during its pilot study with six ASD community agencies located in southern California. We assessed adherence, dose, and implementation team responsiveness for each phase and activity of the toolkit at both an aggregate and individual agency level.
RESULTS: Overall, we found that adherence, dose, and implementation team responsiveness to the ACT SMART Toolkit were high, with some variability by EPIS phase and specific activity as well as by ASD community agency. At the aggregate level, adherence and dose were rated notably lowest during the preparation phase of the toolkit, which is a more activity-intensive phase of the toolkit.
CONCLUSIONS: This evaluation of fidelity to the ACT SMART Toolkit, utilizing an instrumental case study design, demonstrated the potential for the strategy to be used with fidelity in ASD community-based agencies. Findings related to the variability of implementation strategy fidelity in the present study may also inform future adaptations to the toolkit and point to broader trends of how implementation strategy fidelity may vary by content and context.
Affiliation(s)
- Jessica E Tschida
- Department of Psychology, Michigan State University, 316 Physics Rd, East Lansing, MI, 48824, USA.
- Amy Drahota
- Department of Psychology, Michigan State University, 316 Physics Rd, East Lansing, MI, 48824, USA
- Child and Adolescent Services Research Center (CASRC), 3665 Kearny Villa Road, Suite 200N, San Diego, CA, 92123, USA
13. Akiba CF, Powell BJ, Pence BW, Muessig K, Golin CE, Go V. "We start where we are": a qualitative study of barriers and pragmatic solutions to the assessment and reporting of implementation strategy fidelity. Implement Sci Commun 2022; 3:117. PMID: 36309715; PMCID: PMC9617230; DOI: 10.1186/s43058-022-00365-4.
Abstract
BACKGROUND: Fidelity measurement of implementation strategies is underdeveloped and underreported, and the level of reporting is decreasing over time. Failing to properly measure the factors that affect the delivery of an implementation strategy may obscure the link between a strategy and its outcomes. Barriers to assessing and reporting implementation strategy fidelity among researchers are not well understood. The aims of this qualitative study were to identify barriers to fidelity measurement and pragmatic pathways towards improvement.
METHODS: We conducted in-depth interviews among researchers conducting implementation trials. We utilized a theory-informed interview approach to elicit the barriers and possible solutions to implementation strategy fidelity assessment and reporting. Reflexive-thematic analysis guided coding and memo-writing to determine key themes regarding barriers and solutions.
RESULTS: Twenty-two implementation researchers were interviewed. Participants agreed that implementation strategy fidelity was an essential element of implementation trials and that its assessment and reporting should improve. Key thematic barriers focused on (1) a current lack of validated fidelity tools with the need to assess fidelity in the short term, (2) the complex nature of some implementation strategies, (3) conceptual complications when assessing fidelity within mechanisms-focused implementation research, and (4) structural issues related to funding and publishing. Researchers also suggested pragmatic solutions to overcome each barrier. Respondents reported using specification and tracking data in the short term until validated tools become available. Participants suggested that researchers with strategy-specific content expertise lead the way in identifying core components and setting fidelity requirements for them. Addressing the third barrier, participants provided examples of what pragmatic prospective and retrospective fidelity assessments might look like along a mechanistic pathway. Finally, researchers described approaches to minimize costs of data collection, as well as more structural accountability like adopting and enforcing reporting guidelines or changing the structure of funding opportunities.
DISCUSSION: We propose short- and long-term priorities for improving the assessment and reporting of implementation strategy fidelity and the quality of implementation research.
CONCLUSIONS: A better understanding of the barriers to implementation strategy fidelity assessment may pave the way towards pragmatic solutions.
Affiliation(s)
- Byron J Powell
- Center for Mental Health Services Research, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Division of Infectious Diseases, John T. Milliken Department of Medicine, School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
- Center for Dissemination & Implementation, Institute for Public Health, Washington University in St. Louis, St. Louis, MO, USA
- Brian W Pence
- Department of Epidemiology, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Kate Muessig
- Department of Health Behavior, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Carol E Golin
- Department of Health Behavior, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Vivian Go
- Department of Health Behavior, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
14. McHugh S, Presseau J, Luecking CT, Powell BJ. Examining the complementarity between the ERIC compilation of implementation strategies and the behaviour change technique taxonomy: a qualitative analysis. Implement Sci 2022; 17:56. PMID: 35986333; PMCID: PMC9389676; DOI: 10.1186/s13012-022-01227-2.
Abstract
BACKGROUND: Efforts to generate evidence for implementation strategies are frustrated by insufficient description. The Expert Recommendations for Implementing Change (ERIC) compilation names and defines implementation strategies; however, further work is needed to describe the actions involved. One potentially complementary taxonomy is the behaviour change techniques (BCT) taxonomy. We aimed to examine the extent and nature of the overlap between these taxonomies.
METHODS: Definitions and descriptions of 73 strategies in the ERIC compilation were analysed. First, each description was deductively coded using the BCT taxonomy. Second, a typology was developed to categorise the extent of overlap between ERIC strategies and BCTs. Third, three implementation scientists independently rated their level of agreement with the categorisation and BCT coding. Finally, discrepancies were settled through online consensus discussions. Additional patterns of complementarity between ERIC strategies and BCTs were labelled thematically. Descriptive statistics summarise the frequency of coded BCTs and the number of strategies mapped to each of the categories of the typology.
RESULTS: Across the 73 strategies, 41/93 BCTs (44%) were coded, with 'restructuring the social environment' as the most frequently coded (n=18 strategies, 25%). There was direct overlap between one strategy (change physical structure and equipment) and one BCT ('restructuring physical environment'). Most strategy descriptions (n=64) had BCTs that were clearly indicated (n=18), and others where BCTs were probable but not explicitly described (n=31) or indicated multiple types of overlap (n=15). For some strategies, the presence of additional BCTs was dependent on the form of delivery. Some strategies served as examples of broad BCTs operationalised for implementation. For eight strategies, there were no BCTs indicated, or they did not appear to focus on changing behaviour. These strategies reflected preparatory stages and targeted collective cognition at the system level rather than behaviour change at the service delivery level.
CONCLUSIONS: This study demonstrates how the ERIC compilation and BCT taxonomy can be integrated to specify active ingredients, providing an opportunity to better understand mechanisms of action. Our results highlight complementarity rather than redundancy. More efforts to integrate these or other taxonomies will aid strategy developers and build links between existing silos in implementation science.
Affiliation(s)
- Sheena McHugh
- School of Public Health, University College Cork, Western Gateway Building, Western Rd, Cork, Ireland.
- Justin Presseau
- School of Epidemiology & Public Health, University of Ottawa and Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada
- Courtney T Luecking
- Department of Dietetics and Human Nutrition, College of Agriculture, Food and Environment, University of Kentucky, Lexington, USA
- Byron J Powell
- Center for Mental Health Services Research, Brown School, Washington University in St. Louis, St. Louis, Missouri, USA
- Center for Dissemination & Implementation, Institute for Public Health, Washington University in St. Louis, St. Louis, Missouri, USA
- Division of Infectious Diseases, John T. Milliken Department of Medicine, Washington University School of Medicine, St. Louis, Missouri, USA