1
Robinson CH, Damschroder LJ. A pragmatic context assessment tool (pCAT): using a Think Aloud method to develop an assessment of contextual barriers to change. Implement Sci Commun 2023; 4:3. PMID: 36631914. PMCID: PMC9835384. DOI: 10.1186/s43058-022-00380-5.
Abstract
BACKGROUND The Consolidated Framework for Implementation Research (CFIR) is a determinant framework that can be used to guide context assessment prior to implementing change. Though a few quantitative measurement instruments have been developed based on the CFIR, most assessments using the CFIR have relied on qualitative methods. One challenge to measurement is to translate conceptual constructs, which are often described using highly abstract, technical language, into lay language that is clear, concise, and meaningful. The purpose of this paper is to document methods to develop a freely available pragmatic context assessment tool (pCAT). The pCAT is based on the CFIR and designed for frontline quality improvement teams as an abbreviated assessment of local facilitators and barriers in a clinical setting. METHODS Twenty-seven interviews using the Think Aloud method (asking participants to verbalize thoughts as they respond to assessment questions) were conducted with frontline employees to improve a pilot version of the pCAT. Interviews were recorded and transcribed verbatim; the CFIR guided coding and analyses. RESULTS Participants identified several areas where language in the pCAT needed to be modified, clarified, or allowed more nuance to increase its usefulness for frontline employees. Participants found it easier to respond to questions when they had a recent, specific project in mind. Potential barriers and facilitators tend to be unique to each specific improvement. Participants also identified concepts that were missing or conflated, leading to refinements that made the pCAT more understandable, accurate, and useful. CONCLUSIONS The pCAT is designed to be practical, using everyday language familiar to frontline employees. The pCAT is short (14 items), freely available, and does not require research expertise or experience. It is designed to draw on the knowledge of individuals most familiar with their own clinical context.
The pCAT has been available online for approximately two years and has generated a relatively high level of interest, indicating the tool's potential usefulness.
Affiliation(s)
- Claire H Robinson
- VA Center for Clinical Management Research, VA Ann Arbor Healthcare System, 2215 Fuller Road (152), Ann Arbor, MI, 48105, USA
- Laura J Damschroder
- VA Center for Clinical Management Research, VA Ann Arbor Healthcare System, 2215 Fuller Road (152), Ann Arbor, MI, 48105, USA
2
McLoughlin GM, Walsh-Bailey C, Singleton CR, Turner L. Investigating implementation of school health policies through a health equity lens: A measures development study protocol. Front Public Health 2022; 10:984130. PMID: 36530706. PMCID: PMC9747935. DOI: 10.3389/fpubh.2022.984130.
Abstract
Background School-based policies that ensure provision of nutrition, physical activity, and other health-promoting resources and opportunities are essential in mitigating health disparities among underserved populations. Measuring the implementation of such policies is imperative to bridge the gap between policy and practice. Unfortunately, few practical, psychometrically strong measures of school policy implementation exist. Fewer still explicitly focus on equity and social justice as key components of implementation, which may result in underassessment of the equity implications of policy implementation. The purpose of this study is to develop equity-focused measures in collaboration with practitioners, researchers, and other key implementation partners that will facilitate evaluation of policy implementation determinants (i.e., barriers and facilitators), processes, and outcomes. Methods We will actively seek engagement from practitioners, researchers, and advocacy partners (i.e., stakeholders) who have expertise in school health policy throughout each phase of this project. We propose a multi-phase, 1-year project comprising the following steps: (1) selection of relevant constructs from guiding frameworks related to health equity and implementation science; (2) initial measure development, including expert feedback on draft items; (3) pilot cognitive testing with representatives from key target populations (i.e., school administrators, teachers, food service staff, and students and parents/guardians); and (4) measure refinement based on testing and assessment of pragmatic properties. These steps will allow us to establish initial face and content validity of a set of instruments that can undergo psychometric testing in future studies to assess their reliability and validity.
Discussion Completion of this project will result in several school policy implementation measurement tools which can be readily used by practitioners and researchers to evaluate policy implementation through a health equity lens. This will provide opportunities for better assessment and accountability of policies that aim to advance health equity among school-aged children and their families. Trial registration Open Science Framework Registration doi: 10.17605/OSF.IO/736ZU.
Affiliation(s)
- Gabriella M. McLoughlin
- College of Public Health, Temple University, Philadelphia, PA, United States
- Implementation Science Center for Cancer Control and Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, United States
- Callie Walsh-Bailey
- Implementation Science Center for Cancer Control and Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, United States
- Chelsea R. Singleton
- School of Public Health and Tropical Medicine, Tulane University, New Orleans, LA, United States
- Lindsey Turner
- College of Education, Boise State University, Boise, ID, United States
3
Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci 2022; 17:75. DOI: 10.1186/s13012-022-01245-0.
Abstract
Background
Many implementation efforts fail, even with highly developed plans for execution, because contextual factors can be powerful forces working against implementation in the real world. The Consolidated Framework for Implementation Research (CFIR) is one of the most commonly used determinant frameworks to assess these contextual factors; however, it has been over 10 years since publication and there is a need for updates. The purpose of this project was to elicit feedback from experienced CFIR users to inform updates to the framework.
Methods
User feedback was obtained from two sources: (1) a literature review with a systematic search; and (2) a survey of authors who used the CFIR in a published study. Data were combined across both sources and reviewed to identify themes; a consensus approach was used to finalize all CFIR updates. The VA Ann Arbor Healthcare System IRB declared this study exempt from the requirements of 38 CFR 16 based on category 2.
Results
The systematic search yielded 376 articles that contained the CFIR in the title and/or abstract and 334 unique authors with contact information; 59 articles included feedback on the CFIR. Forty percent (n = 134/334) of authors completed the survey. The CFIR received positive ratings on most framework sensibility items (e.g., applicability, usability), but respondents also provided recommendations for changes. Overall, updates to the CFIR include revisions to existing domains and constructs as well as the addition, removal, or relocation of constructs. These changes address important critiques of the CFIR, including better centering innovation recipients and adding determinants of equity in implementation.
Conclusion
The updates to the CFIR reflect feedback from a growing community of CFIR users. Although there are many updates, constructs can be mapped back to the original CFIR to ensure longitudinal consistency. We encourage users to continue critiquing the CFIR, facilitating the evolution of the framework as implementation science advances.
4
Madrigal L, Manders OC, Kegler M, Haardörfer R, Piper S, Blais LM, Weber MB, Escoffery C. Inner and outer setting factors that influence the implementation of the National Diabetes Prevention Program (National DPP) using the Consolidated Framework for Implementation Research (CFIR): a qualitative study. Implement Sci Commun 2022; 3:104. PMID: 36183133. PMCID: PMC9526531. DOI: 10.1186/s43058-022-00350-x.
Abstract
BACKGROUND Scaling evidence-based interventions is key to impacting population health. The National DPP lifestyle change program is one such intervention that has been scaled across the USA over the past 20 years; however, enrollment is an ongoing challenge. Furthermore, little is known about which organizations are most successful with program delivery, enrollment, and scaling. This study aims to understand more about the internal and external organizational factors that impact program implementation and reach. METHODS Between August 2020 and January 2021, data were collected through semi-structured key informant interviews with 30 National DPP delivery organization implementers. This study uses a qualitative cross-case construct rating methodology to assess which Consolidated Framework for Implementation Research (CFIR) inner and outer setting constructs contributed (both in valence and magnitude) to each organization's current level of implementation reach (measured by average participant enrollment per year). A construct-by-case matrix was created with ratings for each CFIR construct by interviewee and grouped by implementation reach level. RESULTS Across the 16 inner and outer setting constructs and subconstructs, the interviewees with greater enrollment per year provided stronger and more positive examples related to implementation and enrollment of the program, while the lower reach groups reported stronger and more negative examples across rated constructs. Four inner setting constructs/subconstructs (structural characteristics, compatibility, goals and feedback, and leadership engagement) were identified as "distinguishing" between enrollment reach levels based on the difference between groups by average rating, the examination of the number of extreme ratings within levels, and the thematic analysis of the content discussed.
Within these constructs, factors such as organization size and administrative processes; program fit with existing organization services and programs; the presence of enrollment goals; and active leadership involvement in implementation were identified as influencing program reach. CONCLUSIONS Our study identified a number of influential CFIR constructs and their impact on National DPP implementation reach. These findings can be leveraged to improve efforts in recruiting and assisting delivery organizations to increase the reach and scale of the National DPP as well as other evidence-based interventions.
Affiliation(s)
- Lillian Madrigal
- Rollins School of Public Health, Emory University, 1518 Clifton Rd, Atlanta, GA 30322, USA
- Olivia C. Manders
- Rollins School of Public Health, Emory University, 1518 Clifton Rd, Atlanta, GA 30322, USA
- Michelle Kegler
- Rollins School of Public Health, Emory University, 1518 Clifton Rd, Atlanta, GA 30322, USA
- Regine Haardörfer
- Rollins School of Public Health, Emory University, 1518 Clifton Rd, Atlanta, GA 30322, USA
- Sarah Piper
- Rollins School of Public Health, Emory University, 1518 Clifton Rd, Atlanta, GA 30322, USA
- Linelle M. Blais
- Rollins School of Public Health, Emory University, 1518 Clifton Rd, Atlanta, GA 30322, USA
- Mary Beth Weber
- Rollins School of Public Health, Emory University, 1518 Clifton Rd, Atlanta, GA 30322, USA
- Cam Escoffery
- Rollins School of Public Health, Emory University, 1518 Clifton Rd, Atlanta, GA 30322, USA
5
Pilar M, Jost E, Walsh-Bailey C, Powell BJ, Mazzucca S, Eyler A, Purtle J, Allen P, Brownson RC. Quantitative measures used in empirical evaluations of mental health policy implementation: A systematic review. Implement Res Pract 2022; 3:26334895221141116. PMID: 37091091. PMCID: PMC9924289. DOI: 10.1177/26334895221141116.
Abstract
Background Mental health is a critical component of wellness. Public policies present an opportunity for large-scale mental health impact, but policy implementation is complex and can vary significantly across contexts, making it crucial to evaluate implementation. The objectives of this study were to (1) identify quantitative measurement tools used to evaluate the implementation of public mental health policies; (2) describe implementation determinants and outcomes assessed in the measures; and (3) assess the pragmatic and psychometric quality of identified measures. Method Guided by the Consolidated Framework for Implementation Research, Policy Implementation Determinants Framework, and Implementation Outcomes Framework, we conducted a systematic review of peer-reviewed journal articles published between 1995 and 2020. Data extracted included study characteristics, measure development and testing, implementation determinants and outcomes, and measure quality using the Psychometric and Pragmatic Evidence Rating Scale. Results We identified 34 tools from 25 articles, which were designed for mental health policies or used to evaluate constructs that impact implementation. Many measures lacked information regarding measure development and testing. The most assessed implementation determinants were readiness for implementation, which encompassed training (n = 20, 57%) and other resources (n = 12, 34%), actor relationships/networks (n = 15, 43%), and organizational culture and climate (n = 11, 31%). Fidelity was the most prevalent implementation outcome (n = 9, 26%), followed by penetration (n = 8, 23%) and acceptability (n = 7, 20%). Apart from internal consistency and sample norms, psychometric properties were frequently unreported. Most measures were accessible and brief, though minimal information was provided regarding interpreting scores, handling missing data, or training needed to administer tools.
Conclusions This work contributes to the nascent field of policy-focused implementation science by providing an overview of existing measurement tools used to evaluate mental health policy implementation and recommendations for measure development and refinement. To advance this field, more valid, reliable, and pragmatic measures are needed to evaluate policy implementation and close the policy-to-practice gap. Plain Language Summary Mental health is a critical component of wellness, and public policies present an opportunity to improve mental health on a large scale. Policy implementation is complex because it involves action by multiple entities at several levels of society. Policy implementation is also challenging because it can be impacted by many factors, such as political will, stakeholder relationships, and resources available for implementation. Because of these factors, implementation can vary between locations, such as states or countries. It is crucial to evaluate policy implementation, thus we conducted a systematic review to identify and evaluate the quality of measurement tools used in mental health policy implementation studies. Our search and screening procedures resulted in 34 measurement tools. We rated their quality to determine if these tools were practical to use and would yield consistent (i.e., reliable) and accurate (i.e., valid) data. These tools most frequently assessed whether implementing organizations complied with policy mandates and whether organizations had the training and other resources required to implement a policy. Though many were relatively brief and available at little-to-no cost, these findings highlight that more reliable, valid, and practical measurement tools are needed to assess and inform mental health policy implementation. Findings from this review can guide future efforts to select or develop policy implementation measures.
Affiliation(s)
- Meagan Pilar
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Department of Infectious Diseases, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
- Eliot Jost
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Callie Walsh-Bailey
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Byron J. Powell
- Center for Mental Health Services Research, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Division of Infectious Diseases, John T. Milliken Department of Medicine, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
- Stephanie Mazzucca
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Amy Eyler
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Jonathan Purtle
- Department of Public Health Policy & Management, New York University School of Global Public Health, Global Center for Implementation Science, New York University, New York, NY, USA
- Peg Allen
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Ross C. Brownson
- Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
- Department of Surgery (Division of Public Health Sciences) and Alvin J. Siteman Cancer Center, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA