1. Philibert I, Fletcher A, Poppert Cordts KM, Rizzo M. Evaluating governance in a clinical and translational research organization. J Clin Transl Sci 2024; 8:e42. PMID: 38476243; PMCID: PMC10928697; DOI: 10.1017/cts.2024.25.
Abstract
Institutional Development Awards for Clinical and Translational Research (IDeA-CTR) networks, funded by NIH/NIGMS, aim to advance CTR infrastructure to address historically unmet state and regional health needs. Success depends on how IDeA-CTR leadership responds to actionable feedback from network partners and governance groups, gathered through annual surveys, interviews, and governance body recommendations. The Great Plains IDeA-CTR applied internal formative meta-evaluation to assess the disposition of 172 governance recommendations made from 2017 to 2021. The results provided insights to improve the classification and quality of recommendations, the credibility of evaluation processes, responsiveness to recommendations, and communications and governance in a complex CTR network comprising multiple coalitions.
Affiliation(s)
- Ingrid Philibert: Great Plains IDeA CTR, University of Nebraska Medical Center, Omaha, NE, USA
- Amanda Fletcher: Great Plains IDeA CTR, University of Nebraska Medical Center, Omaha, NE, USA
- Matthew Rizzo: Great Plains IDeA CTR, University of Nebraska Medical Center, Omaha, NE, USA
2. Deeming S, Hure A, Attia J, Nilsson M, Searles A. Prioritising and incentivising productivity within indicator-based approaches to Research Impact Assessment: a commentary. Health Res Policy Syst 2023; 21:136. PMID: 38110938; PMCID: PMC10726490; DOI: 10.1186/s12961-023-01082-7.
Abstract
Research Impact Assessment (RIA) represents one of a suite of policies intended to improve the impact generated from investment in health and medical research (HMR). Positivist indicator-based approaches to RIA are widely implemented but increasingly criticised as theoretically problematic, unfair, and burdensome. This commentary proposes that there are useful outcomes that emerge from the process of applying an indicator-based RIA framework, separate from those encapsulated in the metrics themselves. The aim of this commentary is to demonstrate how the act of conducting an indicator-based approach to RIA can serve to optimise the productive gains from the investment in HMR. Prior research found that the issues regarding RIA are less about the choice of indicators/metrics, and more about the discussions prompted and activities incentivised by the process. This insight provides an opportunity to utilise indicator-based methods to purposely optimise research impact. An indicator-based RIA framework specifically designed to optimise research impacts should: focus on researchers and the research process, rather than institution-level measures; utilise a project-level unit of analysis that provides control to researchers and supports collaboration and accountability; provide for prospective implementation of RIA and the prospective orientation of research; establish a line of sight to the ultimate anticipated beneficiaries and impacts; include process metrics/indicators to acknowledge interim steps on the pathway to final impacts; integrate 'next' users and prioritise the utilisation of research outputs as a critical measure; integrate and align the incentives for researchers/research projects arising from RIA with those existing within the prevailing research system; integrate with existing peer-review processes; and adopt a system-wide approach in which incremental improvements in the probability of translation from individual research projects yield higher impact across the whole funding portfolio. Optimisation of the impacts from HMR investment represents the primary purpose of Research Impact policy. The process of conducting an indicator-based approach to RIA, which engages the researcher during the inception and planning phase, can directly contribute to this goal through improvements in the probability that an individual project will generate interim impacts. The research project funding process represents a promising forum to integrate this approach within the existing research system.
Affiliation(s)
- Simon Deeming: Hunter Medical Research Institute, Kookaburra Circuit, New Lambton Heights, Newcastle, NSW, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, Newcastle, NSW, Australia
- Alexis Hure: Hunter Medical Research Institute, Kookaburra Circuit, New Lambton Heights, Newcastle, NSW, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, Newcastle, NSW, Australia
- John Attia: Hunter Medical Research Institute, Kookaburra Circuit, New Lambton Heights, Newcastle, NSW, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, Newcastle, NSW, Australia; Department of Medicine, John Hunter Hospital, Hunter New England Local Health District, New Lambton Heights, Newcastle, NSW, Australia
- Michael Nilsson: Hunter Medical Research Institute, Kookaburra Circuit, New Lambton Heights, Newcastle, NSW, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, Newcastle, NSW, Australia; Centre for Rehab Innovations, College of Health, Medicine and Wellbeing, University of Newcastle, Callaghan, Newcastle, NSW, Australia
- Andrew Searles: Hunter Medical Research Institute, Kookaburra Circuit, New Lambton Heights, Newcastle, NSW, Australia; School of Medicine and Public Health, University of Newcastle, Callaghan, Newcastle, NSW, Australia
3. Hindin DI, Mazzei M, Chandragiri S, DuBose L, Threeton D, Lassa J, Azagury DE. A National Study on Training Innovation in US Medical Education. Cureus 2023; 15:e46433. PMID: 37927762; PMCID: PMC10622174; DOI: 10.7759/cureus.46433.
Abstract
Introduction Traditional medical education has leaned heavily on memorization, pattern recognition, and learned algorithmic thinking. Increasingly, however, creativity and innovation are becoming recognized as valuable components of medical education. In this national survey of Association of American Medical Colleges (AAMC) member institutions, we examine the current landscape of exposure to innovation-related training within the formal academic setting. Methods Surveys were distributed to 168 of 171 AAMC member institutions (the remaining three were excluded from the study for lack of publicly available contact information). Questions assessed medical students' exposure to four defined innovation pillars: (1) medical humanities, (2) design thinking, (3) entrepreneurship, and (4) technology transfer. Chi-squared analysis was used to assess statistical significance, comparing schools ranked in the top 20 by the US News and World Report against non-top-20 respondents, and comparing schools that serve as National Institutes of Health (NIH) Clinical and Translational Science Awards (CTSA) program hubs against non-CTSA schools. Heat maps for geospatial visualization of the data were created using ArcGIS (ArcMAP 10.6) software (Redlands, CA: Environmental Systems Research Institute). Results The overall response rate was 94.2%, with 161 schools responding. Among respondents, 101 (63%) reported having medical humanities curricula at their institution. Design thinking offerings were noted at 51/161 (32%) institutions, support for entrepreneurship was observed at 51/161 (32%) institutions, and technology transfer infrastructure was confirmed at 42/161 (26%) institutions. No statistically significant difference was found between the top 20 schools and the remaining 141 schools when comparing schools with no innovation programs vs. one or more innovation programs (p=0.592), or all four innovation programs (p=0.108). CTSA hub schools, however, did show a statistically significant difference (p<0.00001) when comparing schools with no innovation programs vs. one or more programs, but not when comparing schools with all four innovation programs (p=0.639). Conclusion This study demonstrated an overwhelming prevalence of innovation programs in today's AAMC medical schools, with over 75% of surveyed institutions offering at least one innovation program. No statistically significant trend was seen in the presence of zero, one or more, or all four programs between the top 20 programs and the remaining 141. CTSA hub schools, however, were significantly more likely to have at least one program vs. none compared to non-CTSA hub schools. Future studies would be valuable to assess the long-term impact of this trend on medical student education.
Affiliation(s)
- David I Hindin: Department of Surgery, Stanford University School of Medicine, Stanford, USA
- Michael Mazzei: Department of Surgery, Lewis Katz School of Medicine, Temple University, Philadelphia, USA
- Lauren DuBose: Department of Bioengineering, Temple University, Philadelphia, USA
- Jerry Lassa: Department of Mathematics, Northwestern University, Evanston, USA
- Dan E Azagury: Department of Surgery, Stanford University School of Medicine, Stanford, USA
4. Developing and sustaining a community advisory committee to support, inform, and translate biomedical research. J Clin Transl Sci 2023; 7:e20. PMID: 36755535; PMCID: PMC9879863; DOI: 10.1017/cts.2022.473.
5. Yu F, Patel T, Carnegie A, Dave G. Evaluating the impact of a CTSA program from 2008 to 2021 through bibliometrics, social network analysis, and altmetrics. J Clin Transl Sci 2023; 7:e44. PMID: 36845314; PMCID: PMC9947612; DOI: 10.1017/cts.2022.530.
Abstract
Introduction We evaluate a CTSA program hub by applying bibliometrics, social network analysis (SNA), and altmetrics, and examine the changes in research productivity, citation impact, research collaboration, and CTSA-supported research topics since our pilot study in 2017. Methods The sampled data included North Carolina Translational and Clinical Science Institute (NC TraCS)-supported publications produced between September 2008 and March 2021. We applied measures and metrics from bibliometrics, SNA, and altmetrics to the dataset. In addition, we analyzed research topics and correlations between different metrics. Results 1154 NC TraCS-supported publications generated over 53,560 citations by April 2021. The average cites per year and the mean relative citation ratio (RCR) of these publications improved from 33 and 2.26 in 2017 to 48 and 2.58 in 2021. The number of UNC units involved in the most-published authors' collaboration network increased from 7 (2017) to 10 (2021). NC TraCS-supported co-authorship involved 61 NC organizations. PlumX metrics identified articles with the highest altmetrics scores. About 96% of NC TraCS-supported publications scored above the average SciVal Topic Prominence Percentile; the average approximate potential to translate of the included publications was 54.2%; and 177 publications addressed health disparity issues. Bibliometric measures (e.g., citation counts, RCR) and PlumX metrics (i.e., Citations, Captures, and Social Media) are positively correlated (p < .05). Conclusion Bibliometrics, SNA, and altmetrics offer distinctive but related perspectives from which to examine CTSA research performance and longitudinal growth, especially at the individual program hub level. These perspectives can help CTSAs build program foci.
Affiliation(s)
- Fei Yu: Health Sciences Library, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
- Tanha Patel: North Carolina Translational and Clinical Science Institute, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA; School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
- Andrea Carnegie: North Carolina Translational and Clinical Science Institute, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA; School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
- Gaurav Dave: North Carolina Translational and Clinical Science Institute, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA; School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
6. Irby MB, Moore KR, Mann-Jackson L, Hamlin D, Randall I, Summers P, Skelton JA, Daniel SS, Rhodes SD. Community-Engaged Research: Common Themes and Needs Identified by Investigators and Research Teams at an Emerging Academic Learning Health System. Int J Environ Res Public Health 2021; 18:3893. PMID: 33917675; PMCID: PMC8068003; DOI: 10.3390/ijerph18083893.
Abstract
Community-engaged research (CEnR) has emerged within public health and medicine as an approach to research designed to increase health equity, reduce health disparities, and improve community and population health. We sought to understand how CEnR has been conducted and to identify needs to support CEnR within an emerging academic learning health system (aLHS). We conducted individual semi-structured interviews with investigators experienced in CEnR at an emerging aLHS in the southeastern United States. Eighteen investigators (16 faculty and 2 research associates) were identified, provided consent, and completed interviews. Half of the participants were women; 61% were full professors, from varied academic backgrounds and departments. Interviews were audio-recorded, transcribed, coded, and analyzed using constant comparison, an approach to grounded theory. Twenty themes emerged, which were categorized into six domains: Conceptualization and Purpose, Value and Investment, Community-Academic Partnerships, Sustainability, Facilitators, and Challenges. The results also identified eight emerging needs that must be met to enhance CEnR within aLHSs. The results provide insights into how CEnR approaches can be harnessed within aLHSs to build and nurture community-academic partnerships, inform research and institutional priorities, and improve community and population health. These findings can be used to guide the incorporation of CEnR within aLHSs.
Affiliation(s)
- Megan B. Irby: Maya Angelou Center for Health Equity, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
- Keena R. Moore: Program in Community-Engaged Research, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
- Lilli Mann-Jackson: Department of Social Sciences and Health Policy and Program in Community-Engaged Research, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
- DeWanna Hamlin: Formerly of the Program in Community-Engaged Research, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
- Isaiah Randall: Program in Community-Engaged Research, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
- Phillip Summers: Department of Radiology, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
- Joseph A. Skelton: Department of Pediatrics and Program in Community-Engaged Research, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
- Stephanie S. Daniel: Department of Family and Community Medicine and Program in Community-Engaged Research, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
- Scott D. Rhodes: Department of Social Sciences and Health Policy and Program in Community-Engaged Research, Wake Forest School of Medicine, Winston-Salem, NC 27157, USA
7. Lessard L, Smith C, O'Connor S, Velasquez SE, Benson J, Garfield J, Onoye J, Liou L. Collaborative Assessment of Collective Reach and Impact Among INBRE-Supported Summer Undergraduate Research Programs Across the United States. Journal of STEM Education: Innovations and Research 2021; 22:46-51. PMID: 34413711; PMCID: PMC8373203.
Abstract
Undergraduate research experiences are an increasingly common component of STEM education practice. Student benefits associated with these experiences include increased interest and retention in STEM and/or research fields. Across the country, many institutional research activities in twenty-three states and Puerto Rico are supported through the National Institutes of Health's Institutional Development Award (IDeA) Networks of Biomedical Research Excellence (INBRE) Program. INBREs are statewide collaborations of research-intensive and primarily undergraduate institutions that are designed to support the biomedical research pipeline as well as faculty research. Most INBREs offer summer undergraduate research experiences to meet their program goals. While the structure and focus of these programs are tailored to state-specific needs, they typically include 10- to 15-week sessions, and many emphasize participation from underrepresented student populations. In summer 2019, eleven INBREs collaborated to explore the collective reach and impact of their summer undergraduate research programs (SURPs). A common set of survey items was identified and added to pre- and/or post-program surveys. These items focused on the reach of the programs (e.g., demographics of participating students) and the impact of the programs on students' educational goals. In total, data from 461 students across 11 states were included in the project. One-third of participating students were from underrepresented racial/ethnic groups; 28% were first-generation college students and 34% were Pell grant eligible. After the program, 72% of participants reported that they hoped to earn a doctoral-level degree. Our results suggest that INBRE-supported SURPs are successfully reaching underrepresented students and that INBRE-supported students widely anticipate pursuing graduate-level study in STEM fields.
Affiliation(s)
- Julie Benson: Alaska INBRE at the University of Alaska Fairbanks
8. Improving the quality and quantity of clinical and translational research statewide: An application of group concept mapping. J Clin Transl Sci 2021; 5:e70. PMID: 33948289; PMCID: PMC8057434; DOI: 10.1017/cts.2020.572.
Abstract
Introduction: Advance Clinical and Translational Research (Advance-CTR) serves as a central hub to support and educate clinical and translational researchers in Rhode Island. Understanding barriers to clinical research in the state is key to setting project aims and priorities. Methods: We implemented a Group Concept Mapping exercise to characterize the views of researchers and administrators regarding how to increase the quality and quantity of clinical and translational research in their settings. Participants generated ideas in response to this prompt and rated each unique idea in terms of how important and how feasible it seemed to them. Results: Participants generated 78 unique ideas, from which 9 key themes emerged (e.g., building connections between researchers). Items rated highest in perceived importance and feasibility included providing seed grants for pilot projects, connecting researchers with common interests, and networking opportunities. Implications of the results are discussed. Conclusions: The Group Concept Mapping exercise enabled our project leadership to better understand stakeholder-perceived priorities and to act on the ideas and aims most relevant to researchers in the state. This method is well suited to translational research enterprises beyond Rhode Island when a participatory evaluation stance is desired.
9. Participatory needs assessment and action planning for a clinical and translational research network. J Clin Transl Sci 2020; 5:e69. PMID: 33948288; PMCID: PMC8057451; DOI: 10.1017/cts.2020.568.
Abstract
The goal of this study was to assess the utility of participatory needs assessment processes for the continuous improvement of developing clinical and translational research (CTR) networks. Using the case example of the Great Plains IDeA-CTR Network, our approach expanded on evaluation strategies for CTR networks, centers, and institutes, which often survey stakeholders to identify infrastructure or resource needs. Our 4-stage approach (i.e., pre-assessment, data collection, implementation of needs-assessment-derived actions, and monitoring of the action plan) included a member survey (n = 357) and five subsequent small group sessions (n = 75 participants) to better characterize needs identified in the survey and to provide actionable recommendations. This participatory, mixed-methods needs assessment and strategic action planning process yielded 11 inter-related recommendations. These recommendations were presented to the CTR steering committee as inputs to develop detailed, prioritized action plans. Preliminary evaluation shows progress towards improved program capacity and effectiveness of the network in responding to member needs. The participatory, mixed-methods needs assessment and strategic planning process allowed a wide range of stakeholders to contribute to the development of actionable recommendations for network improvement, in line with the principles of team science.
10. Samuels E, Ianni PA, Chung H, Eakin B, Martina C, Murphy SL, Jones C. Guidelines for Evaluating Clinical Research Training Using Competency Assessments. MedEdPublish 2020; 8:202. PMID: 38089266; PMCID: PMC10712476; DOI: 10.15694/mep.2019.000202.2.
Abstract
Effective training programs in clinical and translational research (CTR) are critical to the development of the research workforce. The evolution of global CTR competency frameworks motivates many CTR institutions to align their training offerings with these professional standards. Guidelines for integrating competency-based frameworks and assessments into rigorous program evaluations are needed in order to promote the quality and impact of these training programs. These guidelines provide practical suggestions for how to ensure that subjective and objective assessments of CTR knowledge and skill can be effectively integrated into the evaluations used to improve these essential training programs. The approach presented here necessarily involves the systematic and deliberate incorporation of these particular types of assessments into comprehensive evaluation plans. While these guidelines are broadly applicable to the work of those charged with developing, administering, and evaluating CTR training programs, they have been specifically designed for use by program directors.
Affiliation(s)
- Elias Samuels: Michigan Institute for Clinical and Health Research
- Phillip Anton Ianni: Michigan Institute for Clinical and Health Research
- Haejung Chung: Tufts Clinical and Translational Science Institute
- Brenda Eakin: Michigan Institute for Clinical and Health Research
- Camille Martina: Clinical Translational Science Institute
- Susan Lynn Murphy: Michigan Institute for Clinical and Health Research
- Carolynn Jones: Center for Clinical and Translational Science
11. Kamenetzky A, Hinrichs-Krapels S. How do organisations implement research impact assessment (RIA) principles and good practice? A narrative review and exploratory study of four international research funding and administrative organisations. Health Res Policy Syst 2020; 18:6. PMID: 31959198; PMCID: PMC6971910; DOI: 10.1186/s12961-019-0515-1.
Abstract
Background Public research funding agencies and research organisations are increasingly accountable for the wider impacts of the research they support. While research impact assessment (RIA) frameworks and tools exist, little is known and shared about how these organisations implement RIA activities in practice. Methods We conducted a review of academic literature to search for research organisations’ published experiences of RIAs. We followed this with semi-structured interviews with a convenience sample (n = 7) of representatives of four research organisations deploying strategies to support and assess research impact. Results We found only five studies reporting empirical evidence on how research organisations put RIA principles into practice. From our interviews, we observed a disconnect between published RIA frameworks and tools and the realities of organisational practices, which tended not to be reported. We observed varying maturity and readiness with respect to organisations’ structural set-ups for conducting RIAs, particularly relating to leadership, skills for evaluation, and automating RIA data collection. Key processes for RIA included efforts to engage researcher communities to articulate and plan for impact, using a diversity of methods, frameworks, and indicators, and supporting a learning approach. We observed that the outcomes of RIAs supported a dialogue to orient research to impact, underpinned shared learning from analyses of research, and provided evidence of the value of research in different domains and to different audiences. Conclusions Putting RIA principles and frameworks into practice is still in its early stages for research organisations. We recommend that organisations (1) get set up by considering upfront the resources, time, and leadership required to embed impact strategies throughout the organisation and the wider research ‘ecosystem’, and develop methodical approaches to assessing impact; (2) work together by engaging researcher communities and wider stakeholders as a core part of impact pathway planning and subsequent assessment; and (3) recognise the benefits that RIA can bring about as a means to improve mutual understanding of the research process between different actors with an interest in research.
Affiliation(s)
- Adam Kamenetzky: National Institute for Health Research Central Commissioning Facility, Twickenham, TW1 3NL, United Kingdom; Policy Institute at King's College London, Strand Campus, London, WC2B 6LE, United Kingdom
- Saba Hinrichs-Krapels: Policy Institute at King's College London, Strand Campus, London, WC2B 6LE, United Kingdom; King's Global Health Institute, King's College London, Denmark Hill, London, SE5 9RJ, United Kingdom
12. Patel T, Rainwater J, Trochim WM, Elworth JT, Scholl L, Dave G. Opportunities for strengthening CTSA evaluation. J Clin Transl Sci 2019; 3:59-64. PMID: 31660229; PMCID: PMC6803463; DOI: 10.1017/cts.2019.387.
Abstract
The purpose of this article is to describe the progress of the Clinical and Translational Science Award (CTSA) Program in addressing the evaluation-related recommendations made by the 2013 Institute of Medicine review of the CTSA Program and the guidelines published in CTS Journal the same year (Trochim et al., Clinical and Translational Science 2013; 6(4): 303-309). We utilize data from a 2018 national survey of evaluators administered to all 64 CTSA hubs and a content analysis of the role of evaluation in the CTSA Program Funding Opportunity Announcements to document progress. We present four new opportunities for further strengthening CTSA evaluation efforts: (1) continue to build the collaborative evaluation infrastructure at local and national levels; (2) make better use of existing data; (3) strengthen and augment the common metrics initiative; and (4) pursue internal and external opportunities to evaluate the CTSA program at the national level. This article will be of significant interest to the funders of the CTSA Program and the multiple stakeholders in the larger consortium, and should promote dialog among the broad range of CTSA stakeholders about further strengthening the CTSA Program's evaluation.
Affiliation(s)
- Tanha Patel: Clinical and Translational Science Institute, Wake Forest School of Medicine, Winston-Salem, NC, USA
- Julie Rainwater: Clinical and Translational Science Center, University of California Davis, Sacramento, CA, USA
- William M. Trochim: Department of Policy Analysis and Management, Cornell University, Ithaca, NY, USA
- Julie T. Elworth: Institute of Translational Health Sciences, University of Washington, Seattle, WA, USA
- Linda Scholl: Institute for Clinical and Translational Research, University of Wisconsin-Madison, Madison, WI, USA
- Gaurav Dave: Department of Medicine, University of North Carolina, Chapel Hill, NC, USA
13. Jones Rhodes WC, Ritzwoller DP, Glasgow RE. Stakeholder perspectives on costs and resource expenditures: tools for addressing economic issues most relevant to patients, providers, and clinics. Transl Behav Med 2019; 8:675-682. PMID: 29590479; DOI: 10.1093/tbm/ibx003.
Abstract
Cost and other resources required are often primary considerations in whether a potential program or policy will be adopted or implemented, and an important element in determining value. However, few economic analyses are conducted from the perspective of the patient/family or of small-scale stakeholders such as local clinics. We outline and discuss alternative options for assessing costs and resource expenditures from the perspective of these small, proximal stakeholders. The perspective of these persons differs from larger societal or health plan perspectives, and individuals often differ in what they value and in the types of expenditures about which they are concerned. We discuss key features of the perspectives of patients, health care clinics, and local leaders, and present brief examples and sample templates for collecting cost and return-on-investment data relevant to consumers and stakeholders. These tools can be used prospectively and iteratively during the program planning, intervention delivery, summative analysis, and preparation-for-dissemination stages. There is an important need for this type of feasible, pragmatic, rapid, and iterative analysis of costs and resource expenditures directly relevant to patients/families, small local stakeholders, and their organizations. Future research on and use of these approaches is recommended.
Affiliation(s)
- Whitney C Jones Rhodes, Department of Family Medicine, University of Colorado School of Medicine, Aurora, CO, USA
- Debra P Ritzwoller, Institute for Health Research, Kaiser Permanente of Colorado, Aurora, CO, USA
- Russell E Glasgow, Department of Family Medicine, University of Colorado School of Medicine, Aurora, CO, USA
14
Marchand GC, Hilpert JC, Bragg KM, Cummings J. Network-based assessment of collaborative research in neuroscience. Alzheimers Dement (N Y) 2018; 4:433-443. PMID: 30294659; PMCID: PMC6170254; DOI: 10.1016/j.trci.2018.08.006.
Abstract
INTRODUCTION The purpose of this study was to describe collaborative research in neuroscience within the context of the Center for Neurodegeneration and Translational Neuroscience (CNTN), a Center of Biomedical Research Excellence supported by the National Institute of General Medical Sciences. Drawing upon research on the science of team science, this study investigated the way that interactions around research emerged over the course of establishing a new research center. The objectives were to document changes in research activity and describe how human research support infrastructure functioned to support the production of science. METHODS Social network analyses were used to model coauthorship relationships based on publication histories from baseline (2014) through the current grant year (2017) for key personnel (n = 12), as well as survey data on collaborative engagement among CNTN members (n = 59). RESULTS Exponential random graph models indicated that over time, CNTN members were increasingly likely to form coauthorship relationships. Community detection algorithms and brokerage analyses suggested that the CNTN was functioning as intended to support scientific development. DISCUSSION Assessment of team science efforts is critical to evaluating and developing appropriate support structures that facilitate successful team science efforts in translational neuroscience.
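The raw structure behind coauthorship analyses like those in this abstract is an undirected weighted graph built from publication author lists. A minimal sketch of that construction, using hypothetical author names and papers (not the CNTN data or the authors' code):

```python
# Build a coauthorship network from publication records: one undirected
# edge per author pair per paper, weighted by how often the pair coauthors.
from itertools import combinations
from collections import Counter

publications = [
    ["Marchand", "Hilpert", "Bragg"],   # each list = one paper's author set
    ["Marchand", "Cummings"],
    ["Hilpert", "Bragg"],
    ["Marchand", "Hilpert"],
]

# Tie strength: number of papers each author pair has coauthored.
edges = Counter()
for authors in publications:
    for a, b in combinations(sorted(authors), 2):
        edges[(a, b)] += 1

# Degree: number of distinct coauthors per person.
degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

print(edges[("Hilpert", "Marchand")])  # tie strength between two authors
print(degree["Marchand"])              # distinct coauthors of one author
```

A real analysis would feed these edge counts into network software to fit exponential random graph models or run community detection; the adjacency construction above is the common starting point.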
Affiliation(s)
- Gwen C. Marchand, University of Nevada, Las Vegas, College of Education, Center for Research, Evaluation, and Assessment, Department of Educational Psychology and Higher Education, Las Vegas, NV, USA
- Jonathan C. Hilpert, Georgia Southern University, College of Education, Department of Curriculum Foundations and Reading, Evaluation, Assessment, Research, and Learning (EARL) Program, Statesboro, GA, USA
- Kristine M. Bragg, University of Nevada, Las Vegas, College of Education, Center for Research, Evaluation, and Assessment, Department of Educational Psychology and Higher Education, Las Vegas, NV, USA
- Jeffrey Cummings, Cleveland Clinic Lou Ruvo Center for Brain Health, Las Vegas, NV, USA
15
The innovation scorecard for continuous improvement applied to translational science. J Clin Transl Sci 2018; 1:296-300. PMID: 29707250; PMCID: PMC5915811; DOI: 10.1017/cts.2017.297.
Abstract
Introduction This paper reports on the baseline stage of a qualitative evaluation of the application of the Innovative Scorecard (ISC) to the Clinical and Translational Science Award (CTSA) at the University of Texas Medical Branch (UTMB) at Galveston. The ISC is adapted from the established Balanced Scorecard system for strategic planning and performance management. In formulating the evaluation, we focused on the organizational identity literature. Methods The initial evaluation consisted of a series of semi-structured interviews with 22 participants of the ISC Boot Camp conducted in July 2015. Results The logic of grounded theory pointed to the clustering of perceptions of the ISC around respondents' occupational locations at UTMB. Administrators anticipate the expansion of planning activities to include a wider range of participants under the current CTSA award period (2015-2020) than under the first CTSA award period (2009-2014). A common viewpoint among the senior scientists was that the scientific value of their work will continue to speak for itself without requiring the language of business. Junior scientists looked forward to the ISC's emphasis on increasingly horizontal leadership that will give them more access to and more control over their work and resources. Postdocs and senior staff welcomed increased involvement in the total research process at UTMB. Conclusion The report concludes with strategies for future follow-up.
16
Deeming S, Reeves P, Ramanathan S, Attia J, Nilsson M, Searles A. Measuring research impact in medical research institutes: a qualitative study of the attitudes and opinions of Australian medical research institutes towards research impact assessment frameworks. Health Res Policy Syst 2018; 16:28. PMID: 29548331; PMCID: PMC5857092; DOI: 10.1186/s12961-018-0300-6.
Abstract
Background The question of how to measure, assess and optimise the returns from investment in health and medical research (HMR) is a highly policy-relevant issue. Research Impact Assessment Frameworks (RIAFs) provide a conceptual measurement framework to assess the impact from HMR. The aims of this study were (1) to elicit the views of Medical Research Institutes (MRIs) regarding objectives, definitions, methods, barriers, potential scope and attitudes towards RIAFs, and (2) to investigate whether an assessment framework should represent a retrospective reflection of research impact or a prospective approach integrated into the research process. The wider objective was to inform the development of a draft RIAF for Australia's MRIs. Methods Purposive sampling to derive a heterogeneous sample of Australian MRIs was used alongside semi-structured interviews with senior executives responsible for research translation or senior researchers affected by research impact initiatives. Thematic analysis of the interview transcriptions using the framework approach was then performed. Results Interviews were conducted with senior representatives from 15 MRIs. Participants understood the need for greater research translation/impact, but varied in their comprehension and implementation of RIAFs. Common concerns included the time lag to the generation of societal impacts from basic or discovery science, and whether impact reflected a narrow commercialisation agenda. Broad support emerged for the use of metrics, case study and economic methods. Support was also provided for the rationale of both standardised and customised metrics. Engendering cultural change in the approach to research translation was acknowledged as both a barrier to greater impact and a critical objective for the assessment process. Participants perceived that the existing research environment incentivised the generation of academic publications and track records, and often conflicted with the generation of wider impacts. The potential to improve the speed of translation through prospective implementation of impact assessment was supported, albeit that the mechanism required development. Conclusion The study found that the issues raised regarding research impact assessment are less about methods and metrics, and more about the research activities that the measurement of research translation and impact may or may not incentivise. Consequently, if impact assessment is to contribute to optimisation of the health gains from the public, corporate and philanthropic investment entrusted to the institutes, then further inquiry into how the assessment process may re-align research behaviour must be prioritised.
Affiliation(s)
- Simon Deeming, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, NSW, 2305, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia
- Penny Reeves, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, NSW, 2305, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia
- Shanthi Ramanathan, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, NSW, 2305, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia
- John Attia, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, NSW, 2305, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia; Department of Medicine, John Hunter Hospital, Hunter New England Local Health District, New Lambton Heights, NSW, 2305, Australia
- Michael Nilsson, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, NSW, 2305, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia; Department of Medicine, John Hunter Hospital, Hunter New England Local Health District, New Lambton Heights, NSW, 2305, Australia
- Andrew Searles, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, NSW, 2305, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, NSW, 2308, Australia
17
Deeming S, Searles A, Reeves P, Nilsson M. Measuring research impact in Australia's medical research institutes: a scoping literature review of the objectives for and an assessment of the capabilities of research impact assessment frameworks. Health Res Policy Syst 2017; 15:22. PMID: 28327199; PMCID: PMC5361798; DOI: 10.1186/s12961-017-0180-1.
Abstract
BACKGROUND Realising the economic potential of research institutions, including medical research institutes, represents a policy imperative for many Organisation for Economic Co-operation and Development nations. The assessment of research impact has consequently drawn increasing attention. Research impact assessment frameworks (RIAFs) provide a structure to assess research translation, but minimal research has examined whether alternative RIAFs realise the intended policy outcomes. This paper examines the objectives presented for RIAFs in light of economic imperatives to justify ongoing support for health and medical research investment, leverage productivity via commercialisation and outcome-efficiency gains in health systems, and ensure that translation and impact considerations are embedded into the research process. This paper sought to list the stated objectives for RIAFs, to identify existing frameworks and to evaluate whether the identified frameworks possessed the capabilities necessary to address the specified objectives. METHODS A scoping review of the literature to identify objectives specified for RIAFs, inform upon descriptive criteria for each objective and identify existing RIAFs. Criteria were derived for each objective. The capability for the existing RIAFs to realise the alternative objectives was evaluated based upon these criteria. RESULTS The collated objectives for RIAFs included accountability (top-down), transparency/accountability (bottom-up), advocacy, steering, value for money, management/learning and feedback/allocation, prospective orientation, and speed of translation. Of the 25 RIAFs identified, most satisfied objectives such as accountability and advocacy, which are largely sufficient for the first economic imperative to justify research investment. The frameworks primarily designed to optimise the speed of translation or enable the prospective orientation of research possessed qualities most likely to optimise the productive outcomes from research. However, the results show that few frameworks met the criteria for these objectives. CONCLUSION It is imperative that the objective(s) for an assessment framework are explicit and that RIAFs are designed to realise these objectives. If the objectives include the capability to pro-actively drive productive research impacts, the potential for prospective orientation and a focus upon the speed of translation merits prioritisation. Frameworks designed to optimise research translation and impact, rather than simply assess impact, offer greater promise to contribute to the economic imperatives compelling their implementation.
Affiliation(s)
- Simon Deeming, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, Newcastle, 2305, NSW, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, Newcastle, 2308, NSW, Australia
- Andrew Searles, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, Newcastle, 2305, NSW, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, Newcastle, 2308, NSW, Australia
- Penny Reeves, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, Newcastle, 2305, NSW, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, Newcastle, 2308, NSW, Australia
- Michael Nilsson, Hunter Medical Research Institute, Lot 1, Kookaburra Circuit, New Lambton Heights, Newcastle, 2305, NSW, Australia; School of Medicine and Public Health, The University of Newcastle, University Drive, Callaghan, Newcastle, 2308, NSW, Australia; Department of Medicine, John Hunter Hospital, Hunter New England Local Health District, Kookaburra Circuit, New Lambton Heights, Newcastle, 2305, NSW, Australia
18
Feasibility of common bibliometrics in evaluating translational science. J Clin Transl Sci 2017; 1:45-52. PMID: 28480055; PMCID: PMC5408837; DOI: 10.1017/cts.2016.8.
Abstract
Introduction A pilot study by 6 Clinical and Translational Science Awards (CTSAs) explored how bibliometrics can be used to assess research influence. Methods Evaluators from 6 institutions shared data on publications (4202 total) they supported, and conducted a combined analysis with state-of-the-art tools. This paper presents selected results based on the tools from 2 widely used vendors for bibliometrics: Thomson Reuters and Elsevier. Results Both vendors located a high percentage of publications within their proprietary databases (>90%) and provided similar but not equivalent bibliometrics for estimating productivity (number of publications) and influence (citation rates, percentage of papers in the top 10% of citations, observed citations relative to expected citations). A recently available bibliometric from the National Institutes of Health Office of Portfolio Analysis, examined after the initial analysis, showed tremendous potential for use in the CTSA context. Conclusion Despite challenges in making cross-CTSA comparisons, bibliometrics can enhance our understanding of the value of CTSA-supported clinical and translational research.
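The productivity and influence measures named in this abstract (publication counts, citation rates, share of papers in the top 10% of citations) are simple descriptive statistics once per-paper citation counts and a field baseline are in hand. A sketch with assumed, hypothetical numbers (not the study's data; the top-10% cutoff would come from a vendor's field baseline):

```python
# Descriptive bibliometrics over a set of supported publications:
# productivity (count), influence (mean citations, share in field top 10%).
citations = [0, 2, 3, 5, 8, 12, 40, 55, 1, 7]  # citations per supported paper
field_top10_threshold = 30                      # assumed field-level cutoff

n_pubs = len(citations)                                          # productivity
mean_citation_rate = sum(citations) / n_pubs                     # influence
share_top10 = sum(c >= field_top10_threshold for c in citations) / n_pubs

print(n_pubs, round(mean_citation_rate, 1), share_top10)
```

Cross-institution comparisons in the pilot hinged on the baseline: observed citations are interpreted relative to expected citations for the field and year, which is what the vendor databases supply.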
19
Comeau DL, Escoffery C, Freedman A, Ziegler TR, Blumberg HM. Improving clinical and translational research training: a qualitative evaluation of the Atlanta Clinical and Translational Science Institute KL2-mentored research scholars program. J Investig Med 2016; 65:23-31. PMID: 27591319; DOI: 10.1136/jim-2016-000143.
Abstract
A major impediment to improving the health of communities is the lack of qualified clinical and translational research (CTR) investigators. To address this workforce shortage, the National Institutes of Health (NIH) developed mechanisms to enhance the career development of CTR physician, PhD, and other doctoral junior faculty scientists, including the CTR-focused K12 program and, subsequently, the KL2-mentored CTR career development program supported through the Clinical and Translational Science Awards (CTSAs). Our evaluation explores the impact of the K12/KL2 program embedded within the Atlanta Clinical and Translational Science Institute (ACTSI), a consortium linking Emory University, Morehouse School of Medicine, and the Georgia Institute of Technology. We conducted qualitative interviews with program participants to evaluate the impact of the program on career development and collected data on traditional metrics (number of grants, publications). A total of 46 combined K12/KL2 scholars were supported between 2002 and 2016; 30 (65%) of the 46 scholars are women, and 24 (52%) are minorities, including 10 (22%) scholars who are members of an underrepresented minority group. Scholars reported increased research skills, strong mentorship experiences, and a positive impact on their career trajectory. Among the 43 scholars who have completed the program, 39 (91%) remain engaged in CTR and have received over $89,000,000 as principal investigators on federally funded awards. The K12/KL2 funding provided the training and protected time for the successful career development of CTR scientists. These data highlight the need for continued support for CTR training programs for junior faculty.
Affiliation(s)
- Dawn L Comeau, Department of Behavioral Sciences and Health Education, Rollins School of Public Health, Emory University, Atlanta, Georgia, USA
- Cam Escoffery, Department of Behavioral Sciences and Health Education, Rollins School of Public Health, Emory University, Atlanta, Georgia, USA
- Ariela Freedman, Department of Behavioral Sciences and Health Education, Rollins School of Public Health, Emory University, Atlanta, Georgia, USA
- Thomas R Ziegler, Division of Endocrinology, Metabolism and Lipids, Department of Medicine, Emory University School of Medicine, Atlanta, Georgia, USA
- Henry M Blumberg, Division of Infectious Diseases, Department of Medicine, Emory University School of Medicine, Atlanta, Georgia, USA; Departments of Epidemiology and Global Health, Rollins School of Public Health of Emory University, Atlanta, Georgia, USA
20
Corregano L, Bastert K, Correa da Rosa J, Kost RG. Accrual Index: A Real-Time Measure of the Timeliness of Clinical Study Enrollment. Clin Transl Sci 2015; 8:655-61. PMID: 26573223; DOI: 10.1111/cts.12352.
Abstract
BACKGROUND Achieving timely accrual into clinical research studies remains a challenge for clinical translational research. We developed an evaluation measure, the Accrual Index (AI), normalized for sample size and study duration, using data from the protocol and study management databases. We applied the AI retrospectively and prospectively to assess its utility. METHODS The Accrual Target, Projected Time to Accrual Completion (PTAC), Evaluable Subjects, and Dates of Recruitment Initiation, Analysis, and Completion were defined. The AI is (% Accrual Target accrued) / (% PTAC elapsed). Changes to recruitment practices were described, and data were extracted from study management databases. RESULTS The December 2014 (or final) AI was analyzed for 101 studies that initiated recruitment from 2007 to 2014. The median AI was ≥1 for protocols initiating recruitment in 2011, 2013, and 2014. The AI varied widely for studies pre-2013. Studies with AI > 4 utilized convenience samples for recruitment. Data-justified PTAC was refined in 2013-2014, after which the AI range narrowed. Protocol characteristics were not associated with study AI. CONCLUSION The protocol AI reflects the relative agreement between accrual feasibility assessment (PTAC) and accrual performance, and is affected by recruitment practices. The AI may be useful in managing accountability, modeling accrual, allocating recruitment resources, and testing innovations in recruitment practices.
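The AI definition in the abstract is a single ratio: the fraction of the accrual target enrolled divided by the fraction of the projected accrual window elapsed, so AI = 1 means a study is accruing exactly on pace. A minimal sketch of that arithmetic, with hypothetical dates and enrollment counts (not the authors' implementation):

```python
# Accrual Index: AI = (% Accrual Target accrued) / (% PTAC elapsed).
from datetime import date

def accrual_index(enrolled, target, start, today, ptac_end):
    """AI > 1: ahead of pace; AI < 1: behind pace; AI = 1: on pace."""
    pct_accrued = enrolled / target
    pct_elapsed = (today - start).days / (ptac_end - start).days
    return pct_accrued / pct_elapsed

# Hypothetical study: 60% of target enrolled about halfway through the
# projected accrual window, so it is running ahead of pace (AI > 1).
ai = accrual_index(
    enrolled=60, target=100,
    start=date(2014, 1, 1), today=date(2014, 7, 2), ptac_end=date(2015, 1, 1),
)
print(round(ai, 2))
```

Because both numerator and denominator are percentages, the index is comparable across studies with different sample sizes and durations, which is the normalization the abstract describes.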
Affiliation(s)
- Lauren Corregano, Clinical Research Support Office, The Rockefeller University Center for Clinical and Translational Science, New York, New York, USA
- Katelyn Bastert, Clinical Research Support Office, The Rockefeller University Center for Clinical and Translational Science, New York, New York, USA
- Joel Correa da Rosa, Department of Research Design and Biostatistics, The Rockefeller University Center for Clinical and Translational Science, New York, New York, USA
- Rhonda G Kost, Clinical Research Support Office, The Rockefeller University Center for Clinical and Translational Science, New York, New York, USA
21
Rubio DM, Blank AE, Dozier A, Hites L, Gilliam VA, Hunt J, Rainwater J, Trochim WM. Developing Common Metrics for the Clinical and Translational Science Awards (CTSAs): Lessons Learned. Clin Transl Sci 2015; 8:451-9. PMID: 26073891; PMCID: PMC4626292; DOI: 10.1111/cts.12296.
Abstract
The National Institutes of Health (NIH) Roadmap for Medical Research initiative, funded by the NIH Common Fund and offered through the Clinical and Translational Science Award (CTSA) program, developed more than 60 unique models for achieving the NIH goal of accelerating discoveries toward better public health. The variety of these models enabled participating academic centers to experiment with different approaches to fit their research environment. A central challenge related to the diversity of approaches is the ability to determine the success and contribution of each model. This paper describes the effort by the Evaluation Key Function Committee to develop and test a methodology for identifying a set of common metrics to assess the efficiency of clinical research processes and for pilot testing these processes for collecting and analyzing metrics. The project involved more than one-fourth of all CTSAs and resulted in useful information regarding the challenges in developing common metrics, the complexity and costs of acquiring data for the metrics, and limitations on the utility of the metrics in assessing clinical research performance. The results of this process led to the identification of lessons learned and recommendations for development and use of common metrics to evaluate the CTSA effort.
Affiliation(s)
- Doris M Rubio, Data Center, Center for Research on Health Care, Division of General Internal Medicine, Department of Medicine, School of Medicine, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
- Arthur E Blank, Department of Family and Social Medicine, Department of Epidemiology and Population Health, Albert Einstein College of Medicine, Bronx, New York, USA
- Ann Dozier, Department of Public Health Sciences, University of Rochester, Rochester, New York, USA
- Lisle Hites, Department of Health Care Organization and Policy, University of Alabama at Birmingham, Birmingham, Alabama, USA
- Victoria A Gilliam, Data Center, Center for Research on Health Care, Division of General Internal Medicine, Department of Medicine, School of Medicine, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
- Joe Hunt, Indiana Clinical and Translational Sciences Institute, Indiana University School of Medicine, Indianapolis, Indiana, USA
- Julie Rainwater, Clinical and Translational Science Center, University of California Davis, Sacramento, California, USA
- William M Trochim, Department of Policy Analysis and Management, Cornell University, Ithaca, New York, USA
22
Duda GN, Grainger DW, Frisk ML, Bruckner-Tuderman L, Carr A, Dirnagl U, Einhäupl KM, Gottschalk S, Gruskin E, Huber C, June CH, Mooney DJ, Rietschel ET, Schütte G, Seeger W, Stevens MM, Urban R, Veldman A, Wess G, Volk HD. Changing the mindset in life sciences toward translation: a consensus. Sci Transl Med 2015; 6:264cm12. PMID: 25429054; DOI: 10.1126/scitranslmed.aaa0599.
Abstract
Participants at the recent Translate! 2014 meeting in Berlin, Germany, reached a consensus on the rate-limiting factor for advancing translational medicine.
Affiliation(s)
- Georg N Duda, Berlin-Brandenburg Center for Regenerative Therapies, Charité-Universitätsmedizin Berlin, Germany; Julius Wolff Institut, Charité-Universitätsmedizin Berlin, Germany
- David W Grainger, Department of Pharmaceutics and Pharmaceutical Chemistry, Health Sciences, University of Utah, Salt Lake City, UT 84112, USA
- Megan L Frisk, Science Translational Medicine, American Association for the Advancement of Science (AAAS), Washington, DC 20005, USA
- Leena Bruckner-Tuderman, Department of Dermatology, Medical Center-University of Freiburg, 79104 Freiburg, Germany; German Research Foundation (DFG), Bonn, Germany
- Andrew Carr, NIHR Biomedical Research Unit, Botnar Research Centre, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, University of Oxford, UK
- Ulrich Dirnagl, Center for Stroke Research, Charité-Universitätsmedizin Berlin, Germany; Department of Experimental Neurology, Charité-Universitätsmedizin Berlin, Germany
- Karl Max Einhäupl, Chairman of the Executive Board, Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
- Stephen Gottschalk, Center for Cell and Gene Therapy, Texas Children's Hospital, Houston Methodist Hospital, Baylor College of Medicine, Houston, TX 77030, USA
- Elliott Gruskin, DePuy Synthes Biomaterials, 1302 Wrights Lane East, West Chester, PA 19380, USA
- Christoph Huber, Translational Oncology (TRON), Mainz University Medical Center, Germany
- Carl H June, Department of Pathology and Laboratory Medicine, Perelman School of Medicine, Philadelphia, PA 19104, USA
- David J Mooney, School of Engineering & Applied Sciences, Harvard University, Cambridge, MA, USA; Wyss Institute, Center for Life Science Boston Bldg., Boston, MA 02115, USA
- Georg Schütte, Federal Ministry of Education and Research, D-53175 Bonn, Germany
- Werner Seeger, Department of Pneumology, Universities of Giessen & Marburg Lung Center (UGMLC), 35392 Giessen, Germany; German Center for Lung Research (DZL), 35392 Giessen, Germany
- Molly M Stevens, Department of Materials, Imperial College London, London, SW7 2AZ, UK; Department of Bioengineering, Institute for Biomedical Engineering, Imperial College London, London, SW7 2AZ, UK
- Robert Urban, J&J Boston Innovation Center, One Cambridge Center, 7th Floor, Cambridge, MA 02142, USA
- Alex Veldman, Monash Newborn, Monash Medical Centre, Clayton, Melbourne, Victoria 3168, Australia; The Ritchie Centre, Monash Institute of Medical Research Prince Henry's Institute (MIMR-PHI), Clayton, Melbourne, Victoria 3168, Australia; Department of Paediatrics, Monash University, Melbourne, Australia
- Günther Wess, Helmholtz Zentrum München-German Research Center for Environmental Health, Germany
- Hans-Dieter Volk, Berlin-Brandenburg Center for Regenerative Therapies, Charité-Universitätsmedizin Berlin, Germany; Institute for Medical Immunology, Charité-Universitätsmedizin Berlin, Germany
23
Kane C, Alexander A, Hogle JA, Parsons HM, Phelps L. Heterogeneity at work: implications of the 2012 Clinical Translational Science Award evaluators survey. Eval Health Prof 2014; 36:447-63. PMID: 24214662; DOI: 10.1177/0163278713510378.
Abstract
The Clinical and Translational Science Award (CTSA) program is an ambitious multibillion dollar initiative sponsored by the National Institutes of Health (NIH) organized around the mission of facilitating the improved quality, efficiency, and effectiveness of translational health sciences research across the country. Although the NIH explicitly requires internal evaluation, funded CTSA institutions are given wide latitude to choose the structure and methods for evaluating their local CTSA program. The National Evaluators Survey was developed by a peer-led group of local CTSA evaluators as a voluntary effort to understand emerging differences and commonalities in evaluation teams and techniques across the 61 CTSA institutions funded nationwide. This article presents the results of the 2012 National Evaluators Survey, finding significant heterogeneity in evaluation staffing, organization, and methods across the 58 CTSA institutions responding. The variety reflected in these findings represents both a liability and a strength. A lack of standardization may impair the ability to make use of common metrics, but variation is also a successful evolutionary response to complexity. Additionally, the peer-led approach and simple design demonstrated by the questionnaire itself has value as an example of an evaluation technique with potential for replication in other areas across the CTSA institutions, or in any large-scale investment where multiple related teams across a wide geographic area are given the latitude to develop specialized approaches to fulfilling a common mission.
Affiliation(s)
- Cathleen Kane, Cornell University and the Weill Cornell Clinical Translational Science Center, New York, NY, USA