1. Nicholson J, Plovnick C, van der Vleuten C, de Bruin ABH, Kalet A. Librarian-Led Assessment of Medical Students' Evidence-Based Medicine Competency: Facilitators and Barriers. Perspectives on Medical Education 2024;13:160-168. [PMID: 38464960] [PMCID: PMC10921970] [DOI: 10.5334/pme.1145] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Introduction We must ensure, through rigorous assessment, that physicians have the evidence-based medicine (EBM) skills to identify and apply the best available information to their clinical work. However, there is limited guidance on how to assess EBM competency. With a better understanding of their current role in EBM education, health sciences librarians (HSLs), as experts, should be able to contribute to the assessment of medical student EBM competence. The purpose of this study is to explore HSLs' perspectives on EBM assessment practices, both the current state and potential future activities. Methods We conducted focus groups with librarians from across the United States to explore their perceptions of assessing EBM competence in medical students. Participants had been trained as raters of EBM competence as part of a novel Objective Structured Clinical Examination (OSCE). This OSCE was only the starting point; the discussion covered current EBM assessment and possibilities for expanded responsibilities at the participants' own institutions. We used a reflexive thematic analysis approach to construct themes from our conversations. Results We constructed eight themes in four broad categories that influence librarians' ability to engage in effective assessment of EBM: administrative, curricular, medical student, and librarian. Conclusion Our results inform medical school leadership by pointing out the modifiable factors that enable librarians to be more engaged in conducting effective assessment. They highlight the need for novel tools, like EBM OSCEs, that can address multiple barriers and create opportunities for deeper integration of librarians into assessment processes.
Affiliation(s)
- Joey Nicholson: NYU Health Sciences Library, NYU Grossman School of Medicine, NYU Langone Health, US
- Caitlin Plovnick: NYU Health Sciences Library, NYU Grossman School of Medicine, NYU Langone Health, New York, US
- Cees van der Vleuten: Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, NL
- Anique B. H. de Bruin: School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, NL
- Adina Kalet: Robert D. and Patricia E. Kern Institute for the Transformation of Medical Education, Medical College of Wisconsin, Wauwatosa, Wisconsin, US
2. Kumaravel B, Stewart C, Ilic D. Development and evaluation of a spiral model of assessing EBM competency using OSCEs in undergraduate medical education. BMC Medical Education 2021;21:204. [PMID: 33838686] [PMCID: PMC8035769] [DOI: 10.1186/s12909-021-02650-7] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7]
Abstract
BACKGROUND Medical students often struggle to understand the relevance of evidence-based medicine (EBM) to their clinical practice, yet it is a competence that all students must develop prior to graduation. Objective structured clinical examinations (OSCEs) are a valued tool for assessing critical components of EBM competency, particularly different levels of mastery as students progress through the course. This study developed and evaluated EBM-based OSCE stations with the aim of establishing a spiral approach to EBM OSCEs for undergraduate medical students. METHODS OSCE stations were developed with increasingly complex EBM tasks. The stations were classified according to the classification rubric for EBP assessment tools (CREATE) framework and mapped against the recently published core competencies for evidence-based practice (EBP). Performance data were evaluated using classical test theory, analysing mean scores, pass rates, and station item-total correlation (ITC) in SPSS. RESULTS Six EBM-based OSCE stations assessing various stages of EBM were created for use in high-stakes summative OSCEs for different year groups across the undergraduate medical degree. All OSCE stations except one had excellent correlation coefficients, and hence high reliability, ranging from 0.21 to 0.49. The domain mean score ranged from 13.33 to 16.83 out of 20. High reliability was demonstrated for each of the summative OSCE circuits (Cronbach's alpha = 0.67-0.85). In the CREATE framework these stations assessed knowledge, skills, and behaviour of medical students in asking, searching, appraising, and integrating evidence in practice. The OSCE stations were useful in assessing six core evidence-based practice competencies that are meant to be practised with exercises. A spiral model of OSCEs of increasing complexity was proposed to assess EBM competency as students progressed through the MBChB course. CONCLUSIONS The use of OSCEs is a feasible method of authentically assessing learner EBM performance and behaviour in a high-stakes assessment setting. Use of valid and reliable EBM-based OSCE stations provides evidence for the continued development of a hierarchy for assessing scaffolded learning and mastery of EBM competency. Further work is needed to assess their predictive validity.
Affiliation(s)
- B Kumaravel: The University of Buckingham Medical School, Hunter Street, Buckingham, MK18 1EG, UK
- C Stewart: University of Nottingham, Nottingham, UK
- D Ilic: School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia
3. Kumaravel B, Hearn JH, Jahangiri L, Pollard R, Stocker CJ, Nunan D. A systematic review and taxonomy of tools for evaluating evidence-based medicine teaching in medical education. Syst Rev 2020;9:91. [PMID: 32331530] [PMCID: PMC7183115] [DOI: 10.1186/s13643-020-01311-y] [Citation(s) in RCA: 23] [Impact Index Per Article: 5.8]
Abstract
BACKGROUND The importance of teaching the skills and practice of evidence-based medicine (EBM) to medical professionals has steadily grown in recent years. Alongside this growth is a need to evaluate the effectiveness of EBM curricula as assessed by competency in the five 'A's': asking, acquiring, appraising, applying and assessing (impact and performance). EBM educators in medical education will benefit from a compendium of existing assessment tools for assessing EBM competencies in their settings. The purpose of this review is to provide a systematic review and taxonomy of validated tools that evaluate EBM teaching in medical education. METHODS We searched the MEDLINE, EMBASE, Cochrane Library, Educational Resources Information Centre (ERIC) and Best Evidence Medical Education (BEME) databases, and the references of retrieved articles, published between January 2005 and March 2019. We present the identified tools along with their psychometric properties, including validity, reliability and relevance to the five domains of EBM practice and the dimensions of EBM learning. We also assessed tool quality, defining high-quality tools as those supported by established interrater reliability (where applicable) and objective (non-self-reported) outcome measures, and achieving at least three types of established validity evidence. We report our study in accordance with the PRISMA guidelines. RESULTS We identified 1719 potentially relevant articles, of which 63 full-text articles were assessed for eligibility against inclusion and exclusion criteria. Twelve articles, each with a unique, newly identified tool, were included in the final analysis. All twelve tools assessed the third step of EBM practice (appraise), and four assessed only that step. None of the twelve tools assessed the last step of EBM practice (assess). Of the seven domains of EBM learning, ten tools assessed knowledge gain, nine assessed skills, and one assessed attitude. None addressed reaction to EBM teaching, self-efficacy, behaviours or patient benefit. Of the twelve tools identified, six were high quality. We also provide a taxonomy of tools, using the CREATE framework, for EBM teachers in medical education. CONCLUSIONS Six tools of reasonable validity are available for evaluating most steps of EBM and some domains of EBM learning. Further development and validation of tools that evaluate all the steps in EBM and all educational outcome domains are needed. SYSTEMATIC REVIEW REGISTRATION PROSPERO CRD42018116203.
Affiliation(s)
- Bharathy Kumaravel: University of Buckingham Medical School, Hunter Street, Buckingham, MK18 1EG, UK
- Jasmine Heath Hearn: Department of Psychology, Manchester Metropolitan University, Brooks Building, 53 Bonsall Street, Manchester, M15 6GX, UK
- Leila Jahangiri: Department of Life Sciences, Birmingham City University, Birmingham, B15 3TN, UK
- Rachel Pollard: Franciscan Library, University of Buckingham, Buckingham, MK18 1EG, UK
- David Nunan: Centre for Evidence Based Medicine, Nuffield Department of Primary Care Health Sciences, Oxford, OX2 6GG, UK
4. Heidemann LA, Keilin CA, Santen SA, Fitzgerald JT, Zaidi NL, Whitman L, Jones EK, Lypson ML, Morgan HK. Does Performance on Evidence-Based Medicine and Urgent Clinical Scenarios Assessments Deteriorate During the Fourth Year of Medical School? Findings From One Institution. Academic Medicine: Journal of the Association of American Medical Colleges 2019;94:731-737. [PMID: 30640259] [DOI: 10.1097/acm.0000000000002583] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8]
Abstract
PURPOSE The fourth year of medical school (M4) should prepare students for residency yet remains generally unstructured, with ill-defined goals. The primary aim of this study was to determine whether there were performance changes on evidence-based medicine (EBM) and urgent clinical scenarios (UCS) assessments before and after the M4 year. METHOD University of Michigan Medical School graduates who matched into internship at Michigan Medicine completed identical assessments on EBM and UCS at the beginning of the M4 year and 13 months later during postgraduate year 1 (PGY1) orientation. Individual scores on these assessments were compared using paired t test analysis. The associations of academic performance, residency specialty classification, and initial performance with knowledge changes were analyzed. RESULTS During academic years 2014 and 2015, 76 students matched into a Michigan Medicine internship; 52 completed identical EBM stations and 53 completed UCS stations. Learners' performance on the EBM assessment decreased from M4 to PGY1 (mean 93% [SD = 7%] vs. mean 80% [SD = 13%], P < .01), while performance on UCS remained stable (mean 80% [SD = 9%] vs. mean 82% [SD = 8%], P = .22). High M4 performers experienced a greater decline in EBM knowledge than low M4 performers (-20% vs. -4%, P = .01). Residency specialty and academic performance were not associated with performance changes. CONCLUSIONS This study demonstrated degradation of EBM performance during the fourth year and adds to the growing literature highlighting the need for curricular reform during this year.
Affiliation(s)
- Lauren A Heidemann
- L.A. Heidemann is clinical assistant professor of internal medicine, University of Michigan Medical School, Ann Arbor, Michigan. C.A. Keilin is a medical student, University of Michigan Medical School, Ann Arbor, Michigan. S.A. Santen was assistant dean of evaluation and assessment and professor of emergency medicine and learning health sciences, University of Michigan Medical School, Ann Arbor, Michigan, at the time the study was conducted. She is currently senior associate dean, Evaluation, Assessment, and Scholarship of Learning, Virginia Commonwealth University School of Medicine, Richmond, Virginia. J.T. Fitzgerald is professor, Department of Learning Health Sciences, University of Michigan, and Geriatric Research Education and Clinical Center, Arbor VA Medical Center, Ann Arbor, Michigan. N.L. Zaidi is associate director of advancing scholarship, Office of Medical Student Education, University of Michigan Medical School, Ann Arbor, Michigan. L. Whitman is standardized patient program manager and educator, University of Michigan Medical School, Ann Arbor, Michigan. E.K. Jones is clinical assistant professor of family medicine, University of Michigan Medical School, Ann Arbor, Michigan. M.L. Lypson is director of medical and dental education, Office of Academic Affiliations, Department of Veterans Affairs, Washington, DC, and adjunct professor of internal medicine and learning health sciences, University of Michigan Medical School, Ann Arbor, Michigan. H.K. Morgan is director, Comprehensive Clinical Assessment, and clinical associate professor of obstetrics and gynecology and learning health sciences, University of Michigan Medical School, Ann Arbor, Michigan
5. Aranda JP, Davies ML, Jackevicius CA. Student pharmacists' performance and perceptions on an evidence-based medicine objective structured clinical examination. Currents in Pharmacy Teaching & Learning 2019;11:302-308. [PMID: 30904154] [DOI: 10.1016/j.cptl.2018.12.012] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6]
Abstract
BACKGROUND AND PURPOSE Studies have examined evidence-based medicine (EBM)-focused objective structured clinical examinations (OSCEs) in medical training, but data are lacking in pharmacy trainees. This study sought to assess student pharmacists' performance on, and perceptions of, a novel EBM OSCE. EDUCATIONAL ACTIVITY AND SETTING This EBM OSCE included answering a drug-information inquiry, researching background questions, calling a simulated provider to acquire specific patient information, developing a foreground clinical question, reviewing pre-appraised trial synopses, and applying evidence to write a recommendation. Pharmacy faculty served as simulated providers and assessed students on knowledge/analytical (AC) and global communication (GC) skills. Students completed a worksheet (WS) that included developing a patient, intervention, comparison, outcome (PICO) statement, trial selection, and a clinical recommendation. After OSCE completion, students were surveyed regarding perceptions of their performance and the OSCE's applicability. Outcomes assessed were performance scores (AC, GC, WS) and student perceptions. FINDINGS One hundred twenty-nine students completed the survey and were included in the analysis. Median (IQR) AC, WS, and GC performance was 75.0 (37.8), 86.4 (36.9), and 88.9 (22.2), respectively, on a 100-point scale. On the WS, 89% of students developed a suitable searchable clinical question and 61% selected the correct trial synopsis to apply to the case. Students felt literature application and WS development were most challenging. A majority of students felt this OSCE increased their comfort in engaging with providers (74%) and that these skills correlate with real clinical scenarios (77%). SUMMARY OSCEs can be a valuable tool for simulating clinical scenarios and assessing student pharmacists' EBM skills.
Affiliation(s)
- Josephine P Aranda: Western University of Health Sciences, College of Pharmacy, 309 E. Second St., Pomona, CA, United States
- Marie L Davies: Western University of Health Sciences, College of Pharmacy, 309 E. Second St., Pomona, CA, United States
- Cynthia A Jackevicius: Western University of Health Sciences, College of Pharmacy, 309 E. Second St., Pomona, CA, United States; VA Greater Los Angeles Healthcare System, Los Angeles, CA, United States; Institute for Clinical Evaluative Sciences, Toronto, Ontario, Canada; University Health Network, Toronto, Ontario, Canada; University of Toronto, Toronto, Ontario, Canada
6. Saunders H, Vehviläinen-Julkunen K. Key considerations for selecting instruments when evaluating healthcare professionals' evidence-based practice competencies: A discussion paper. J Adv Nurs 2018;74:2301-2311. [DOI: 10.1111/jan.13802] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7]
Affiliation(s)
- Hannele Saunders: Department of Nursing Science, Faculty of Health Sciences, University of Eastern Finland, Kuopio, Finland; South-Eastern Finland University of Applied Sciences (Xamk), Kuopio, Finland
- Katri Vehviläinen-Julkunen: Department of Nursing Science, Faculty of Health Sciences, University of Eastern Finland and Kuopio University Hospital, Kuopio, Finland
7. Elnicki DM, Aiyer MK, Cannarozzi ML, Carbo A, Chelminski PR, Chheda SG, Chudgar SM, Harrell HE, Hood LC, Horn M, Johl K, Kane GC, McNeill DB, Muntz MD, Pereira AG, Stewart E, Tarantino H, Vu TR. An Entrustable Professional Activity (EPA)-Based Framework to Prepare Fourth-Year Medical Students for Internal Medicine Careers. J Gen Intern Med 2017. [PMID: 28634908] [PMCID: PMC5653547] [DOI: 10.1007/s11606-017-4089-8] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4]
Abstract
The purpose of the fourth year of medical school remains controversial. Competing demands during this transitional phase cause confusion for students and educators. In 2014, the Association of American Medical Colleges (AAMC) released 13 Core Entrustable Professional Activities for Entering Residency (CEPAERs). A committee comprising members of the Clerkship Directors in Internal Medicine and the Association of Program Directors in Internal Medicine applied these principles to preparing students for internal medicine residencies. The authors propose a curricular framework based on five CEPAERs that were felt to be most relevant to residency preparation, informed by prior stakeholder surveys. The critical areas outlined include entering orders, forming and answering clinical questions, conducting patient care handovers, collaborating interprofessionally, and recognizing patients requiring urgent care and initiating that care. For each CEPAER, the authors offer suggestions about instruction and assessment of competency. The fourth year of medical school can be rewarding for students, while adequately preparing them to begin residency, by addressing important elements defined in the core entrustable activities. Thus prepared, new residents can function safely and competently in supervised postgraduate settings.
Affiliation(s)
- D Michael Elnicki: University of Pittsburgh School of Medicine, Pittsburgh, PA, USA; University of Pittsburgh, Pittsburgh, PA, USA
- Meenakshy K Aiyer: University of Illinois College of Medicine at Peoria, Peoria, IL, USA
- Alexander Carbo: Harvard Medical School, Beth Israel Deaconess Medical Center, Boston, MA, USA
- Paul R Chelminski: University of North Carolina School of Medicine, Chapel Hill, NC, USA
- Shobhina G Chheda: University of Wisconsin School of Medicine and Public Health, Madison, WI, USA
- L Chad Hood: University of Central Florida College of Medicine, Orlando, FL, USA
- Michelle Horn: University of Mississippi School of Medicine, Jackson, MS, USA
- Karnjit Johl: University of California-Davis School of Medicine, Sacramento, CA, USA
- Gregory C Kane: Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, PA, USA
- Anne G Pereira: University of Minnesota Medical School, Minneapolis, MN, USA
- Emily Stewart: Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, PA, USA
- T Robert Vu: Indiana University School of Medicine, Indianapolis, IN, USA
8. Maggio LA, Durieux N, Tannery NH. Librarians in Evidence-Based Medicine Curricula: A Qualitative Study of Librarian Roles, Training, and Desires for Future Development. Med Ref Serv Q 2017;34:428-440. [PMID: 26496397] [DOI: 10.1080/02763869.2015.1082375] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3]
Abstract
This study aims to describe librarians' roles in evidence-based medicine (EBM) from the librarian perspective, identify how librarians are trained to teach, and highlight preferences for professional development. A multi-institution qualitative study was conducted. Nine medical librarians identified by their faculty as integrated into EBM training were interviewed. Participants' descriptions indicated that they were active in curriculum development, deployment (including teaching activities), and assessment to support EBM. Participants identified direct experience and workshop participation as their primary methods of learning to teach. Participants desired continuing development as teachers and requested opportunities for in-person workshops, shadowing physicians, and online training.
Affiliation(s)
- Lauren A Maggio: Stanford University School of Medicine, Stanford, California, USA
- Nancy Durieux: Life Sciences Library, University of Liege, Liege, Belgium
- Nancy H Tannery: Health Sciences Library System, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
9. Umscheid CA, Maenner MJ, Mull N, Veesenmeyer AF, Farrar JT, Goldfarb S, Morrison G, Albanese MA, Frohna JG, Feldstein DA. Using educational prescriptions to teach medical students evidence-based medicine. Medical Teacher 2016;38:1112-1117. [PMID: 27075864] [PMCID: PMC5866052] [DOI: 10.3109/0142159x.2016.1170775] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4]
Abstract
PURPOSE To evaluate the feasibility and impact of evidence-based medicine (EBM) educational prescriptions (EPs) in medical student clerkships. METHODS Students answered clinical questions during clerkships using EPs, which guide learners through the "four As" of EBM. Epidemiology fellows graded EPs using a rubric. Feasibility was assessed using descriptive statistics and student and fellow end-of-study questionnaires, which also measured impact. In addition, for each EP, students reported patient impact. Impact on EBM skills was assessed by the change in EP scores over time and by scores on an EBM objective structured clinical exam (OSCE), which were compared with controls from the prior year. RESULTS 117 students completed 402 EPs evaluated by 24 fellows. The average score was 7.34/9.00 (SD 1.58). 69 students (59%) and 21 fellows (88%) completed questionnaires. Most students thought EPs improved "Acquiring" and "Appraising". Almost half thought EPs improved "Asking" and "Applying". Fellows did not value grading EPs. For 18% of EPs, students reported a "change" or "potential change" in treatment; 56% "confirmed" treatment. EP scores increased by 1.27 (95% CI: 0.81-1.72). There were no differences in OSCE scores between cohorts. CONCLUSIONS Integrating EPs into clerkships is feasible and has impact, yet OSCE scores were unchanged, and research fellows had limitations as evaluators.
Affiliation(s)
- Craig A Umscheid: Center for Evidence-based Practice; Department of Medicine; Center for Clinical Epidemiology and Biostatistics; Department of Biostatistics and Epidemiology; Leonard Davis Institute of Health Economics; Institute for Translational Medicine and Therapeutics; Institute of Biomedical Informatics, University of Pennsylvania, Philadelphia, PA, USA
- Matthew J Maenner: School of Medicine and Public Health, University of Wisconsin, Madison, WI, USA
- Nikhil Mull: Center for Evidence-based Practice; Department of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- John T Farrar: Center for Clinical Epidemiology and Biostatistics; Department of Biostatistics and Epidemiology, University of Pennsylvania, Philadelphia, PA, USA
- Stanley Goldfarb: Department of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Gail Morrison: Department of Medicine, University of Pennsylvania, Philadelphia, PA, USA
- Mark A Albanese: School of Medicine and Public Health, University of Wisconsin, Madison, WI, USA
- John G Frohna: School of Medicine and Public Health, University of Wisconsin, Madison, WI, USA
- David A Feldstein: School of Medicine and Public Health, University of Wisconsin, Madison, WI, USA
10. Kersten HB, Frohna JG, Giudice EL. Validation of an Evidence-Based Medicine Critically Appraised Topic Presentation Evaluation Tool (EBM C-PET). J Grad Med Educ 2013;5:252-256. [PMID: 24404268] [PMCID: PMC3693689] [DOI: 10.4300/jgme-d-12-00049.1] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5]
Abstract
BACKGROUND Competence in evidence-based medicine (EBM) is an important clinical skill. Pediatrics residents are expected to acquire competence in EBM during their education, yet few validated tools exist to assess residents' EBM skills. OBJECTIVE We sought to develop a reliable tool to evaluate residents' EBM skills in the critical appraisal of a research article, the development of a written EBM critically appraised topic (CAT) synopsis, and the presentation of the findings to colleagues. METHODS Instrument development used a modified Delphi technique. We defined the skills to be assessed when reviewing (1) a written CAT synopsis and (2) a resident's EBM presentation. We defined skill levels for each item using the Dreyfus and Dreyfus model of skill development and created behavioral anchors using a frame-of-reference training technique to describe performance at each skill level. We evaluated the assessment instrument's psychometric properties, including internal consistency and interrater reliability. RESULTS The EBM Critically Appraised Topic Presentation Evaluation Tool (EBM C-PET) is composed of 14 items that assess residents' EBM and global presentation skills. Resident presentations (N = 27) and the corresponding written CAT synopses were evaluated using the EBM C-PET. The EBM C-PET had excellent internal consistency (Cronbach α = 0.94). Intraclass correlation coefficients were used to assess interrater reliability; coefficients for individual items ranged from 0.31 to 0.74, and the average across the 14 items was 0.67. CONCLUSIONS We identified the essential components of an assessment tool for an EBM CAT synopsis and presentation, with excellent internal consistency and a good level of interrater reliability across 3 different institutions. The EBM C-PET is a reliable tool to document resident competence in higher-level EBM skills.
11. Wyer PC, Naqvi Z, Dayan PS, Celentano JJ, Eskin B, Graham MJ. Do workshops in evidence-based practice equip participants to identify and answer questions requiring consideration of clinical research? A diagnostic skill assessment. Advances in Health Sciences Education: Theory and Practice 2009;14:515-533. [PMID: 18766450] [DOI: 10.1007/s10459-008-9135-1] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.4]
Abstract
Evidence-based practice (EBP) requires practitioners to identify and formulate questions in response to patient encounters, and to seek, select, and appraise applicable clinical research. A standardized workshop format serves as the model for training of medical educators in these skills. We developed an evaluation exercise to assess the ability to identify and solve a problem requiring the use of targeted skills and administered it to 47 North American junior faculty and residents in various specialties at the close of two short workshops in EBP. Prior to the workshop, subjects reported prior training in EBP and completed a previously validated knowledge test. Our post-workshop exercise differed from the baseline measures and required participants to spontaneously identify a suitable question in response to a simulated clinical encounter, followed by a description of a stepwise approach to answering it. They then responded to successively more explicitly prompted queries relevant to their question. We analyzed responses to identify areas of skill deficiency and potential reasons for these deficiencies. Twelve respondents (26%) initially failed to identify a suitable question in response to the clinical scenario. Ability to choose a suitable question correlated with the ability to connect an original question to an appropriate study design. Prior EBP training correlated with the pretest score but not with performance on our exercise. Overall performance correlated with ability to correctly classify their questions as pertaining to therapy, diagnosis, prognosis, or harm. We conclude that faculty and residents completing standard workshops in EBP may still lack the ability to initiate and investigate original clinical inquiries using EBP skills.
Affiliation(s)
- Peter C Wyer: Department of Medicine, College of Physicians & Surgeons, Columbia University, New York, NY, USA
12. Chen HC, Tan JPG, O'Sullivan P, Boscardin C, Li A, Muller J. Impact of an information retrieval and management curriculum on medical student citations. Academic Medicine: Journal of the Association of American Medical Colleges 2009;84:S38-S41. [PMID: 19907382] [DOI: 10.1097/acm.0b013e3181b36fba] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2]
Abstract
BACKGROUND Increasingly, schools have integrated evidence-based medicine into their curricula. Most efforts focus on critical appraisal of the literature rather than on information retrieval and management (IRAM) skills. We implemented two versions of an IRAM curriculum (workshop alone and workshop with librarian visit) and evaluated their effectiveness. METHOD First-year medical students in a problem-based learning course researched six learning issues (LIs). We compared the number and completeness of LI citations of students receiving a Workshop/Librarian or a Workshop intervention with those of a control group. RESULTS A total of 2,415 LIs containing 6,717 citations from 429 students were scored. Among the Workshop/Librarian, Workshop, and control groups, respectively, the percentage of LIs without citations was 9.3%, 11.0%, and 14.0%; the percentage of citations with complete documentation was 64.9%, 61.0%, and 29.4%; and the frequency of citing primary articles was 24.7%, 13.2%, and 18.8% (P < .05). CONCLUSIONS An IRAM curriculum that includes a workshop plus librarian participation produced the best student citation habits.
Affiliation(s)
- Huiju C Chen: UCSF Department of Pediatrics, 500 Parnassus Avenue, MUE 406, Box 0136, San Francisco, CA 94143-0136, USA
13. Ilic D. Assessing competency in Evidence Based Practice: strengths and limitations of current tools in practice. BMC Medical Education 2009;9:53. [PMID: 19656412] [PMCID: PMC2728711] [DOI: 10.1186/1472-6920-9-53] [Citation(s) in RCA: 43] [Impact Index Per Article: 2.9]
Abstract
BACKGROUND Evidence Based Practice (EBP) involves making clinical decisions informed by the most relevant and valid evidence available. Competence can broadly be defined as a concept that incorporates a variety of domains, including knowledge, skills and attitudes. Adopting an evidence-based approach to practice requires differing competencies across various domains, including literature searching, critical appraisal and communication. This paper examines the tools currently available to assess EBP competence and compares their applicability to existing assessment techniques used in medicine, nursing and the health sciences. DISCUSSION Only two validated assessment tools have been developed specifically to assess all aspects of EBP competence. Of the two (the Berlin and Fresno tools), only the Fresno tool comprehensively assesses EBP competency across all relevant domains. However, both tools focus on assessing EBP competency in medical students; therefore, neither can be used to assess EBP competency across different health disciplines. The Objective Structured Clinical Exam (OSCE) has been demonstrated to be a reliable and versatile tool for assessing clinical competencies and practical and communication skills. The OSCE has scope as an alternative method for assessing EBP competency, since it combines assessment of cognitive skills including knowledge, reasoning and communication. However, further research is needed to develop the OSCE as a viable method for assessing EBP competency. SUMMARY Demonstrating EBP competence is a complex task; therefore, no single assessment method can adequately provide all of the necessary data to assess complete EBP competence. There is a need for further research to explore how EBP competence is best assessed, be it in written formats, such as the Fresno tool, or in another format, such as the OSCE. Future tools must also incorporate measures of how EBP competence affects clinician behaviour and attitudes, as well as clinical outcomes, in real-time situations. This research should also be conducted across a variety of health disciplines to best inform practice.
Affiliation(s)
- Dragan Ilic: Monash Institute of Health Services Research, Monash University, Clayton, VIC 3168, Australia
14. Casey PM, Goepfert AR, Espey EL, Hammoud MM, Kaczmarczyk JM, Katz NT, Neutens JJ, Nuthalapaty FS, Peskin E. To the point: reviews in medical education--the Objective Structured Clinical Examination. Am J Obstet Gynecol 2009;200:25-34. [PMID: 19121656] [DOI: 10.1016/j.ajog.2008.09.878] [Citation(s) in RCA: 46] [Impact Index Per Article: 3.1]
Abstract
This article, the eighth in the To the Point Series prepared by the Association of Professors of Gynecology and Obstetrics Undergraduate Medical Education Committee, discusses the effectiveness of the Objective Structured Clinical Examination (OSCE) for assessment of learners' knowledge, skills, and behaviors. The OSCE has also been used for the appraisal of residents and physicians undergoing licensure examinations; herein we focus on its application to undergraduate medical education. We review evidence for best practices and recommendations on effective use of the OSCE and requirements for and challenges to its implementation, including creative ways to design an OSCE program with a limited budget. We discuss its role in providing formative and summative feedback and describe learner performance on the OSCE as the OSCE relates to subsequent testing, including US Medical Licensing Examination step 1. A representative case with assessment used at the authors' medical schools is included.
15. Levine AE, Bebermeyer RD, Chen JW, Davis D, Harty C. Development of an Interdisciplinary Course in Information Resources and Evidence-Based Dentistry. J Dent Educ 2008. [DOI: 10.1002/j.0022-0337.2008.72.9.tb04581.x] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.0]
Affiliation(s)
- Alan E. Levine: Department of Biochemistry and Molecular Biology, University of Texas Health Science Center at Houston Medical School
- Richard D. Bebermeyer: Department of Restorative Dentistry and Biomaterials, University of Texas Health Science Center at Houston Dental Branch
- Jung-Wei Chen: Department of Pediatric Dentistry, University of Texas Health Science Center at Houston Dental Branch
- Dell Davis: Houston Academy of Medicine-Texas Medical Center Library, University of Texas Health Science Center at Houston Dental Branch
- Carolyn Harty: Houston Academy of Medicine-Texas Medical Center Library, University of Texas Health Science Center at Houston Dental Branch
16. Kronfly Rubiano E, Ricarte Díez JI, Juncosa Font S, Martínez Carretero JM. [Evaluation of the clinical competence of Catalonian medicine schools 1994-2006. Evolution of examination formats until the objective and structured clinical evaluation (ECOE)]. Med Clin (Barc) 2008;129:777-784. [PMID: 18093480] [DOI: 10.1157/13113768] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.9]
17. Stark R, Helenius IM, Schimming LM, Takahara N, Kronish I, Korenstein D. Real-time EBM: from bed board to keyboard and back. J Gen Intern Med 2007;22:1656-1660. [PMID: 17922170] [PMCID: PMC2219829] [DOI: 10.1007/s11606-007-0387-x] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.6]
Abstract
BACKGROUND To practice evidence-based medicine (EBM), physicians must quickly retrieve evidence to inform medical decisions. Internal medicine (IM) residents receive little formal education in electronic database searching and have identified poor searching skills as a barrier to practicing EBM. OBJECTIVE To design and implement a database-searching tutorial for IM residents on inpatient rotations and to evaluate its impact on residents' skill and comfort searching MEDLINE and filtered EBM resources. DESIGN Randomized controlled trial. Residents randomized to the searching tutorial met for up to six 1-hour small-group sessions to search for answers to questions about current hospitalized patients. PARTICIPANTS Second- and third-year IM residents. MEASUREMENTS Residents in both groups completed an Objective Structured Searching Evaluation (OSSE), searching for primary evidence to answer 5 clinical questions. OSSE outcomes were the number of successful searches, search times, and techniques utilized. Participants also completed self-assessment surveys measuring frequency and comfort using EBM databases. RESULTS During the OSSE, residents who participated in the intervention utilized more searching techniques overall (p < .01) and used PubMed's Clinical Queries more often (p < .001) than control residents. Searching "success" and time per completed search did not differ between groups. Compared with controls, intervention residents reported greater comfort using MEDLINE (p < .05) and the Cochrane Library (p < .05) on post-intervention surveys. The groups did not differ in comfort using ACP Journal Club or in self-reported frequency of use of any database. CONCLUSIONS An inpatient EBM searching tutorial improved the searching techniques of IM residents and increased comfort with MEDLINE and the Cochrane Library, but did not impact overall searching success.
Affiliation(s)
- Rachel Stark: Department of Medicine, Montefiore Medical Center/Albert Einstein College of Medicine, Bronx, NY, USA