1. Sidhu NS, Fleming S. Re-examining single-moment-in-time high-stakes examinations in specialist training: A critical narrative review. Med Teach 2024;46:528-536. PMID: 37740944. DOI: 10.1080/0142159X.2023.2260081.
Abstract
In this critical narrative review, we challenge the belief that single-moment-in-time high-stakes examinations (SMITHSEx) are an essential component of contemporary specialist training. We explore the arguments both for and against SMITHSEx, examine potential alternatives, and discuss the barriers to change. SMITHSEx are viewed as the "gold standard" assessment of competence but focus excessively on knowledge assessment rather than capturing the essential competencies required for safe and competent workplace performance. Contrary to popular belief, regulatory bodies do not mandate SMITHSEx in specialist training. Although SMITHSEx act as significant drivers of learning and professional identity formation, these attributes are not exclusive to them. Skills such as crisis management, procedural skills, professionalism, communication, collaboration, lifelong learning, reflection on practice, and judgement are often overlooked by SMITHSEx. Their inherent design raises questions about the validity and objectivity of SMITHSEx as a measure of workplace competence. They also have a detrimental impact on trainee well-being, contributing to burnout and differential attainment. Alternatives to SMITHSEx include continuous low-stakes assessments throughout training, ongoing evaluation of competence in the workplace, and competency-based medical education (CBME) concepts. These aim to provide a more comprehensive and context-specific assessment of trainees' competence while also improving trainee welfare. Specialist training colleges should evolve from exam providers into sources of holistic education. Assessments should emphasise essential practical knowledge over trivia, align with clinical practice, aid learning, and form part of a diverse toolkit. Eliminating SMITHSEx from specialist training will foster a competency-based approach, benefiting future medical professionals' well-being and success.
Affiliation(s)
- Navdeep S Sidhu: Department of Anaesthesiology, School of Medicine, University of Auckland, Auckland, New Zealand; Department of Anaesthesia and Perioperative Medicine, North Shore Hospital, Auckland, New Zealand
- Simon Fleming: Department of Hand Surgery, Royal North Shore Hospital, Sydney, New South Wales, Australia
2. Progress testing: An educational perspective exploring the rationale for progress testing and its introduction into a Diagnostic Radiography curriculum. J Med Imaging Radiat Sci 2023;54:35-42. PMID: 36681618. DOI: 10.1016/j.jmir.2022.12.009.
Abstract
INTRODUCTION In March 2020, the first diagnostic radiography degree apprenticeship programme in England was launched at the authors' institution. As part of the programme development and design, the programme development team explored and then implemented progress testing into a strand of the programme. The objective of this educational perspective is to scrutinise the literature around the use of progress testing in higher education programmes, namely medicine, to explain how and why this decision was reached. METHODS The initial search strategy was developed using the electronic databases CINAHL Complete and SCOPUS. Key words included 'progress test' and 'medicine' or 'health' or 'education' or 'higher education'. After eliminating articles that were not relevant, and identifying and adding further articles by key authors and experts, thirty-three key articles were considered for review. RESULTS The thirty-three articles were a mixture of review articles, empirical research, case studies and conference presentations. Five key themes were identified and are discussed in this article: the evolution of progress testing; advantages of progress testing; disadvantages of progress testing; developing a test framework; and academic progression and student feedback. DISCUSSION Progress testing is now well established in pre-registration medical programmes globally. The advantages of progress testing and its use for "frequent look and rapid remediation" appear to be undisputed. The key disadvantages identified were that progress testing is an administratively heavy assessment process and that this type of assessment carries a perceived bias towards male students. CONCLUSION Now that this assessment practice is established within medicine, it seems reasonable to explore its use in other areas of healthcare, such as radiography.
3. Dion V, St-Onge C, Bartman I, Touchie C, Pugh D. Written-Based Progress Testing: A Scoping Review. Acad Med 2022;97:747-757. PMID: 34753858. DOI: 10.1097/ACM.0000000000004507.
Abstract
PURPOSE Progress testing is an increasingly popular form of assessment in which a comprehensive test is administered to learners repeatedly over time. To inform potential users, this scoping review aimed to document barriers, facilitators, and potential outcomes of the use of written progress tests in higher education. METHOD The authors followed Arksey and O'Malley's scoping review methodology to identify and summarize the literature on progress testing. They searched 6 databases (Academic Search Complete, CINAHL, ERIC, Education Source, MEDLINE, and PsycINFO) on 2 occasions (May 22, 2018, and April 21, 2020) and included articles written in English or French and pertaining to written progress tests in higher education. Two authors screened articles for the inclusion criteria (90% agreement), then data extraction was performed by pairs of authors. Using a snowball approach, the authors also screened additional articles identified from the included reference lists. They completed a thematic analysis through an iterative process. RESULTS A total of 104 articles were included. The majority of progress tests used a multiple-choice and/or true-or-false question format (95, 91.3%) and were administered 4 times a year (38, 36.5%). The most documented source of validity evidence was internal consistency (38, 36.5%). Four major themes were identified: (1) barriers and challenges to the implementation of progress testing (e.g., need for additional resources); (2) established collaboration as a facilitator of progress testing implementation; (3) factors that increase the acceptance of progress testing (e.g., formative use); and (4) outcomes and consequences of progress test use (e.g., progress testing contributes to an increase in knowledge). CONCLUSIONS Progress testing appears to have a positive impact on learning, and there is significant validity evidence to support its use. 
Although progress testing is resource- and time-intensive, strategies such as collaboration with other institutions may facilitate its use.
Affiliation(s)
- Vincent Dion: undergraduate medical education student, Faculty of Medicine and Health Sciences, Université de Sherbrooke, Sherbrooke, Québec, Canada; research assistant to the Paul Grand'Maison de la Société des médecins de l'Université de Sherbrooke research chair in medical education, Sherbrooke, Québec, Canada, at the time this work was completed
- Christina St-Onge: professor, Department of Medicine, Faculty of Medicine and Health Sciences, Université de Sherbrooke, and the Paul Grand'Maison de la Société des médecins de l'Université de Sherbrooke research chair in medical education, Sherbrooke, Québec, Canada; ORCID: https://orcid.org/0000-0001-5313-0456
- Ilona Bartman: medical education research associate, Medical Council of Canada, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0002-2056-479X
- Claire Touchie: professor of medicine, University of Ottawa, Ottawa, Ontario, Canada; chief medical education officer, Medical Council of Canada, Ottawa, Ontario, Canada, at the time this work was completed; ORCID: https://orcid.org/0000-0001-7926-9720
- Debra Pugh: medical education advisor, Medical Council of Canada, and associate professor, Department of Medicine, University of Ottawa and The Ottawa Hospital, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0003-4076-9669
4. Iyeyasu JN, Cecilio-Fernandes D, de Carvalho KM. Longitudinal evaluation of the Ophthalmology residents in Brazil: an observational prospective study. Sao Paulo Med J 2022;141:e202292. PMID: 36197351. PMCID: PMC10065116. DOI: 10.1590/1516-3180.2022.0092.r1.01072022.
Abstract
BACKGROUND The longitudinal evaluation of students seems to be a better way to assess their knowledge than traditional methods of evaluation, such as modular and final tests. Currently, progress testing is the most consolidated type of longitudinal testing method. However, despite being well consolidated as an assessment tool in medical education, this type of test is rarely used in residency programs. OBJECTIVES This study aimed to investigate residents' knowledge growth over residency training and to describe the implementation of a longitudinal evaluation test in ophthalmological residency training across several medical schools in Brazil. Finally, the study aimed to check whether performance in the tests can be used as a predictor of the results of the specialist title test. DESIGN AND SETTING This was a prospective observational study conducted using an online platform. METHODS Online tests were developed following the same pattern as the Brazilian Ophthalmology Council specialist tests. All the residents took the test simultaneously, once a year at the end of the school year. RESULTS A progress test was conducted across 13 services with 259 residents. Our results demonstrated that resident scores improved over the years (P < 0.0001) and had a moderate correlation with the Brazilian Ophthalmology Council specialist test (P = 0.0156). CONCLUSION The progress test can be considered a valuable tool for assessing knowledge, which increased over residency training. In addition, it can be used as a predictor of the result in the specialist title test.
Affiliation(s)
- Josie Naomi Iyeyasu: MD. Ophthalmologist and Assistant Doctor, Low Vision and Strabismus Sector, Hospital de Clínicas, Faculdade de Ciências Médicas da Universidade Estadual de Campinas (HC/FCM/UNICAMP), Campinas (SP), Brazil
- Dario Cecilio-Fernandes: Psy, MSc, PhD. Psychologist and Researcher, Department of Psychology and Psychiatry, Faculdade de Ciências Médicas da Universidade Estadual de Campinas (FCM/UNICAMP), Campinas (SP), Brazil
- Keila Monteiro de Carvalho: MD. Ophthalmologist and Full Professor, Faculty of Medical Sciences, Universidade Estadual de Campinas (UNICAMP); and Chief, Department of Ophthalmo-Otorhinolaryngology, Faculdade de Ciências Médicas da Universidade Estadual de Campinas (FCM/UNICAMP), Campinas (SP), Brazil
5. Sosna J, Pyatigorskaya N, Krestin G, Denton E, Stanislav K, Morozov S, Kumamaru KK, Jankharia B, Mildenberger P, Forster B, Schouman-Clayes E, Bradey A, Akata D, Brkljacic B, Grassi R, Plako A, Papanagiotou H, Maksimović R, Lexa F. International survey on residency programs in radiology: similarities and differences among 17 countries. Clin Imaging 2021;79:230-234. PMID: 34119915. DOI: 10.1016/j.clinimag.2021.05.011.
Abstract
OBJECTIVE With the initiative of the ACR International Economics Committee, a multinational survey was conducted to evaluate radiology residency programs around the world. METHODS A 31-question survey was developed covering economic issues, program size and length, residents' activities during daytime and call, and academic aspects including syllabus and examinations. Data were tabulated using the aforementioned thematic framework and were qualitatively analyzed. RESULTS Responses were received from all 17 countries invited to participate (France, Netherlands, Israel, UK, Russia, USA, Japan, India, Germany, Canada, Turkey, Croatia, Serbia, Italy, Ireland, Hungary, and Greece). Residency length varied between 2 and 5 years. The certificate of residency completion is provided by a local hospital (4/17, 23%), a university (6/17, 36%), a national board (6/17, 36%), or the Ministry of Health (1/17, 6%). The number of residency programs varied from 15 to 300 per nation, with 1-700 residents per program. Salaries varied significantly, ranging from the equivalent of 8,000 to 75,000 USD. Exams are an integral part of training in all surveyed countries. Length of call varied between 5 and 26 h, and the number of monthly calls ranged from 3 to 6. The future of radiology was judged as growing in 12/17 (70%) countries and stagnant in 5/17 (30%) countries. DISCUSSION Radiology residency programs worldwide have many similarities; the differences lie in the structure of the programs. Stagnation and uncertainties need to be addressed to ensure the continued development of the next generation of radiologists. SUMMARY STATEMENT There are many similarities in the academic aims and approach to education and training of radiology residency programs worldwide. The differences are in the structure of the residency programs and payments to individual residents.
Affiliation(s)
- Jacob Sosna: Department of Radiology, Hadassah Medical Center, Hebrew University Faculty of Medicine, Jerusalem, Israel
- Nadya Pyatigorskaya: Department of Radiology, Pitié-Salpêtrière, Sorbonne Universités, UPMC Univ Paris 6, Paris, France
- Gabriel Krestin: Department of Radiology, Erasmus Medical Center, Rotterdam, the Netherlands
- Erika Denton: Department of Radiology, Norfolk & Norwich University Hospital, UK
- Kim Stanislav: Radiology Research and Practical Centre, Moscow, Russia
- Peter Mildenberger: Department of Radiology, Universitätsmedizin Mainz Klinik und Poliklinik Radiologie, Mainz, Germany
- Bruce Forster: Department of Radiology, University of British Columbia, Faculty of Medicine, Vancouver, Canada
- Adrian Bradey: Department of Radiology, Mercy University Hospital, Cork, Ireland
- Deniz Akata: Department of Radiology, Hacettepe University, Ankara, Turkey
- Boris Brkljacic: Department of Radiology, Dubrava Hospital, University of Zagreb, Croatia
- Roberto Grassi: Department of Radiology, Università della Campania Luigi Vanvitelli, Naples, Italy
- Andras Plako: Department of Radiology, University of Szeged, Hungary
- Frank Lexa: The Radiology Leadership Institute and Chair of the Commission on Leadership and Practice Development, American College of Radiology, Reston, VA, United States of America
6. van Montfort D, Kok E, Vincken K, van der Schaaf M, van der Gijp A, Ravesloot C, Rutgers D. Expertise development in volumetric image interpretation of radiology residents: what do longitudinal scroll data reveal? Adv Health Sci Educ Theory Pract 2021;26:437-466. PMID: 33030627. PMCID: PMC8041671. DOI: 10.1007/s10459-020-09995-6.
Abstract
The current study used theories on expertise development (the holistic model of image perception and the information reduction hypothesis) as a starting point to identify and explore potentially relevant process measures to monitor and evaluate expertise development in radiology residency training. It is the first to examine expertise development in volumetric image interpretation (i.e., CT scans) within radiology residents using scroll data collected longitudinally over five years of residency training. Consistent with the holistic model of image perception, the percentage of time spent on full runs, i.e. scrolling through more than 50% of the CT-scan slices (global search), decreased within residents over residency training years. Furthermore, the percentage of time spent on question-relevant areas in the CT scans increased within residents over residency training years, consistent with the information reduction hypothesis. Second, we examined if scroll patterns can predict diagnostic accuracy. The percentage of time spent on full runs and the percentage of time spent on question-relevant areas did not predict diagnostic accuracy. Thus, although scroll patterns over training years are consistent with visual expertise theories, they could not be used as predictors of diagnostic accuracy in the current study. Therefore, the relation between scroll patterns and performance needs to be further examined, before process measures can be used to monitor and evaluate expertise development in radiology residency training.
Affiliation(s)
- Dorien van Montfort: Department of Education, Utrecht University, Heidelberglaan 1, 3584 CS Utrecht, The Netherlands
- Ellen Kok: Department of Education, Utrecht University, Heidelberglaan 1, 3584 CS Utrecht, The Netherlands
- Koen Vincken: Image Sciences Institute, Imaging Department, University Medical Center, Utrecht, The Netherlands
- Marieke van der Schaaf: Department of Education, Utrecht University, Heidelberglaan 1, 3584 CS Utrecht, The Netherlands; Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands
- Anouk van der Gijp: Department of Radiology, University Medical Center Utrecht, Utrecht, The Netherlands
- Cécile Ravesloot: Department of Radiology, University Medical Center Utrecht, Utrecht, The Netherlands
- Dirk Rutgers: Department of Radiology, University Medical Center Utrecht, Utrecht, The Netherlands
7. Rutgers D, van der Gijp A, Vincken K, Mol C, van der Schaaf M, ten Cate O. Heat Map Analysis in Radiological Image Interpretation: An Exploration of Its Usefulness for Feedback About Image Interpretation Skills in Learners. Acad Radiol 2021;28:414-423. PMID: 31926860. DOI: 10.1016/j.acra.2019.11.017.
8. Alkhalaf ZSA, Yakar D, de Groot JC, Dierckx RAJO, Kwee TC. Medical knowledge and clinical productivity: independently correlated metrics during radiology residency. Eur Radiol 2021;31:5344-5350. PMID: 33449176. PMCID: PMC8213654. DOI: 10.1007/s00330-020-07646-3.
Abstract
Objective To determine the association between medical knowledge relevant to radiology practice (as measured by the Dutch radiology progress test [DRPT]) and clinical productivity during radiology residency. Methods This study analyzed the results of 6 DRPTs and time period–matched clinical production points of radiology residents affiliated to a tertiary care academic medical center between 2013 and 2016. The Spearman correlation analysis was performed to determine the association between DRPT percentile scores and average daily clinical production points. Linear regression analyses were performed to determine the association of DRPT percentile scores with average daily clinical production points, adjusted for age and gender of the radiology resident, and postgraduate year. Results Eighty-four DRPTs with time period–matched clinical production points were included. These 84 DRPTs were made by 29 radiology residents (18 males and 11 females) with a median age of 31 years (range: 26–38 years). The Spearman correlation coefficient between DRPT percentile scores and average daily clinical production points was 0.550 (95% confidence interval: 0.381–0.694) (p < 0.001), indicating a significant moderate positive association. On multivariate analysis, average daily clinical production points (β coefficient of 0.035, p = 0.003), female gender of the radiology resident (β coefficient of 12.690, p = 0.001), and postgraduate year (β coefficient of 10.179, p < 0.001) were significantly associated with DRPT percentile scores. These three independent variables achieved an adjusted R2 of 0.527. Conclusion Clinical productivity is independently associated with medical knowledge relevant to radiology practice during radiology residency. These findings indicate that clinical productivity of a resident could be a potentially relevant metric in a radiology training program. 
Key Points
- There is a significant moderate correlation between medical knowledge relevant to radiology practice and clinical productivity during radiology residency.
- Medical knowledge relevant to radiology practice remains independently associated with clinical productivity during radiology residency after adjustment for postgraduate year and gender.
- Clinical productivity of a resident may be regarded as a potentially relevant metric in a radiology training program.
Affiliation(s)
- Zahraa S A Alkhalaf, Derya Yakar, Jan Cees de Groot, Rudi A J O Dierckx, and Thomas C Kwee: Department of Radiology, Nuclear Medicine and Molecular Imaging, Medical Imaging Center, University Medical Center Groningen, University of Groningen, P.O. Box 30.001, 9700 RB Groningen, The Netherlands
9. Castro D, Yang J, Greer ML, Kwan B, Sauerbrei E, Hopman W, Soboleski D. Competency Based Medical Education-Towards the Development of a Standardized Pediatric Radiology Testing Module. Acad Radiol 2020;27:1622-1632. DOI: 10.1016/j.acra.2019.12.005.
Abstract
RATIONALE AND OBJECTIVES To introduce a process for developing standardized competency based testing modules (CBTM) for evaluating resident progress and competence during radiology training. This work focuses on the development of pediatric imaging CBTMs to be utilized during general radiology residency. MATERIALS AND METHODS Multiple in-patient and ER imaging request audits, along with surveys of training programs and text recommendations, were obtained. A list of 200 total diagnoses accumulated by review was distributed into one of four CBTM folders. Imaging cases that made up ≥90% of the indications in the audits were added to Folder 1. Distribution of the remaining imaging diagnoses was based on consensus by three subspecialists. A pilot study was performed with residents dictating selected imaging cases in their usual manner, mimicking a typical rotation. RESULTS In the pilot study, mean resident grading scores were significantly associated with both the American College of Radiology (ACR) rank (rho = 0.636, p = 0.035) and the objective structured clinical examination (OSCE) scores (rho = 0.694, p = 0.018). The mean scores were positively associated with the ACR score (rho = 0.466) but fell short of statistical significance (p = 0.149). As expected, the ACR score, ACR rank and OSCE scores all significantly correlated with each other (p < 0.01). PGY also significantly correlated with the ACR score (rho = 0.683, p = 0.021) and the OSCE (rho = 0.767, p = 0.006) but not with the ACR rank (rho = 0.408, p = 0.213). CONCLUSION The process used to develop a standardized CBTM can serve as a simulation tool to assess radiology resident competence during training. The format allows for assessment of resident reasoning skills and knowledge base, which provides documentation of progression throughout residency.
10. Herrmann L, Beitz-Radzio C, Bernigau D, Birk S, Ehlers JP, Pfeiffer-Morhenn B, Preusche I, Tipold A, Schaper E. Status Quo of Progress Testing in Veterinary Medical Education and Lessons Learned. Front Vet Sci 2020;7:559. PMID: 32974407. PMCID: PMC7472598. DOI: 10.3389/fvets.2020.00559.
Abstract
Progress testing is an assessment tool, well known in medical education, for longitudinal measurement of growth in knowledge of a specific group, e.g., students. This article gives an overview of progress testing in veterinary education with a focus on the progress test of the German-speaking countries. The "progress test veterinary medicine" (PTT) was developed in 2013 as part of a project by the Competence Centre for E-Learning, Didactics and Educational Research in Veterinary Medicine, a project cooperation of all German-speaking institutes for veterinary medicine in Germany, Austria, and Switzerland. After the end of the project, the PTT was continued at six locations: each of the five German schools for veterinary medicine and, additionally, in Austria. Further changes to the PTT platform and the analysis were carried out to optimize the PTT so that the test could continue to be offered from 2017 to 2019. The PTT is an interdisciplinary, formative electronic online test. It is taken annually and is composed of 136 multiple-choice single-best-answer questions; in addition, a "don't know" option is given. The content of the PTT refers to the day 1 competencies described by the European Association of Establishments for Veterinary Education. The platform Q-Exam® Institutions (IQuL GmbH, Bergisch Gladbach, Germany) is used for creating and administrating the PTT questions, the review processes, and the organization of the online question database. After the test is compiled by means of a blueprint, the PTT file is made available at every location. After the last PTT in 2018, a link to an evaluation was sent to the students from four of these six partner universities. The 450 analyzed questionnaires showed that students mainly use the PTT to compare their individual results with those of fellow students in the respective semester. To conclude our study, a checklist with our main findings for implementing progress testing was created.
Affiliation(s)
- Lisa Herrmann: Centre for E-Learning, Didactics and Educational Research (ZELDA), University of Veterinary Medicine Hannover, Foundation, Hanover, Germany
- Dora Bernigau: Faculty of Veterinary Medicine, Leipzig University, Leipzig, Germany
- Stephan Birk: Faculty of Veterinary Medicine, Freie Universität Berlin, Berlin, Germany
- Jan P Ehlers: Didactics and Educational Research in Health Science, Faculty of Health, Witten/Herdecke University, Witten, Germany
- Ingrid Preusche: Centre for Study Affairs, University of Veterinary Medicine, Vienna (Vetmeduni Vienna), Vienna, Austria
- Andrea Tipold: Small Animal Clinic, Neurology, University of Veterinary Medicine Hannover, Foundation, Hanover, Germany
- Elisabeth Schaper: Centre for E-Learning, Didactics and Educational Research (ZELDA), University of Veterinary Medicine Hannover, Foundation, Hanover, Germany
11. Rutgers DR, van Schaik JPJ, Kruitwagen CLJJ, Haaring C, van Lankeren W, van Raamt AF, ten Cate O. Introducing Summative Progress Testing in Radiology Residency: Little Change in Residents' Test Results After Transitioning from Formative Progress Testing. Med Sci Educ 2020;30:943-953. PMID: 34457753. PMCID: PMC8368876. DOI: 10.1007/s40670-020-00977-2.
Abstract
INTRODUCTION Educational effects of transitioning from formative to summative progress testing are unclear. Our purpose was to investigate whether such transitioning in radiology residency is associated with a change in progress test results. METHODS We investigated a national cohort of radiology residents (N > 300) who were semi-annually assessed through a mandatory progress test. Until 2014, this test was purely formative for all residents, but in 2014/2015, it was transitioned (as part of a national radiology residency program revision) to include a summative pass requirement for new residents. In 7 posttransitioning tests in 2015-2019, including summatively and formatively tested residents who followed the revised and pre-transitioning residency program, respectively, we assessed residents' relative test scores and percentage of residents that reached pass standards. RESULTS Due to our educational setting, most posttransitioning tests had no residents in the summative condition in postgraduate year 4-5, nor residents in the formative condition in year 0.5-2. Across the 7 tests, relative test scores in postgraduate year 1-3 of the summative resident group and year 3.5-4.5 of the formative group differed significantly (p < 0.01 and p < 0.05, respectively, Kruskal-Wallis test). However, scores fluctuated without consistent time trends and without consistent differences between both resident groups. Percentage of residents reaching the pass standard did not differ significantly across tests or between groups. DISCUSSION Transitioning from formative to summative progress testing was associated with overall steady test results of the whole resident group in 4 post-transitioning years. We do not exclude that transitioning may have positive educational effects for resident subgroups.
Affiliation(s)
- D. R. Rutgers
- Department of Radiology, University Medical Center, Utrecht University, Heidelberglaan 100, 3584 CX Utrecht, The Netherlands
- Examination Committee of the Radiological Society of the Netherlands, Utrecht, The Netherlands
- J. P. J. van Schaik
- Department of Radiology, University Medical Center, Utrecht University, Heidelberglaan 100, 3584 CX Utrecht, The Netherlands
- C. L. J. J. Kruitwagen
- Julius Center, Department of Biostatistics, University Medical Center, Utrecht University, Utrecht, The Netherlands
- C. Haaring
- Department of Radiology, University Medical Center, Utrecht University, Heidelberglaan 100, 3584 CX Utrecht, The Netherlands
- W. van Lankeren
- Department of Radiology, Erasmus MC, Rotterdam, The Netherlands
- Radiological Society of the Netherlands, Utrecht, The Netherlands
- A. F. van Raamt
- Examination Committee of the Radiological Society of the Netherlands, Utrecht, The Netherlands
- Department of Radiology, Gelre Hospital, Apeldoorn, The Netherlands
- O. ten Cate
- Center for Research and Development of Education, University Medical Center, Utrecht University, Utrecht, The Netherlands
|
12
|
Dumas D, McNeish D, Schreiber-Gregory D, Durning SJ, Torre DM. Dynamic Measurement in Health Professions Education: Rationale, Application, and Possibilities. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2019; 94:1323-1328. [PMID: 31460924 DOI: 10.1097/acm.0000000000002729] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Dynamic measurement modeling (DMM) is a psychometric paradigm that uses longitudinal data to estimate individual students' growth in measured skills over the course of an educational program (i.e., growth scores). DMM represents a more formal way of assessing learning progress across the health professions education continuum. In this article, the authors provide justification for this approach in health professions education and demonstrate its proof-of-concept use with three time points of United States Medical Licensing Examination Step exams to generate growth scores for 454 current and recent medical learners. The authors demonstrate that learners vary substantially in their growth scores and that those growth scores exhibit psychometric reliability. In addition, growth scores correlated significantly and positively with indicators of medical learner readiness (e.g., undergraduate grade point average and Medical College Admission Test scores). Growth scores also correlated significantly and positively with future ratings of clinical competencies during internship (e.g., patient care, interpersonal skills), as assessed through a survey sent to program directors at the end of the first postgraduate year. These preliminary findings of reliability and validity for DMM growth scores provide initial evidence for further investigation into the suitability of a dynamic measurement paradigm in health professions education.
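The growth-score idea in the abstract above can be illustrated with a minimal sketch: treat each learner's exam scores at three time points as a short trajectory and summarize growth as the least-squares slope. This is only a toy approximation of the concept, not the authors' DMM implementation, and all scores below are hypothetical.

```python
# Toy illustration of a per-learner "growth score": the ordinary
# least-squares slope of exam scores across three time points.
# Hypothetical data; NOT the authors' dynamic measurement model.

def growth_score(scores, times=(0, 1, 2)):
    """Least-squares slope of scores over the given time points."""
    n = len(scores)
    mean_t = sum(times) / n
    mean_s = sum(scores) / n
    num = sum((t - mean_t) * (s - mean_s) for t, s in zip(times, scores))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

# Two hypothetical learners with similar final scores but different growth.
steady = [220, 230, 240]        # gains 10 points per time step
late_bloomer = [200, 215, 245]  # accelerates late in training

print(growth_score(steady))        # → 10.0
print(growth_score(late_bloomer))  # → 22.5
```

The point of the sketch is that two learners can end at nearly the same score yet differ substantially in growth, which is the kind of individual variation the DMM growth scores are designed to capture.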
Affiliation(s)
- Denis Dumas
- D. Dumas is assistant professor of research methods and information science, University of Denver, Denver, Colorado. D. McNeish is assistant professor of quantitative psychology, Arizona State University, Phoenix, Arizona. D. Schreiber-Gregory is data analyst, Uniformed Services University of the Health Sciences, Bethesda, Maryland. S.J. Durning is professor of medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland. D.M. Torre is associate professor of medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland.
|
13
|
Rutgers DR, van Raamt F, ten Cate TJ. Development of competence in volumetric image interpretation in radiology residents. BMC MEDICAL EDUCATION 2019; 19:122. [PMID: 31046749 PMCID: PMC6498553 DOI: 10.1186/s12909-019-1549-3] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 01/03/2019] [Accepted: 04/08/2019] [Indexed: 06/09/2023]
Abstract
BACKGROUND During residency, radiology residents learn to interpret volumetric radiological images. The development of their competence for volumetric image interpretation, as opposed to 2D image reading, is not completely understood. The purpose of the present study was to investigate how competence for volumetric image interpretation develops in radiology residents and how this compares with competence development for 2D image interpretation, by studying resident scores on image-based items in digital radiology tests. METHODS We reviewed resident scores on volumetric and 2D image-based test items in 9 consecutive semi-annual digital radiology tests carried out from November 2013 to April 2018. We assessed percentage-correct sum scores for all test items about volumetric images and for all test items about 2D images in each test, as well as for all residents across the 9 tests (i.e., 4.5 years of test materials). We used a paired t-test to analyze whether scores differed between volumetric and 2D image-based test items in individual residents in postgraduate year (PGY) 0-5, subdivided into 10 half-year phases (PGY 0-0.5, 0.5-1.0, 1.0-1.5, et cetera). RESULTS The percentage-correct scores on volumetric and 2D image-based items showed a comparable trend of development, increasing in the first half of residency and flattening off in the second half. Chance-corrected scores were generally lower on volumetric than on 2D items (on average 1-5% points). In PGY 1.5-4.5, this score difference was statistically significant (p-values ranging from 0.02 to < 0.001), with the largest difference found in PGY 2.5 (mean: 5% points; 95% CI: −7.3 to −3.4). At the end of training in PGY 5, there was no statistically significant score difference between the two item types. CONCLUSIONS The development of competence in volumetric image interpretation fits a similar curvilinear growth curve during radiology residency as 2D image interpretation competence in digital radiology tests. Although residents scored significantly lower on volumetric than on 2D items in PGY 1.5-4.5, we consider the magnitude of this difference to be relatively small for our educational setting, and we suggest that throughout radiology training there are no relevant differences in the development of these two types of competence, as measured by digital radiology tests.
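The paired t-test used in the study above compares each resident's own score on one item type against their score on the other, so between-resident variation cancels out. A minimal stdlib-only sketch with hypothetical percentage-correct scores (not the study's data):

```python
# Paired t-test sketch: each resident contributes one volumetric score and
# one 2D score; the test is on the within-resident differences.
# Hypothetical data for six residents; NOT the study's dataset.
import math
import statistics

def paired_t(x, y):
    """Paired t statistic for two equal-length score lists (df = n - 1)."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample SD of the differences
    return mean_d / (sd_d / math.sqrt(n))   # t statistic

volumetric = [62, 58, 70, 65, 59, 68]  # % correct on volumetric items
two_d      = [66, 61, 74, 70, 63, 71]  # % correct on 2D items

print(round(paired_t(volumetric, two_d), 2))
```

Because every resident here scores a few points lower on the volumetric items, the differences are small but consistent, which is exactly the situation where a paired design detects an effect that an unpaired comparison of group means might miss.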
Affiliation(s)
- D. R. Rutgers
- Department of Radiology, University Medical Center, Heidelberglaan 100, 3584 CX Utrecht, The Netherlands
- Radiological Society of the Netherlands, Mercatorlaan 1200, 3528 BL Utrecht, The Netherlands
- F. van Raamt
- Department of Radiology, Gelre Hospitals, Albert Schweitzerlaan 31, 7334 DZ Apeldoorn, The Netherlands
- Radiological Society of the Netherlands, Mercatorlaan 1200, 3528 BL Utrecht, The Netherlands
- Th. J. ten Cate
- Center for Research and Development of Education, University Medical Center, P.O. Box # 85500, 3508 GA Utrecht, The Netherlands
|
14
|
Rutgers DR, van Schaik JPJ, van Lankeren W, van Raamt F, ten Cate TJ. Resident and Faculty Attitudes Toward the Dutch Radiology Progress Test as It Transitions from a Formative to a Summative Measure of Licensure Eligibility. MEDICAL SCIENCE EDUCATOR 2018; 28:639-647. [PMID: 30931160 PMCID: PMC6404798 DOI: 10.1007/s40670-018-0605-7] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
BACKGROUND Progress testing, a regularly administered comprehensive test of a complete knowledge domain, usually serves to provide learners with feedback and is formative in nature. OBJECTIVE Our study aimed to investigate the acceptability of introducing a summative component into the postgraduate Dutch Radiology Progress Test (DRPT) among residents and program directors in a competency-based training program. METHODS A 15-item questionnaire, with 3 items on the acceptability of summative postgraduate knowledge testing, 7 on the acceptability of the summative DRPT regulations, 4 on self-reported educational effects, and 1 open comment item, was distributed nationally among 349 residents and 81 radiology program directors. RESULTS The questionnaire was filled out by 330 residents (95%) and 48 program directors (59%). Summative postgraduate knowledge testing was regarded as acceptable by both groups, but more so by program directors than by residents. The transition toward summative assessment in the DRPT was received neutrally to slightly positively by residents, while program directors regarded it as an improvement and estimated the summative criteria to be lighter and less stressful than residents did. The residents' self-reported educational effects of summative assessment in the DRPT were limited, whereas program directors expected a greater end-of-training knowledge improvement than residents. CONCLUSIONS Both residents and program directors support summative postgraduate knowledge testing, although it is more accepted by program directors. Residents receive summative radiological progress testing neutrally to slightly positively, while program directors generally value it more positively than residents. Directors should be aware of these different perspectives when introducing or developing summative progress testing in residency programs.
Affiliation(s)
- D. R. Rutgers
- Department of Radiology, University Medical Center, Utrecht, The Netherlands
- J. P. J. van Schaik
- Department of Radiology, University Medical Center, Utrecht, The Netherlands
- W. van Lankeren
- Department of Radiology, Erasmus University, Rotterdam, The Netherlands
- F. van Raamt
- Department of Radiology, Gelre Hospital, Apeldoorn, The Netherlands
- Th. J. ten Cate
- Center for Research and Development of Education, University Medical Center, Utrecht, The Netherlands
|