1. Alavarce DC, de Medeiros ML, de Araújo Viana D, Abade F, Vieira JE, Machado JLM, Collares CF. The progress test as a structuring initiative for programmatic assessment. BMC Medical Education 2024; 24:555. PMID: 38773470; PMCID: PMC11110289; DOI: 10.1186/s12909-024-05537-5.
Abstract
BACKGROUND The Progress Test is an individual assessment applied to all students at the same time and on a regular basis. The test was introduced into the undergraduate medical education of a conglomerate of schools to structure a programmatic assessment integrated into teaching. This paper presents the results of four serial applications of the Progress Test and the method used to give students feedback. METHODS The assessment comprises 120 items administered online via a personal password. Items are authored by faculty, peer-reviewed, and approved by a committee of experts. They are classified by five major areas, by the topics used by the National Board of Medical Examiners, and by medical specialties relevant to the national Unified Health System. Scoring uses Item Response Theory with a Rasch model, which accounts for item difficulty. RESULTS Student participation, relative to enrollment, increased across the four editions of the test. Median performance increased in comparisons between sequential years for all tests except test 1, the first offered to the schools. Between subsequent years of education (2nd vs. 1st, 4th vs. 3rd, and 5th vs. 4th), median scores increased from progress tests 2 through 4. The final undergraduate year showed only a limited increase over the 5th year. The median rises consistently, albeit with fluctuations between the observed intervals. CONCLUSION The Progress Test established regular feedback among students, teachers, and coordinators, and paved the road to the engagement needed to construct an institutional programmatic assessment.
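The Rasch scoring described in the methods can be illustrated with a short sketch. The item difficulties and the Newton-Raphson ability search below are illustrative assumptions, not the authors' implementation:

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability that an examinee of
    ability theta answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties, iterations=50):
    """Maximum-likelihood ability estimate via Newton-Raphson.
    Assumes a mixed response pattern (all-correct or all-wrong
    patterns have no finite maximum-likelihood estimate)."""
    theta = 0.0
    for _ in range(iterations):
        probs = [rasch_p(theta, b) for b in difficulties]
        gradient = sum(x - p for x, p in zip(responses, probs))
        information = sum(p * (1.0 - p) for p in probs)
        theta += gradient / information
    return theta

# Hypothetical 5-item test with difficulties in logits.
difficulties = [-1.5, -0.5, 0.0, 0.5, 1.5]
theta = estimate_ability([1, 1, 1, 0, 0], difficulties)
```

At the estimated ability, the expected number of correct responses equals the observed raw score, which is the defining property of the Rasch maximum-likelihood estimate.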

2. Appelhaus S, Werner S, Grosse P, Kämmer JE. Feedback, fairness, and validity: effects of disclosing and reusing multiple-choice questions in medical schools. Medical Education Online 2023; 28:2143298. PMID: 36350605; PMCID: PMC9662023; DOI: 10.1080/10872981.2022.2143298.
Abstract
BACKGROUND Disclosure of items used in multiple-choice-question (MCQ) exams may decrease student anxiety and improve transparency, feedback, and test-enhanced learning, but potentially compromises the reliability and fairness of exams if items are eventually reused. Evidence regarding whether disclosure and reuse of test items change item psychometrics is scarce and inconclusive. METHODS We retrospectively analysed difficulty and discrimination coefficients of 10,148 MCQ items used between fall 2017 and fall 2019 in a large European medical school in which items were disclosed from fall 2017 onwards. We categorised items as 'new'; 'reused, not disclosed'; or 'reused, disclosed'. For reused items, we calculated the difference from their first ever use, that is, when they were new. Differences between categories and terms were analysed with one-way analyses of variance and independent-samples t tests. RESULTS The proportion of reused, disclosed items grew from 0% to 48.4%; mean difficulty coefficients increased from 0.70 to 0.76, that is, items became easier, P < .001, ηp² = 0.011. On average, reused, disclosed items were significantly easier (M = 0.83) than reused, not disclosed items (M = 0.71) and entirely new items (M = 0.66), P < .001, ηp² = 0.087. Mean discrimination coefficients increased from 0.21 to 0.23, that is, items became slightly more discriminating, P = .002, ηp² = 0.002. CONCLUSIONS Disclosing test items provides an opportunity to enhance feedback and transparency in MCQ exams, but potentially at the expense of decreased item reliability; discrimination was positively affected. Our study may help weigh the advantages and disadvantages of reusing previously disclosed items.
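The difficulty and discrimination coefficients analysed above are standard classical-test-theory quantities. A minimal sketch, assuming difficulty is the proportion correct and discrimination is the corrected item-total correlation (the abstract does not state the exact formulas used):

```python
def difficulty(item_scores):
    """Difficulty coefficient: proportion of examinees answering
    correctly (higher values mean an easier item)."""
    return sum(item_scores) / len(item_scores)

def discrimination(item_scores, total_scores):
    """Corrected item-total correlation: the item's own score is
    removed from each examinee's total before correlating."""
    rest = [t - x for x, t in zip(item_scores, total_scores)]
    n = len(item_scores)
    mean_x, mean_r = sum(item_scores) / n, sum(rest) / n
    cov = sum((x - mean_x) * (r - mean_r)
              for x, r in zip(item_scores, rest))
    var_x = sum((x - mean_x) ** 2 for x in item_scores)
    var_r = sum((r - mean_r) ** 2 for r in rest)
    return cov / (var_x * var_r) ** 0.5

# Hypothetical data: 6 examinees, one item, totals on a 10-item exam.
item = [1, 1, 1, 0, 1, 0]
totals = [9, 8, 7, 4, 6, 3]
```

On this made-up data the item is moderately easy (difficulty ≈ 0.67) and discriminates well, because the examinees who answered it correctly also scored higher on the rest of the exam.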
Affiliation(s)
- Stefan Appelhaus
  - Institute of Medical Sociology and Rehabilitation Science, Charité—Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany
  - Department of Radiology and Nuclear Medicine, Universitätsmedizin Mannheim, Heidelberg University, Mannheim, Germany
- Susanne Werner
  - Assessment Unit, Charité—Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany
- Pascal Grosse
  - Dean of Students Office and Department of Neurology, Charité—Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany
- Juliane E. Kämmer
  - Department of Emergency Medicine, University of Bern, Bern, Switzerland

3. Schaper E, van Haeften T, Wandall J, Iivanainen A, Penell J, Press CM, Lekeux P, Holm P. Development of a shared item repository for progress testing in veterinary education. Front Vet Sci 2023; 10:1296514. PMID: 38026654; PMCID: PMC10652386; DOI: 10.3389/fvets.2023.1296514. (Open Access)
Abstract
Introduction Progress testing in education is an assessment principle for measuring students' progress over time, e.g., from start to graduation. Progress testing offers a valid longitudinal formative measurement of the growth in each student's cognitive skills within the subjects of the test, as well as a tool for educators to monitor potential educational gaps and mismatches within the curriculum in relation to the basic veterinary learning outcomes. Methods Six veterinary educational establishments in Denmark, Finland, Germany (Hannover), the Netherlands, Norway, and Sweden established, in cooperation with the European Association of Establishments for Veterinary Education (EAEVE), a common veterinary item repository that can be used for progress testing, both linear and computer-adaptive, in European Veterinary Education Establishments (VEEs), covering the EAEVE veterinary subjects and theoretical "Day One Competencies." First, a blueprint was created, suitable item formats were identified, and a quality assurance process for reviewing and approving items was established. The items were trialled to create a database of validated and calibrated items, and the responses were subsequently analysed psychometrically according to Modern Test Theory. Results In total, 1,836 items were submitted, of which 1,342 were approved by the reviewers for trial testing. 1,119 students from all study years and all partner VEEs participated in one or more of the six item trials, and 1,948 responses were collected. Responses were analysed using Rasch modelling (analysis of item fit, differential item functioning, and item-response characteristics). A total of 821 calibrated items of various difficulty levels, matching the veterinary students' abilities and covering the veterinary knowledge domains, have been banked. Discussion The item bank is now ready to be used for formative progress testing in European veterinary education. This paper presents and discusses possible pitfalls, problems, and solutions when establishing an international veterinary progress test.
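The item-fit analysis mentioned in the results is commonly based on infit and outfit mean-square statistics. The sketch below uses the standard dichotomous formulas as an illustration, not the project's actual analysis pipeline:

```python
def fit_mean_squares(observed, expected):
    """Infit and outfit mean-squares for one item.

    observed: 0/1 responses from a set of examinees
    expected: calibrated Rasch model probabilities for the same examinees
    Values near 1 indicate good fit; values well above 1 indicate misfit."""
    variances = [p * (1.0 - p) for p in expected]
    squared_residuals = [(x - p) ** 2 for x, p in zip(observed, expected)]
    # Outfit: unweighted mean of standardized squared residuals.
    outfit = sum(r / v for r, v in zip(squared_residuals, variances)) / len(observed)
    # Infit: information-weighted, so extreme examinees count less.
    infit = sum(squared_residuals) / sum(variances)
    return infit, outfit

# Hypothetical item answered by four examinees of middling ability.
infit, outfit = fit_mean_squares([1, 0, 1, 0], [0.5, 0.5, 0.5, 0.5])
```

When the observed responses behave exactly as the model predicts, as in this toy case, both statistics equal 1.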
Affiliation(s)
- Elisabeth Schaper
  - Centre for E-learning, Didactics and Educational Research, University of Veterinary Medicine Hannover, Foundation, Hannover, Germany
- Theo van Haeften
  - Department of Biomolecular Health Sciences, Faculty of Veterinary Medicine & Centre for Academic Teaching and Learning, Utrecht University, Utrecht, Netherlands
- Jakob Wandall
  - NordicMetrics Aps, Copenhagen, Denmark
  - Department of Veterinary and Animal Sciences, Faculty of Health and Medical Sciences, University of Copenhagen, Frederiksberg, Denmark
- Antti Iivanainen
  - Department of Veterinary Biosciences, Faculty of Veterinary Medicine, University of Helsinki, Helsinki, Finland
- Johanna Penell
  - Department of Clinical Sciences, Faculty of Veterinary Medicine and Animal Science, Swedish University of Agricultural Sciences, Uppsala, Sweden
- Charles McLean Press
  - Department of Preclinical Sciences and Pathology, Faculty of Veterinary Medicine, Norwegian University of Life Sciences, Aas, Norway
- Pierre Lekeux
  - European Association of Establishments for Veterinary Education (EAEVE), Vienna, Austria
- Peter Holm
  - Department of Veterinary and Animal Sciences, Faculty of Health and Medical Sciences, University of Copenhagen, Frederiksberg, Denmark

4. Simões RL, Bicudo AM, Passeri SMRR, Calderan TRA, Rizoli S, Fraga GP. Can trauma leagues contribute to better cognitive performance and technical skills of medical students? The experience of the Unicamp trauma league. Eur J Trauma Emerg Surg 2023; 49:1909-1916. PMID: 37264152; DOI: 10.1007/s00068-023-02283-z.
Abstract
PURPOSE Trauma leagues (TLs) are extracurricular programs that offer medical students supervised exposure to trauma and acute care surgery, mentorship, and participation in other academic activities. TLs are fully approved by medical schools, and over 100 TLs currently exist in Brazil. We hypothesized that the performance/competence of medical students who participated in TLs was superior to that of non-participants. This study evaluated and compared the cognitive performance and technical skills of the two groups. METHODS This retrospective cohort study compared the performance of TL medical students with that of non-TL alumni from 2005 to 2017, using the students' academic performance coefficient, Clinical Competence Assessment, and Progress Test results. SigmaPlot 12.0 software was used for statistical analyses, including Mann-Whitney comparison tests and the Kruskal-Wallis test to confirm the findings. RESULTS Of the 1366 medical students who graduated from a Brazilian university, 966 were included, 17.9% of whom had participated in a TL. Compared to non-TL participants, TL students demonstrated better cognitive performance according to the performance coefficient (p = 0.017) and Progress Test results (p < 0.001), and higher achievement in the Clinical Competence Assessment (p < 0.001). CONCLUSION The academic performance of TL students was superior to that of non-TL students at the University of Campinas (Unicamp), suggesting that TL participation has a positive impact on the preparation of future doctors and should be considered by medical schools worldwide. EVIDENCE LEVEL II (retrospective cohort).
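The Mann-Whitney comparison reported above rests on a rank-sum statistic. A self-contained sketch of the U statistic follows; the study itself used SigmaPlot, so the implementation and data here are purely illustrative:

```python
def mann_whitney_u(group_a, group_b):
    """Mann-Whitney U statistic via average ranks (tied values share
    the mean of the ranks they would jointly occupy)."""
    combined = sorted(group_a + group_b)
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    rank_sum_a = sum(ranks[v] for v in group_a)
    n_a, n_b = len(group_a), len(group_b)
    u_a = rank_sum_a - n_a * (n_a + 1) / 2
    return min(u_a, n_a * n_b - u_a)  # conventional smaller-U form
```

With completely separated groups U is 0 (maximal evidence of a difference); with perfectly interleaved groups U approaches n_a * n_b / 2 (no difference).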
Affiliation(s)
- Romeo Lages Simões
  - Medical School of the Vale do Rio Doce University (Univale), Governador Valadares, MG, Brazil
  - Federal University of Juiz de Fora, Governador Valadares Campus (UFJF-GV), Governador Valadares, MG, Brazil
- Angélica Maria Bicudo
  - Department of Pediatrics, School of Medical Sciences, University of Campinas, R. Tessália Vieira de Camargo, 126 - Cidade Universitária, PO Box 13083-887, Campinas, SP, Brazil
- Sílvia Maria Riceto Ronchin Passeri
  - School of Medical Sciences, University of Campinas, R. Tessália Vieira de Camargo, 126 - Cidade Universitária, PO Box 13083-887, Campinas, SP, Brazil
- Thiago Rodrigues Araújo Calderan
  - Division of Trauma Surgery, School of Medical Sciences, University of Campinas, R. Tessália Vieira de Camargo, 126 - Cidade Universitária, PO Box 13083-887, Campinas, SP, Brazil
- Sandro Rizoli
  - Medical Director for Trauma, Hamad General Hospital, PO Box 3050, Doha, Qatar
- Gustavo Pereira Fraga
  - Division of Trauma Surgery, School of Medical Sciences, University of Campinas, R. Tessália Vieira de Camargo, 126 - Cidade Universitária, PO Box 13083-887, Campinas, SP, Brazil

5. Wearn A, Bindra V, Patten B, Loveday BPT. Relationship between medical programme progress test performance and surgical clinical attachment timing and performance. Medical Teacher 2023:1-8. PMID: 36905609; DOI: 10.1080/0142159x.2023.2186205.
Abstract
Purpose: Progress tests (PTs) assess applied knowledge, promote knowledge integration, and facilitate retention. Clinical attachments catalyse learning by providing an appropriate learning context. The relationship between PT results and clinical attachment sequence and performance is under-explored. Aims: (1) Determine the effect of Year 4 general surgical attachment (GSA) completion and sequence on overall PT performance and on surgically coded items; (2) determine the association between PT results in the first 2 years and GSA assessment outcomes. Materials and methods: All students enrolled in the medical programme who started Year 2 between January 2013 and January 2016 were included, with follow-up until December 2018. A linear mixed model was applied to study the effect of undertaking a GSA on subsequent PT results. Logistic regressions were used to explore the effect of past PT performance on the likelihood of a student receiving a distinction grade in the GSA. Results: 965 students were included, representing 2191 PT items (363 surgical items). Sequenced exposure to the GSA in Year 4 was associated with increased performance on surgically coded PT items, but not with overall PT performance, and the difference decreased over the year. PT performance in Years 2-3 was associated with an increased likelihood of being awarded a GSA distinction grade (OR 1.62, p < 0.001), with overall PT performance a better predictor than performance on surgically coded items. Conclusions: Exposure to a surgical attachment improves results on surgically coded PT items, although with a diminishing effect over time, implying that clinical exposure may accelerate subject-specific learning. Timing of the GSA did not influence end-of-year PT performance. There is some evidence that students who perform well on PTs in preclinical years are more likely to receive a distinction grade in a surgical attachment than those with lower PT scores.
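The reported odds ratio (OR 1.62) can be translated into probabilities for intuition. The sketch below assumes a hypothetical 20% baseline distinction rate; the baseline and the unit of PT performance are illustrative, not taken from the study:

```python
def shifted_probability(base_prob, odds_ratio, delta=1.0):
    """Predicted probability after a delta-unit increase in the
    predictor, given a baseline probability and a per-unit odds
    ratio from a logistic regression."""
    odds = base_prob / (1.0 - base_prob) * odds_ratio ** delta
    return odds / (1.0 + odds)

# Illustrative only: with a hypothetical 20% baseline distinction
# rate, an odds ratio of 1.62 per unit of PT performance predicts
# roughly a 29% distinction rate one unit higher.
p_up = shifted_probability(0.20, 1.62)
```

The key point of the conversion is that an odds ratio multiplies odds, not probabilities, so its effect on the probability scale depends on the baseline rate.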
Affiliation(s)
- Andy Wearn
  - Medical Programme Directorate, Faculty of Medical and Health Sciences, The University of Auckland, Auckland, New Zealand
- Vanshay Bindra
  - Medical Programme Directorate, Faculty of Medical and Health Sciences, The University of Auckland, Auckland, New Zealand
- Bradley Patten
  - Medical Programme Directorate, Faculty of Medical and Health Sciences, The University of Auckland, Auckland, New Zealand
- Benjamin P T Loveday
  - Department of Surgery, The University of Auckland, Auckland, New Zealand
  - Department of Surgery, University of Melbourne, Melbourne, Australia
  - Hepatobiliary and Upper Gastrointestinal Unit, Peter MacCallum Cancer Centre, Royal Melbourne Hospital, Melbourne, Australia

6. Collares CF. Cognitive diagnostic modelling in healthcare professions education: an eye-opener. Advances in Health Sciences Education: Theory and Practice 2022; 27:427-440. PMID: 35201484; PMCID: PMC8866928; DOI: 10.1007/s10459-022-10093-y.
Abstract
Criticisms of psychometric paradigms currently used in healthcare professions education include claims of reductionism, objectification, and poor compliance with assumptions. Nevertheless, perhaps the most crucial criticism comes from learners' difficulty in interpreting and making meaningful use of summative scores, and from the potentially detrimental impact these scores have on learners. The term "post-psychometric era" has become popular, despite persisting calls for the sensible use of modern psychometrics. In recent years, cognitive diagnostic modelling has emerged as a new psychometric paradigm capable of providing meaningful diagnostic feedback. Cognitive diagnostic modelling allows the classification of examinees on multiple cognitive attributes. This measurement is obtained by modelling these attributes as discrete, categorical latent variables, and items can reflect more than one latent variable simultaneously. The interactions between latent variables can be modelled flexibly, allowing a unique perspective on complex cognitive processes. These characteristic features enable diagnostic classification over a large number of constructs of interest, removing the need to give test takers numerical scores as feedback. This paper provides an overview of cognitive diagnostic modelling, including an introduction to its foundations and illustrations of potential applications, to help teachers be involved in developing and evaluating assessment tools used in healthcare professions education. Cognitive diagnosis may represent a revolutionary new psychometric paradigm that overcomes the known limitations of frequently used psychometric approaches and offers the possibility of robust qualitative feedback and better alignment with competency-based curricula and modern programmatic assessment frameworks.
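Attribute-based classification can be made concrete with the DINA model, one common cognitive diagnostic model (chosen here as an example; the paper surveys the family of models more broadly). The attribute vectors and parameters below are hypothetical:

```python
def dina_p_correct(alpha, q_row, slip, guess):
    """DINA model item-response function.

    alpha: examinee's attribute-mastery vector (0/1 entries)
    q_row: the item's Q-matrix row (1 = attribute required)
    An examinee who masters every required attribute (eta = 1)
    answers correctly with probability 1 - slip; anyone else
    succeeds only with the guessing probability."""
    eta = all(a == 1 for a, required in zip(alpha, q_row) if required == 1)
    return 1.0 - slip if eta else guess

# Hypothetical item requiring attributes 1 and 2 (out of 3).
p_master = dina_p_correct([1, 1, 0], [1, 1, 0], slip=0.1, guess=0.2)
p_nonmaster = dina_p_correct([1, 0, 0], [1, 1, 0], slip=0.1, guess=0.2)
```

Because the latent variables are categorical, the model's output for a learner is a mastery profile (which attributes are mastered) rather than a single numerical score, which is the feedback property the abstract emphasises.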
Affiliation(s)
- Carlos Fernando Collares
  - Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, School of Health Professions Education (SHE), Maastricht University, Postbus 616, 6200, Maastricht, The Netherlands
  - European Board of Medical Assessors, Edinburgh, UK
  - Stichting Aphasia.help, Maastricht, The Netherlands

7. Kennedy N, McKeown P, Boohan M. C25 in QUB: a transformed curriculum for a transformed healthcare system. The Ulster Medical Journal 2021; 90:138-141. PMID: 34815590; PMCID: PMC8581695.
Affiliation(s)
- Neil Kennedy
  - Director, Centre for Medical Education, Queen’s University Belfast; Nuffield Chair of Child Health; Honorary Consultant Paediatrician, Royal Belfast Hospital for Sick Children. Correspondence to: Professor Neil Kennedy.
- Pascal McKeown
- Mairead Boohan

8. de Sá MFS, Romão GS, Fernandes CE, Filho ALDS. The Individual Progress Test of Gynecology and Obstetrics Residents (TPI-GO): The Brazilian Experience by FEBRASGO. Revista Brasileira de Ginecologia e Obstetrícia 2021; 43:425-428. PMID: 34318467; PMCID: PMC10411146; DOI: 10.1055/s-0041-1731803. (Open Access)

9. Ryan AT, Wilkinson TJ. Rethinking Assessment Design: Evidence-Informed Strategies to Boost Educational Impact in the Anatomical Sciences. Anatomical Sciences Education 2021; 14:361-367. PMID: 33752261; DOI: 10.1002/ase.2075.
Abstract
University assessment is in the midst of transformation. Assessments are no longer designed solely to determine that students can remember and regurgitate lecture content, nor to rank students for some future selection process. Instead, assessments are expected to drive, support, and enhance learning, and to contribute to student self-assessment and the development of skills and attributes for a lifetime of learning. While the traditional purposes of certifying achievement and determining readiness to progress remain important, these new expectations can create tensions in assessment design, selection, and deployment. With these tensions in mind, three contemporary approaches to assessment in medical education are described: careful consideration of the educational impact of assessment before, during (test- or recall-enhanced learning), and after assessments; development of student (and staff) assessment literacy; and planning of cohesive systems of assessment (with a range of assessment tools) designed to assess the various competencies demanded of future graduates. These approaches purposefully straddle the cross-purposes of assessment in modern health professions education. The implications of these models are explored within the context of medical education and then linked with contemporary work in the anatomical sciences to highlight current synergies and potential future innovations in using evidence-informed strategies to boost the educational impact of assessments.
Affiliation(s)
- Anna T Ryan
  - Department of Medical Education, Melbourne Medical School, Faculty of Medicine, Dentistry and Health Sciences, University of Melbourne, Melbourne, Victoria, Australia
- Tim J Wilkinson
  - Education Unit, Otago Medical School, University of Otago, Christchurch, New Zealand

10. Hamamoto Filho PT, de Arruda Lourenção PLT, do Valle AP, Abbade JF, Bicudo AM. The Correlation Between Students' Progress Testing Scores and Their Performance in a Residency Selection Process. Medical Science Educator 2019; 29:1071-1075. PMID: 34457585; PMCID: PMC8368402; DOI: 10.1007/s40670-019-00811-4.
Abstract
Brazil is currently seeing an increased number of medical schools, leading to high competition for medical residency vacancies. Public managers have thus considered Progress Testing scores potentially useful as part of the final decision in the medical residency selection process. We analysed whether there is a correlation between students' Progress Testing scores and their performance in medical residency selection. We examined four consecutive cohorts of students who took the Progress Test yearly and compared their accumulated scores with their medical residency selection scores from Botucatu Medical School, Universidade Estadual Paulista. We included 212 students who finished the 6-year medical course in 2013, 2014, 2015, and 2016. The area under the Progress Testing curve and the medical residency selection score were compared using a Pearson correlation, with significance set at p < 0.05. We found a positive association between the two scores (p < 0.05 for all four years). Next, the students were grouped according to their Progress Testing performance: above one, within one, and below one standard deviation. A chi-square test was used to compare approval rates in the second step of the medical residency selection process; approval rates were 91.7%, 69.2%, and 42.1%, respectively (p < 0.05). We conclude that there is indeed a correlation between students' performance on these measures, partially explained by the fact that both instruments measure cognitive competencies and knowledge. These data may support national policy changes for medical residency selection.
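The Pearson correlation used in the analysis above can be computed directly. A stdlib-only sketch with made-up score pairs (the real study data are not reproduced here):

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical paired scores: area under the Progress Testing curve
# vs. residency selection score for five students.
pt_auc = [52.0, 61.5, 58.0, 70.2, 66.8]
residency = [48.1, 59.0, 55.5, 72.3, 63.0]
r = pearson_r(pt_auc, residency)
```

In practice a library routine that also returns the p-value would be used; the point of the sketch is only that the coefficient is a covariance normalised by the two standard deviations, bounded in [-1, 1].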
Affiliation(s)
- Pedro Tadao Hamamoto Filho
  - Botucatu Medical School, Department of Neurology, Psychology and Psychiatry, UNESP – Univ Estadual Paulista, Botucatu, Brazil
- Adriana Polachini do Valle
  - Botucatu Medical School, Department of Internal Medicine, UNESP – Univ Estadual Paulista, Botucatu, Brazil
- Joélcio Francisco Abbade
  - Botucatu Medical School, Department of Gynecology and Obstetrics, UNESP – Univ Estadual Paulista, Botucatu, Brazil
- Angélica Maria Bicudo
  - School of Medical Sciences, Department of Pediatrics, UNICAMP – Univ Estadual de Campinas, Campinas, Brazil