1. Muacevic A, Adler JR, Danforth D, Fine L, Foster J, Jacomino M, Johnson M, Keller B, Mendez P, Saunders JM, Scalese R, Schocken DM, Stalvey C, Stevens M, Suchak N, Syms S, Uchiyama E, Velazquez M. The Florida Clinical Skills Collaborative: A New Regional Consortium for the Assessment of Clinical Skills. Cureus 2022; 14:e31263. PMID: 36514606; PMCID: PMC9733824; DOI: 10.7759/cureus.31263.
Abstract
Discontinuation of the United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills (CS) exam and Comprehensive Osteopathic Medical Licensing Examination (COMLEX) Level 2 Performance Evaluation (2-PE) raised questions about the ability of medical schools to ensure the clinical skills competence of graduating students. In February 2021, representatives from all Florida, United States, allopathic and osteopathic schools initiated a collaboration to address this critically important issue in the evolving landscape of medical education. A 5-point Likert scale survey of all members (n=18/20 individuals representing 10/10 institutions) revealed that initial interest in joining the collaboration was high among both individuals (mean 4.78, SD 0.43) and institutions (mean 4.69, SD 0.48). Most individuals (mean 4.78, SD 0.55) and institutions (mean 4.53, SD 0.72) were highly satisfied with their decision to join. Members most commonly cited a "desire to establish a shared assessment in place of Step 2 CS/2-PE" as their most important reason for joining. Experienced benefits of membership were ranked as follows: 1) networking, 2) shared resources for curriculum implementation, 3) scholarship, and 4) work towards a shared assessment in place of Step 2 CS/2-PE. Challenges of membership were ranked as follows: 1) logistics such as scheduling and technology, 2) agreement on common goals, 3) total time commitment, and 4) large group size. Members cited the "administration of a joint assessment pilot" as the highest priority for the coming year. Florida has successfully launched a regional consortium for the assessment of clinical skills competency with high levels of member satisfaction, which may serve as a model for future regional consortia.
2. Dion V, St-Onge C, Bartman I, Touchie C, Pugh D. Written-Based Progress Testing: A Scoping Review. Academic Medicine 2022; 97:747-757. PMID: 34753858; DOI: 10.1097/acm.0000000000004507.
Abstract
PURPOSE Progress testing is an increasingly popular form of assessment in which a comprehensive test is administered to learners repeatedly over time. To inform potential users, this scoping review aimed to document barriers, facilitators, and potential outcomes of the use of written progress tests in higher education. METHOD The authors followed Arksey and O'Malley's scoping review methodology to identify and summarize the literature on progress testing. They searched 6 databases (Academic Search Complete, CINAHL, ERIC, Education Source, MEDLINE, and PsycINFO) on 2 occasions (May 22, 2018, and April 21, 2020) and included articles written in English or French and pertaining to written progress tests in higher education. Two authors screened articles for the inclusion criteria (90% agreement), then data extraction was performed by pairs of authors. Using a snowball approach, the authors also screened additional articles identified from the included reference lists. They completed a thematic analysis through an iterative process. RESULTS A total of 104 articles were included. The majority of progress tests used a multiple-choice and/or true-or-false question format (95, 91.3%) and were administered 4 times a year (38, 36.5%). The most documented source of validity evidence was internal consistency (38, 36.5%). Four major themes were identified: (1) barriers and challenges to the implementation of progress testing (e.g., need for additional resources); (2) established collaboration as a facilitator of progress testing implementation; (3) factors that increase the acceptance of progress testing (e.g., formative use); and (4) outcomes and consequences of progress test use (e.g., progress testing contributes to an increase in knowledge). CONCLUSIONS Progress testing appears to have a positive impact on learning, and there is significant validity evidence to support its use. Although progress testing is resource- and time-intensive, strategies such as collaboration with other institutions may facilitate its use.
Affiliation(s)
- Vincent Dion
- V. Dion is an undergraduate medical education student, Faculty of Medicine and Health Sciences, Université de Sherbrooke, Sherbrooke, Québec, Canada. He was a research assistant to the Paul Grand'Maison de la Société des médecins de l'Université de Sherbrooke research chair in medical education, Sherbrooke, Québec, Canada, at the time this work was completed
- Christina St-Onge
- C. St-Onge is professor, Department of Medicine, Faculty of Medicine and Health Sciences, Université de Sherbrooke, and the Paul Grand'Maison de la Société des médecins de l'Université de Sherbrooke research chair in medical education, Sherbrooke, Québec, Canada; ORCID: https://orcid.org/0000-0001-5313-0456
- Ilona Bartman
- I. Bartman is medical education research associate, Medical Council of Canada, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0002-2056-479X
- Claire Touchie
- C. Touchie is professor of medicine, University of Ottawa, Ottawa, Ontario, Canada. She was chief medical education officer, Medical Council of Canada, Ottawa, Ontario, Canada, at the time this work was completed; ORCID: https://orcid.org/0000-0001-7926-9720
- Debra Pugh
- D. Pugh is medical education advisor, Medical Council of Canada, and associate professor, Department of Medicine, University of Ottawa and The Ottawa Hospital, Ottawa, Ontario, Canada; ORCID: https://orcid.org/0000-0003-4076-9669
3. Ali K, Cockerill J, Zahra D, Tredwin C, Ferguson C. Impact of progress testing on the learning experiences of students in medicine, dentistry and dental therapy. BMC Medical Education 2018; 18:253. PMID: 30413204; PMCID: PMC6230280; DOI: 10.1186/s12909-018-1357-1.
Abstract
AIMS To investigate the impact of progress testing (PT) on the learning experiences of undergraduate students in three programs, namely medicine, dentistry and dental therapy. METHODS Participants were invited to respond to an online questionnaire to share their perceptions and experiences of progress testing. Responses were recorded anonymously, but data on their program, year of study, age, gender, and ethnicity were also captured on a voluntary basis. RESULTS A total of 167 participants completed the questionnaire, yielding an overall response rate of 27.2%. These included 96 BMBS students (27.4%), 56 BDS students (24.7%), and 15 BScDTH students (39.5%). A 3-Program (BMBS, BDS, BScDTH) by 8-Topic (A-H) mixed analysis of variance (ANOVA) was conducted on the questionnaire responses. This revealed statistically significant main effects of Program and Topic, as well as a statistically significant interaction between the two (i.e. the pattern of topic differences was different across programs). CONCLUSIONS Undergraduate students in medicine, dentistry, and dental therapy and hygiene regarded PT as a useful assessment to support their learning needs. However, in comparison to students in dentistry and dental therapy and hygiene, the perceptions of medical students were less positive in several aspects of PT.
Affiliation(s)
- Kamran Ali
- Faculty of Medicine and Dentistry, University of Plymouth, Plymouth, UK
- Peninsula Dental School, University of Plymouth, C523 Portland Square, Drake Circus, Plymouth, PL4 8AA UK
- Daniel Zahra
- Faculty of Medicine and Dentistry, University of Plymouth, Plymouth, UK
- Christopher Tredwin
- Peninsula Dental School, University of Plymouth, C523 Portland Square, Drake Circus, Plymouth, PL4 8AA UK
- Colin Ferguson
- Faculty of Medicine and Dentistry, University of Plymouth, Plymouth, UK
4. Kirnbauer B, Avian A, Jakse N, Rugani P, Ithaler D, Egger R. First reported implementation of a German-language progress test in an undergraduate dental curriculum: A prospective study. European Journal of Dental Education 2018; 22:e698-e705. PMID: 29961963; PMCID: PMC6220869; DOI: 10.1111/eje.12381.
Abstract
INTRODUCTION Progress testing is a special form of longitudinal and feedback-oriented assessment. Even though well established in human medical curricula, this is not the case in dental education. The aim was the prospective development and implementation of the first reported German-language Dental Progress Test (DPT) for the undergraduate dental curriculum at the Medical University of Graz, Austria. MATERIAL AND METHODS Participation in the DPT was compulsory for all dental students in terms 7-12 (years 4-6). Three tests, each consisting of 100 items out of a pool of 375, were administered within 3 consecutive terms in 2016 and 2017. Rasch analyses were used to evaluate the test and identify misfitting items. RESULTS Of the item responses, 59.7% were "correct," 27.0% were "false" and 13.3% were answered with "don't know," with similar results at all 3 time points. The assumption of parallel item characteristic curves (ICCs) was met (T1: χ2 = 51.071, df = 74, P = .981; T2: χ2 = 57.044, df = 67, P = .802; T3: χ2 = 58.443, df = 72, P = .876), and item difficulties for the thematic fields were similarly distributed across the latent dimensions. CONCLUSION The newly introduced DPT is appropriate for testing dental students and is well balanced for the tested target group.
Affiliation(s)
- B. Kirnbauer
- Division of Oral Surgery and Orthodontics, Medical University of Graz, Graz, Austria
- A. Avian
- Institute for Medical Informatics, Statistics and Documentation, Medical University of Graz, Graz, Austria
- N. Jakse
- Division of Oral Surgery and Orthodontics, Medical University of Graz, Graz, Austria
- P. Rugani
- Division of Oral Surgery and Orthodontics, Medical University of Graz, Graz, Austria
- D. Ithaler
- Organizational Unit for Teaching and Studies, Medical University of Graz, Graz, Austria
- R. Egger
- Institute for Educational Science, Karl-Franzens University Graz, Graz, Austria
5. Ali K, Coombes L, Kay E, Tredwin C, Jones G, Ricketts C, Bennett J. Progress testing in undergraduate dental education: the Peninsula experience and future opportunities. European Journal of Dental Education 2016; 20:129-134. PMID: 25874344; DOI: 10.1111/eje.12149.
Abstract
BACKGROUND Progress testing is well established as a longitudinal form of assessment in undergraduate medical programmes to measure growth in knowledge. Peninsula Dental School was the first dental school to use progress testing and remains the only one to do so. AIMS To share the experience of developing progress testing as a major summative assessment tool in an undergraduate dental programme at a newly established dental school in the United Kingdom. METHODS Data were collected for progress tests conducted from 2007 to 2014. The tests were formative in the first 2 years of the programme and summative in subsequent years. Each test was based on 100 single best answer multiple-choice items with an appropriate vignette; students chose their answer from 5 options. One mark was awarded for each correct answer, minus 0.25 for an incorrect answer, and 0 for 'don't know' (DK). Standard setting for each sitting was carried out using the Angoff and Hofstee methods. RESULTS There were two tests per year, with each cohort undertaking eight tests over their 4 years of study, giving a total of 14 test occasions. The reliability of each test for each student cohort was measured using Cronbach's alpha; the average reliability over 42 test/cohort combinations was 0.753 (SD ±0.08). Data analyses show growth in knowledge of dental students across successive years, with the largest increase observed between tests 1 and 5 and a concomitant reduction in DK responses. CONCLUSION This is the first study to report the establishment and use of progress testing as the principal form of written summative testing in an undergraduate dental curriculum. Progress testing is a valid and reliable tool to assess growth in knowledge longitudinally over the duration of a dental programme. Although a labour-intensive process, progress testing merits more widespread use in dental programmes.
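The negative-marking rule this abstract describes (+1 for a correct answer, -0.25 for an incorrect answer, 0 for "don't know") can be sketched in a few lines of Python. The function name and response labels below are illustrative, not taken from the study:

```python
def formula_score(responses):
    """Total score under a +1 / -0.25 / 0 negative-marking rule.

    responses: iterable of "correct", "incorrect", or "dont_know",
    one entry per test item.
    """
    points = {"correct": 1.0, "incorrect": -0.25, "dont_know": 0.0}
    return sum(points[r] for r in responses)

# A hypothetical 100-item sitting: 60 correct, 30 incorrect, 10 don't-know.
score = formula_score(["correct"] * 60 + ["incorrect"] * 30 + ["dont_know"] * 10)
# score == 60 - 7.5 == 52.5
```

The rule makes random guessing expected-value neutral on a 5-option item (1/5 chance of +1 against 4/5 chance of -0.25), which is the usual rationale for the -0.25 penalty.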
Affiliation(s)
- K Ali
- Peninsula Schools of Medicine & Dentistry, Plymouth University, Plymouth, UK
- L Coombes
- Peninsula Schools of Medicine & Dentistry, Plymouth University, Plymouth, UK
- E Kay
- Peninsula Schools of Medicine & Dentistry, Plymouth University, Plymouth, UK
- C Tredwin
- Peninsula Schools of Medicine & Dentistry, Plymouth University, Plymouth, UK
- G Jones
- Peninsula Schools of Medicine & Dentistry, Plymouth University, Plymouth, UK
- C Ricketts
- Peninsula Schools of Medicine & Dentistry, Plymouth University, Plymouth, UK
- J Bennett
- Peninsula Schools of Medicine & Dentistry, Plymouth University, Plymouth, UK
6. Ravesloot CJ, Van der Schaaf MF, Muijtjens AMM, Haaring C, Kruitwagen CLJJ, Beek FJA, Bakker J, Van Schaik JPJ, Ten Cate TJ. The don't know option in progress testing. Advances in Health Sciences Education 2015; 20:1325-1338. PMID: 25912621; PMCID: PMC4639571; DOI: 10.1007/s10459-015-9604-2.
Abstract
Formula scoring (FS) is the use of a don't know option (DKO) with subtraction of points for wrong answers. Its effect on the construct validity and reliability of progress test scores is a subject of discussion. Choosing a DKO may be affected not only by knowledge level but also by risk-taking tendency, and may thus introduce construct-irrelevant variance into the knowledge measurement. On the other hand, FS may result in more reliable test scores. To evaluate the impact of FS on construct validity and reliability of progress test scores, a progress test for radiology residents was divided into two tests of 100 parallel items (A and B). Each test had an FS and a number-right (NR) version: A-FS, B-FS, A-NR, and B-NR. Participants (n=337) were randomly divided into two groups. One group took test A-FS followed by B-NR, and the second group took test B-FS followed by A-NR. Evidence for impaired construct validity was sought in a hierarchical regression analysis by investigating how much of the participants' FS-score variance was explained by the DKO-score, compared to the contribution of knowledge level (NR-score), while controlling for Group, Gender, and Training length. Cronbach's alpha was used to estimate NR- and FS-score reliability per year group. NR-score was found to explain 27% of the variance of FS [F(1,332) = 219.2, p < 0.0005]; the DKO-score and the interaction of DKO and Gender explained 8% [F(2,330) = 41.5, p < 0.0005], and the interaction of DKO and NR explained 1.6% [F(1,329) = 16.6, p < 0.0005], supporting the hypothesis that FS introduces construct-irrelevant variance into the knowledge measurement. However, NR-scores showed considerably lower reliabilities than FS-scores (mean year-group Cronbach's alphas of 0.62 and 0.74, respectively). Decisions about FS with progress tests should be a careful trade-off between systematic and random measurement error.
Affiliation(s)
- C J Ravesloot
- Radiology Department, University Medical Center Utrecht, Heidelberglaan 100, Room E01.132, 3508 GA, Utrecht, The Netherlands.
- C Haaring
- Radiology Department, University Medical Center Utrecht, Heidelberglaan 100, 3508 GA, Utrecht, The Netherlands
- C L J J Kruitwagen
- Julius Center, University Medical Center Utrecht, Heidelberglaan 100, 3508 GA, Utrecht, The Netherlands
- F J A Beek
- Radiology Department, University Medical Center Utrecht, Heidelberglaan 100, 3508 GA, Utrecht, The Netherlands
- J Bakker
- Albert Schweitzer Hospital, Dordrecht, The Netherlands
- J P J Van Schaik
- Radiology Department, University Medical Center Utrecht, Heidelberglaan 100, Room E01.132, 3508 GA, Utrecht, The Netherlands
- Th J Ten Cate
- Center for Research and Development of Education, University Medical Center Utrecht, Heidelberglaan 100, 3508 GA, Utrecht, The Netherlands
7. Totten V, Bellou A. Development of emergency medicine in Europe. Academic Emergency Medicine 2013; 20:514-521. PMID: 23672367; DOI: 10.1111/acem.12126.
Abstract
Emergency medicine (EM) is emerging worldwide. Its development as a recognized specialty is proceeding at different rates in different countries. Europe is a region with complex political affiliations, composed of countries both within and outside the European Union (EU). Europe is seeking greater standardization (harmonization) for mutually improved economic development; medicine in general, and EM in particular, is no exception. In Europe, as in other regions, EM is struggling for acceptance as a valid field of specialization. The European Union of Medical Specialists requires that once two-fifths of countries acknowledge a specialty, all EU countries must address the question. EM had achieved the needed majority by 2011. This article briefly describes the European road to specialty acceptance.
Affiliation(s)
- Vicken Totten
- University Hospitals Case Medical Center, Case School of Medicine, Cleveland, OH
- Abdelouahab Bellou
- President of the European Society for Emergency Medicine, Faculty of Medicine, University Hospital, Rennes, France
8. Schuwirth LWT, van der Vleuten CPM. The use of progress testing. Perspectives on Medical Education 2012; 1:24-30. PMID: 23316456; PMCID: PMC3540387; DOI: 10.1007/s40037-012-0007-2.
Abstract
Progress testing is gaining ground rapidly after having been used almost exclusively in Maastricht and Kansas City. This increased popularity is understandable considering the intuitive appeal longitudinal testing has as a way to predict future competence and performance. Yet there are also important practicalities. Progress testing is longitudinal assessment in that it is based on subsequent equivalent, yet different, tests. The results of these are combined to determine the growth of functional medical knowledge for each student, enabling more reliable and valid decision making about promotion to the next study phase. The longitudinal integrated assessment approach has a demonstrable positive effect on student learning behaviour by discouraging binge learning. Furthermore, it leads to more reliable decisions as well as good predictive validity for future competence and retention of knowledge. Also, because of its integration and independence of local curricula, it can be used in a multi-centre collaborative production and administration framework, reducing costs, increasing efficiency and allowing for constant benchmarking. Practicalities include the relative unfamiliarity of faculty with the concept, the fact that remediation for students with a series of poor results is time consuming, the need to embed the instrument carefully into the existing assessment programme, and the importance of equating subsequent tests to minimize test-to-test variability in difficulty. Where it has been implemented collaboratively, progress testing has led to satisfaction, provided these practicalities are heeded well.
Affiliation(s)
- Lambert W. T. Schuwirth
- Flinders Innovation in Clinical Education, Flinders University, Adelaide, Australia
- Department of Educational Development and Research, Maastricht University, Maastricht, the Netherlands
- Cees P. M. van der Vleuten
- Chair, Department of Educational Development and Research, Maastricht University, Maastricht, the Netherlands
9. Wrigley W, van der Vleuten CPM, Freeman A, Muijtjens A. A systemic framework for the progress test: strengths, constraints and issues: AMEE Guide No. 71. Medical Teacher 2012; 34:683-697. PMID: 22905655; DOI: 10.3109/0142159x.2012.704437.
Abstract
There has been increasing use and significance of progress testing in medical education. It is used in many ways and with several formats, reflecting the variety of curricula and assessment purposes. These developments have occurred alongside a recognised sensitivity to the error variance inherent in multiple-choice tests, from which challenges to validity and reliability have arisen. This Guide presents a generic, systemic framework to help identify and explore improvements in the quality and defensibility of progress test data. The framework draws on the combined experience of the Dutch consortium, an individual medical school in the United Kingdom, and the bulk of the progress test literature to date. It embeds progress testing as a quality-controlled assessment tool for improving learning, teaching and the demonstration of educational standards. The paper describes strengths, highlights constraints and explores issues for improvement. These may assist in the establishment of potential or new progress testing in medical education programmes. They can also guide the evaluation and improvement of existing programmes.
Affiliation(s)
- William Wrigley
- Department of Educational Development and Research, Maastricht University, The Netherlands
10. Dezee KJ, Artino AR, Elnicki DM, Hemmer PA, Durning SJ. Medical education in the United States of America. Medical Teacher 2012; 34:521-525. PMID: 22489971; DOI: 10.3109/0142159x.2012.668248.
Abstract
This article provides a brief history of the medical educational system in the USA, its current structure, and the topics and challenges facing US medical educators today. The USA is fortunate to have a robust educational system, with over 150 medical schools, thousands of graduate medical education programs, well-accepted standardized examinations throughout training, and many educational research programs. All levels of medical education, from curriculum reform in medical schools and the integration of competencies in graduate medical education to the maintenance of certification in continuing medical education, have undergone rapid changes since the turn of the millennium. The intent of the changes has been to involve the patient sooner in the educational process, use better educational strategies, link educational processes more closely with educational outcomes, and focus on skills beyond knowledge. However, with the litany of changes has come increased regulation, without (as yet) clear evidence as to which of the changes will result in better physicians. In addition, the US governmental debt crisis threatens the current educational structure. The next wave of changes in the US medical system needs to focus on which educational strategies produce the best physicians and how to fund the system over the long term.
Affiliation(s)
- Kent J Dezee
- Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, MD 20814, USA.
11.
Abstract
Progress testing offers a unique contribution to medical education assessment in that it provides a tool for monitoring examinees' growth over time and their progress toward graduation objectives. However, one of the most important psychometric requirements of progress testing has typically been neglected in past applications: scores across the time and test forms are not commonly placed on the same scale. Equating is a method used to achieve this property. This article discusses the basic principles of equating and the particular challenges posed by progress testing. A hybrid equating method, along with the details of a recent application, is presented as a potential solution, and alternate approaches to equating are discussed. It is hoped that future applications of progress testing will both utilize equating and demonstrate its value as a tool in medical education assessment.
Affiliation(s)
- Michelle M Langer
- National Board of Medical Examiners, Philadelphia, PA 19104-3102, USA.