1. Himmelbauer M, Koller D, Bäwert A, Horn W. [The examination mix at the Medical University Vienna]. Wien Med Wochenschr 2018;169:101-109. PMID: 30267247. DOI: 10.1007/s10354-018-0662-y.
Abstract
The present work describes a curriculum design built around an examination mix for the medical degree program at the Medical University of Vienna. The fields of application of specific examination formats, as well as the advantages and disadvantages of the individual forms of examination, are presented. Summative written examinations, assessments of practical skills and abilities (OSCE), oral-practical examinations, and formative examinations are illustrated. Studies show that repeated testing leads to higher learning gains. The challenge is therefore to develop suitable methods for the continuous assessment of learning progress during the degree program and to apply them accordingly. Thus, in addition to written summative exams, oral and practical forms of examination should increasingly be used in courses with a continuous-assessment character.
Affiliation(s)
- Monika Himmelbauer, Teaching Center, Medical University Vienna, Spitalgasse 23, 1090 Vienna, Austria
- Desiree Koller, Teaching Center, Medical University Vienna, Spitalgasse 23, 1090 Vienna, Austria
- Andjela Bäwert, Teaching Center, Medical University Vienna, Spitalgasse 23, 1090 Vienna, Austria
- Werner Horn, Teaching Center, Medical University Vienna, Spitalgasse 23, 1090 Vienna, Austria
2. [Examinations while studying medicine - more than simply grades]. Wien Med Wochenschr 2018;169:126-131. PMID: 30084089. DOI: 10.1007/s10354-018-0650-2.
Abstract
Assessment drives learning. Examinations need to be aligned primarily with the learning objectives, as well as the teaching and assessment methods, of the courses on offer. In doing so, various examination instruments are required to measure levels of competency that build on one another. An appropriate mix is essential to reflect the variety of learning outcomes of a chosen curriculum. Furthermore, examinations also have an evaluative function: they reflect the knowledge and abilities of students and assess the teaching at a given institution. Digital examinations in the form of multiple-choice question (MCQ) testing enable a higher degree of automation and accelerate the creation, administration, and scoring of examinations. They are therefore increasingly popular, provided that the technical requirements for large semester cohorts are met. Shifting examination processes to computers or tablets entails not only a wealth of new challenges but also opportunities.
3. Gerhard-Szep S, Güntsch A, Pospiech P, Söhnel A, Scheutzel P, Wassmann T, Zahn T. Assessment formats in dental medicine: An overview. GMS J Med Educ 2016;33:Doc65. PMID: 27579365. PMCID: PMC5003142. DOI: 10.3205/zma001064.
Abstract
AIM At the annual meeting of German dentists in Frankfurt am Main in 2013, the Working Group for the Advancement of Dental Education (AKWLZ) initiated an interdisciplinary working group to address assessments in dental education. This paper presents an overview of the current work being done by this working group, some of whose members are also actively involved in the German Association for Medical Education's (GMA) working group for dental education. The aim is to summarize the current state of research on this topic for all those who participate in the design, administration and evaluation of university-specific assessments in dentistry. METHOD Based on a systematic literature search, the testing scenarios listed in the National Competency-based Catalogue of Learning Objectives (NKLZ) were compiled and presented in tables according to assessment value. RESULTS Different assessment scenarios are described briefly in table form, addressing validity (V), reliability (R), acceptance (A), cost (C), feasibility (F), and influence on teaching and learning (EI) as presented in the current literature. Infoboxes were deliberately chosen to give readers quick access to the information and to facilitate comparisons between the various assessment formats. Each description is followed by a list summarizing the uses in dental and medical education. CONCLUSION This overview provides a summary of competency-based testing formats. It is meant to have a formative effect on dental and medical schools and to support the development of workplace-based strategies for learning, teaching and testing in dental education in the future.
Affiliation(s)
- Susanne Gerhard-Szep, Goethe-Universität, Carolinum Zahnärztliches Universitäts-Institut gGmbH, Poliklinik Zahnerhaltungskunde, Frankfurt am Main, Germany
- Arndt Güntsch, Marquette University School of Dentistry, Department of Surgical Sciences, Milwaukee, USA, and Universitätsklinikum Jena, Zentrum für Zahn-, Mund- und Kieferheilkunde, Jena, Germany
- Peter Pospiech, Universität Würzburg, Poliklinik für Zahnärztliche Prothetik, Würzburg, Germany
- Andreas Söhnel, Universitätsmedizin Greifswald, Poliklinik für Zahnärztliche Prothetik, Alterszahnheilkunde und medizinischer Werkstoffkunde, Greifswald, Germany
- Petra Scheutzel, Universitätsklinikum Münster, Poliklinik für Prothetische Zahnmedizin & Biomaterialien, Münster, Germany
- Torsten Wassmann, Universitätsmedizin Göttingen, Poliklinik für Zahnärztliche Prothetik, Göttingen, Germany
- Tugba Zahn, Goethe-Universität, Carolinum Zahnärztliches Universitäts-Institut gGmbH, Poliklinik für Zahnärztliche Prothetik, Frankfurt am Main, Germany
4. Schickler A, Brüstle P, Biller S. The Final Oral/Practical State Examination at Freiburg Medical Faculty in 2012 - Analysis of grading to test quality assurance. GMS Z Med Ausbild 2015;32:Doc39. PMID: 26483852. PMCID: PMC4606482. DOI: 10.3205/zma000981.
Abstract
AIM The aim of this study is to analyze the grades given for the oral/practical part of the German State Examination at the Medical Faculty of Freiburg. We examined whether or not the grades given for the written and the oral/practical examinations correlated and if differences in grading between the Freiburg University Medical Center (UMC) and the other teaching hospitals could be found. In order to improve the quality of the state examination, the medical school has been offering standardized training for examiners for several years. We evaluated whether or not trained and untrained examiners differed in their grading of the exam and how these differences have changed over time. METHODS The results of the 2012 spring and fall exams were analyzed (N=315). The relevant data set was made available to us by the Baden-Württemberg Examination Office (Landesprüfungsamt). The data were analyzed by means of descriptive and inferential statistics. RESULTS We observed a correlation of ρ=0.460** between the grades for the written and the oral/practical exams. The UMC and the teaching hospitals did not differ significantly in their grade distributions. Compared to untrained examiners, trained ones assigned the grade of "very good" less often. Furthermore, they displayed a significantly higher variance in the grades given (p=0.007, phi=0.165). This effect is stronger when concentrating specifically on those examiners who took part in the training less than a year before. CONCLUSION The results of this study suggest that the standardized training for examiners at the Medical Faculty of Freiburg is effective for quality assurance. As a consequence, more examiners should be motivated to take part in the training.
Affiliation(s)
- Angela Schickler, Uni Freiburg, Kompetenzzentrum Lehrevaluation in der Medizin Baden-Württemberg, Sitz Freiburg, Freiburg, Germany
- Peter Brüstle, Uni Freiburg, Kompetenzzentrum Lehrevaluation in der Medizin Baden-Württemberg, Sitz Freiburg, Freiburg, Germany
- Silke Biller, Uni Freiburg, Kompetenzzentrum Lehrevaluation in der Medizin Baden-Württemberg, Sitz Freiburg, Freiburg, Germany
5. Biller S, Boeker M, Fabry G, Giesler M. Impact of the Medical Faculty on Study Success in Freiburg: Results from Graduate Surveys. GMS Z Med Ausbild 2015;32:Doc44. PMID: 26483857. PMCID: PMC4606483. DOI: 10.3205/zma000986.
Abstract
AIM Using data from graduate surveys, this study analyzes which factors related to teaching and learning at the Freiburg Faculty of Medicine can influence study success. BACKGROUND Study success and the factors influencing it have long been the subject of investigation, with study success being measured in terms of easily quantifiable indicators (final grades, student satisfaction, etc.). In recent years, it has also frequently been assessed in terms of graduate competency levels. Graduate surveys are considered suitable instruments for measuring these dimensions of study success. METHOD Data from three Freiburg graduate surveys conducted one and a half years after graduation were used for the analysis. Study success was operationalized using four indicators: the score on the written section of the M2 exam, self-assessed medical expertise, self-assessed scientific expertise, and student satisfaction. Using multiple regression analyses, the predictive power of selected variables, also measured by the graduate surveys, was calculated for the different study-success indicators. RESULTS Models were identified that contribute slightly or moderately to the prediction of study success. The score earned on the university entrance qualification proved to be the strongest predictor of the written M2 exam score: R² lies between 0.08 and 0.22 for the three surveys. Different variables specific to degree-program structure and teaching help predict medical expertise (R²=0.04-0.32) and student satisfaction (R²=0.12-0.35). Two variables, the structure and curricular sequencing of the degree program and the combination of theory and practice, are significant, sample-invariant predictors (β-weight(Structure)=0.21-0.58, β-weight(Combination)=0.27-0.56). For scientific expertise, no sample-independent predictors could be determined.
CONCLUSION Factors describing teaching provide hardly any assistance in predicting the written M2 exam score, which makes sense insofar as teaching goes far beyond the heavily knowledge-based content of that exam. The lack of predictability for scientific expertise is most likely explained by the fact that scientific skills have only rarely, and often only implicitly, been included in the curriculum. The combination of theory and practice appears to be significant for imparting medical expertise and for the development of student satisfaction. The extent to which these relationships are practically relevant needs to be explored in further studies. A specific limitation is that the measurement of expertise and skills is based solely on self-assessment.
Affiliation(s)
- Silke Biller, Universität Basel, Medizinische Fakultät, Studiendekanat, Basel, Switzerland
- Martin Boeker, Universitätsklinikum Freiburg, Department für Medizinische Biometrie und Medizinische Informatik, Freiburg, Germany
- Götz Fabry, Albert-Ludwigs-Universität Freiburg, Bereich für Medizinische Psychologie und Medizinische Soziologie, Freiburg, Germany
- Marianne Giesler, Universität Freiburg, Medizinische Fakultät, Studiendekanat, Freiburg, Germany; Kompetenzzentrum Lehrevaluation in der Medizin Baden-Württemberg, Sitz Freiburg, Germany
6. Raes P, Angstwurm M, Berberat P, Kadmon M, Rotgans J, Streitlein-Böhme I, Burckhardt G, Fischer MR. Quality management of clinical-practical instruction for Practical Year medical students in Germany - proposal for a catalogue of criteria from the German Society of Medical Education. GMS Z Med Ausbild 2014;31:Doc49. PMID: 25489349. PMCID: PMC4259068. DOI: 10.3205/zma000941.
Abstract
Objectives: Amended in 2013, the current version of the German Medical Licensure Regulation contains structural specifications that also apply to non-university institutions involved in Practical Year clinical training. The criteria are worded in relatively general terms, and not all of the structural specifications can be readily applied to every subject area. In order to ensure the comparability of Practical Year instruction in Germany, not least in light of recently introduced Practical Year mobility, it is necessary to define consistent quality criteria for Practical Year training. The authors therefore propose a catalogue of criteria for the quality management of Practical Year instruction facilities. Methods: In January 2014, the board of directors of the German Society for Medical Education decided to establish a committee comprised of representatives from various German medical faculties. In a process similar to the Delphi method, the group developed criteria for structure, process and outcome quality in Practical Year training in Germany. Results: The criteria developed for structure, process and outcome quality apply to Practical Year training in academic teaching hospitals and university medical centres. Modalities for their review are also proposed. Conclusions: The present catalogue of criteria is intended to contribute to a basis for quality standards for Practical Year instruction that are as consistent as possible across Germany.
Affiliation(s)
- Patricia Raes, Ludwig Maximilian University of Munich, Faculty of Medicine, Office of the Dean, Munich, Germany
- Pascal Berberat, Technical University of Munich, University Hospital Klinikum Rechts der Isar, Faculty of Medicine, TUM MeDiCAL (Medical Didactics Centre for Educational Research and Teaching), Munich, Germany
- Martina Kadmon, Carl von Ossietzky University Oldenburg, Campus Wechloy, Oldenburg, Germany
- Jerome Rotgans, Committee of the German Society for Medical Education Accreditation and Certification, c/o RWTH Aachen, Faculty of Medicine, Clinic for Conservative Dentistry, Periodontics and Preventative Dentistry, Aachen, Germany
- Irmgard Streitlein-Böhme, Albert-Ludwigs-University Freiburg, Faculty of Medicine, Office of the Dean, Freiburg/Brsg., Germany
- Gerhard Burckhardt, University of Göttingen, Faculty of Medicine, Office of the Dean, Göttingen, Germany
- Martin R Fischer, Ludwig Maximilian University of Munich, Faculty of Medicine, Office of the Dean, Munich, Germany; University Hospital of Munich, Institute for Medical Education, Munich, Germany
7. Heinke W, Rotzoll D, Hempel G, Zupanic M, Stumpp P, Kaisers UX, Fischer MR. Students benefit from developing their own emergency medicine OSCE stations: a comparative study using the matched-pair method. BMC Med Educ 2013;13:138. PMID: 24098996. PMCID: PMC3852440. DOI: 10.1186/1472-6920-13-138.
Abstract
BACKGROUND Students can improve the learning process by developing their own multiple-choice questions. If a similar effect occurs when students create OSCE (objective structured clinical examination) stations themselves, it could be beneficial to involve them in the development of OSCE stations. This study investigates the effect on test performance of students developing emergency medicine OSCE stations. METHOD In the 2011/12 winter semester, an emergency medicine OSCE was held for the first time at the Faculty of Medicine at the University of Leipzig. When preparing for the OSCE, 13 students (the intervention group) developed and tested emergency medicine examination stations as a learning experience. Their subsequent OSCE performance was compared to that of 13 other students (the control group), who were matched for age, gender, semester and level of previous knowledge using the matched-pair method. In addition, both groups were compared to 20 students who tested the OSCE prior to regular emergency medicine training (the test OSCE group). RESULTS There were no differences between the three groups regarding age (24.3 ± 2.6, 24.2 ± 3.4 and 24 ± 2.3 years) or previous knowledge (29.3 ± 3.4, 29.3 ± 3.2 and 28.9 ± 4.7 points in the multiple-choice [MC] exam in emergency medicine). Only the gender distribution differed (8 female and 5 male students in the intervention and control groups vs. 17 female and 3 male students in the test OSCE group). In the exam OSCE, participants in the intervention group scored 233.4 ± 6.3 points (mean ± SD) compared to 223.8 ± 9.2 points in the control group (p < 0.01); Cohen's effect size was d = 1.24. The students of the test OSCE group scored 223.2 ± 13.4 points. CONCLUSIONS Students who actively develop OSCE stations when preparing for an emergency medicine OSCE achieve better exam results.
Affiliation(s)
- Wolfgang Heinke, Department of Anaesthesiology and Intensive Care Medicine, University of Leipzig, Liebigstrasse 20, 04103 Leipzig, Germany
- Daisy Rotzoll, Training Clinic of the Faculty of Medicine, University of Leipzig, Liebigstrasse 27, 04103 Leipzig, Germany
- Gunther Hempel, Department of Anaesthesiology and Intensive Care Medicine, University of Leipzig, Liebigstrasse 20, 04103 Leipzig, Germany
- Michaela Zupanic, Faculty of Health, University of Witten/Herdecke, Alfred-Herrhausen-Straße 50, 58448 Witten, Germany
- Patrick Stumpp, Department of Diagnostic and Interventional Radiology, University of Leipzig, Liebigstrasse 20, 04103 Leipzig, Germany
- Udo X Kaisers, Department of Anaesthesiology and Intensive Care Medicine, University of Leipzig, Liebigstrasse 20, 04103 Leipzig, Germany
- Martin R Fischer, Department of Medical Education, Munich University Hospital, Ludwig-Maximilians-University Munich, Ziemssenstraße 1, 80336 Munich, Germany