1
Gupta SK, Srivastava T. Assessment in Undergraduate Competency-Based Medical Education: A Systematic Review. Cureus 2024; 16:e58073. PMID: 38738047; PMCID: PMC11088485; DOI: 10.7759/cureus.58073.
Abstract
BACKGROUND Few studies have systematically compiled the research on the competency-based medical education (CBME) assessment process or pinpointed knowledge gaps in the structure of that process. The goals of this study were therefore to examine the CBME assessment framework thoroughly and to create a model assessment framework applicable in the Indian setting. METHODS PubMed, MEDLINE (Ovid), EMBASE (Ovid), Scopus, Web of Science, and Google Scholar were searched. The search was restricted to English-language publications on competency-based education and assessment methods published between January 2006 and December 2020. Data synthesis was based on a descriptive, tabular overview of the included studies. RESULTS The searches yielded 732 records, of which 36 fulfilled the inclusion and exclusion criteria. The 36 studies comprised a mix of randomized controlled trials, focus-group interviews, and questionnaire studies, including cross-sectional studies, qualitative studies (3), and mixed-method studies. The papers were published in 10 different journals, with the greatest number (18) in BMC Medical Education. The average quality score of the included studies was 62.53% (range: 35.71-83.33%). Most authors were from the UK (7), followed by the USA (5).
The included studies were grouped into seven categories based on their dominant focus: moving from a behaviorist to a constructivist approach to assessment (1 study); formative assessment (FA) and feedback (10 studies); hurdles in the implementation of feedback (4 studies); computer- or online-based formative tests with automated feedback (5 studies); video feedback (2 studies); e-learning platforms for formative assessment (4 studies); and workplace-based assessment (WBA), the mini-clinical evaluation exercise (mini-CEX), and direct observation of procedural skills (DOPS) (10 studies). CONCLUSIONS Various constructivist techniques, such as concept maps, portfolios, and rubrics, can be used for assessment. Self-regulated learning, peer feedback, online formative assessment, computer-based formative tests with automated feedback, computerized web-based objective structured clinical examination (OSCE) evaluation systems, and narrative feedback in place of numerical mini-CEX scores are all ways to increase student involvement in the design and implementation of formative assessment.
Affiliation(s)
- Sandeep K Gupta
- Pharmacology, Heritage Institute of Medical Sciences, Varanasi, IND
2
Shafqat S, Tejani I, Ali M, Tariq H, Sabzwari S. Feasibility and Effectiveness of Mini-Clinical Evaluation Exercise (Mini-CEX) in an Undergraduate Medical Program: A Study From Pakistan. Cureus 2022; 14:e29563. PMID: 36312643; PMCID: PMC9595266; DOI: 10.7759/cureus.29563.
Abstract
Background In clinical settings, direct observation (DO) with feedback is an effective method to assess and improve learner performance. One tool used for DO is the mini-clinical evaluation exercise (Mini-CEX). We conducted a study to assess the effectiveness and feasibility of the Mini-CEX for medical students at Aga Khan University, Karachi. Methods Using purposive sampling, 199 students in six core clerkships of Years 3 and 4 were selected for this study. Participating faculty underwent training workshops on the use of the Mini-CEX and feedback strategies. Each student was assessed twice by one faculty member, using a modified version of the Mini-CEX covering four domains of clinical skills: Data Gathering, Communication, Diagnosis/Differential, and Management Plan and Organization. Feedback was given after each encounter. Faculty and students also provided detailed feedback on the process of DO. Data were analyzed using Statistical Package for Social Sciences (SPSS) version 26 (IBM Corp., Armonk, NY, USA), with categorical variables presented as frequencies and percentages. The chi-squared test was used for further statistical analyses, and a P-value of < 0.05 was considered statistically significant. Effectiveness was assessed via the change in student performance between the first and second Mini-CEX, and feasibility via qualitative feedback. Results We obtained three sets of data: 523 Mini-CEX forms, of which 350 evaluations were included for analysis (216 from Year 3 and 134 from Year 4), and feedback on DO from 70 students and 18 faculty. Year 3 students performed significantly better in all foci of the Mini-CEX between the first and second assessment (P ≤ 0.001), whereas in Year 4 significant improvement was limited to two domains [Communication of History/Physical Examination (P = 0.040) and Diagnosis/Differential and Management Plan (P < 0.001)].
Students (65.7%) and faculty (94.4%) felt the exercise improved their interaction. 83.3% of faculty recommended its formal implementation, compared with 27.1% of students; students reported challenges in implementing the Mini-CEX such as time constraints, logistics, the subjectivity of assessment, and varying faculty interest. Conclusion Direct observation using the Mini-CEX is effective in improving the clinical and diagnostic skills of medical students and strengthens student-faculty interaction. While challenges exist in its implementation, strategic placement of the Mini-CEX may enhance its utility in measuring student competency.
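A note on the method: the chi-squared comparisons reported in this study operate on contingency tables of categorical outcomes. A minimal sketch follows; the 2x2 table (satisfactory vs. needs-improvement counts at the first and second encounter) is invented for illustration and is not data from the paper:

```python
# Hedged sketch of a Pearson chi-squared test on a contingency table.
# The counts below are hypothetical, not taken from the study.

def chi_squared(table):
    """Pearson chi-squared statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

observed = [[120, 96],   # first encounter: satisfactory, needs improvement
            [160, 56]]   # second encounter
stat = chi_squared(observed)
# df = (2 - 1) * (2 - 1) = 1; the 5% critical value is 3.841
print(stat > 3.841)  # True -> performance differs between encounters
```

With these made-up counts the statistic is about 16.2, well above the critical value; a library routine such as scipy's `chi2_contingency` would additionally return an exact p-value (and apply Yates' correction for 2x2 tables by default).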
3
Véliz C, Fuentes-Cimma J, Fuentes-López E, Riquelme A. Adaptation, psychometric properties, and implementation of the Mini-CEX in dental clerkship. J Dent Educ 2020; 85:300-310. PMID: 33094514; DOI: 10.1002/jdd.12462.
Abstract
BACKGROUND Workplace-based assessment is a key component of dental-student clerkships, allowing students to demonstrate clinical proficiency. PURPOSE This study adapted the Mini-Clinical Evaluation Exercise (Mini-CEX) to a dentistry-program clerkship, analyzing the results and examining the psychometric properties of the adapted instrument. METHODS First, Delphi panel methodology was used to ensure content validity. The Mini-CEX was then piloted in the dental-clerkship program, with each student assessed by at least 2 supervisors and a peer student. Subsequently, psychometric properties, acceptability, and observation time were analyzed. RESULTS The study was conducted between July and November 2019. Overall, 140 Mini-CEX evaluation exercises were carried out on 30 students by 84 supervisors and 56 peers. The adapted instrument was found to be unidimensional, with acceptable internal consistency (α = 0.74). Observation time differed by assessor type; the medians (Q1-Q3) were 10 minutes (5-15) for supervisors and 30 minutes (20-45) for peer students (P < 0.001). A difference was also observed in assessor perceptions (P < 0.001), with supervisors scoring a median of 6 (6-6.75) and peer students a median of 7 (6-7). No differences were found between supervisor and peer scores of student performance. CONCLUSION The adapted version of the Mini-CEX can objectively assess the clinical performance of dental students, achieving validity and reliability values similar to those of the original instrument.
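For readers who want to see where a figure like the α = 0.74 above comes from: Cronbach's alpha is computed from the item variances and the total-score variance. A minimal sketch; the four "domain" score lists below are invented, not study data:

```python
# Hedged sketch: Cronbach's alpha for k items scored across examinees.
# The ratings are hypothetical, purely to demonstrate the formula.

def cronbach_alpha(items):
    """items: k lists, each holding one item's scores for the same examinees."""
    k = len(items)
    n = len(items[0])

    def variance(xs):  # sample variance, ddof = 1
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_var = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum_item_var / variance(totals))

# four assessment domains scored 1-7 for six examinees (hypothetical)
ratings = [
    [5, 6, 4, 6, 5, 7],
    [5, 5, 4, 6, 6, 7],
    [4, 6, 3, 5, 5, 6],
    [6, 6, 4, 6, 5, 7],
]
alpha = cronbach_alpha(ratings)
print(round(alpha, 2))  # → 0.94 for these made-up ratings
```

Values around 0.7 or above are conventionally read as acceptable internal consistency, which is how the study interprets its 0.74.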
Affiliation(s)
- Claudia Véliz
- School of Dentistry, Faculty of Medicine, Pontificia Universidad Católica de Chile, Santiago, Chile
- Javiera Fuentes-Cimma
- Physiotherapy Program, Department of Health Sciences, Faculty of Medicine, Pontificia Universidad Católica de Chile, Santiago, Chile
- Eduardo Fuentes-López
- Speech-Language Pathology Program, Department of Health Sciences, Faculty of Medicine, Pontificia Universidad Católica de Chile, Santiago, Chile
- Arnoldo Riquelme
- Department of Gastroenterology, Centre for Medical Education and Health Sciences, Department of Health Sciences, Faculty of Medicine, Pontificia Universidad Católica de Chile, Santiago, Chile
4
Hyde C, Yardley S, Lefroy J, Gay S, McKinley RK. Clinical assessors' working conceptualisations of undergraduate consultation skills: a framework analysis of how assessors make expert judgements in practice. Adv Health Sci Educ Theory Pract 2020; 25:845-875. PMID: 31997115; PMCID: PMC7471149; DOI: 10.1007/s10459-020-09960-3.
Abstract
Undergraduate clinical assessors make expert, multifaceted judgements of consultation skills in concert with medical school OSCE grading rubrics. Assessors are not cognitive machines: their judgements are made in the light of prior experience and social interactions with students. It is therefore important to understand assessors' working conceptualisations of consultation skills and whether these could be used to develop tools for undergraduate assessment. The aims were to identify the working conceptualisations that assessors use while assessing undergraduate medical students' consultation skills, and to develop assessment tools based on assessors' working conceptualisations and natural language. In semi-structured interviews, 12 experienced assessors from a UK medical school populated a blank assessment scale with personally meaningful descriptors while describing how they made judgements of students' consultation skills (at exit standard). A two-step iterative thematic framework analysis was performed, drawing on constructionism and interactionism. Five domains were found within working conceptualisations of consultation skills: Application of knowledge; Manner with patients; Getting it done; Safety; and Overall impression. Three mechanisms of judgement about student behaviour were identified: observations, inferences and feelings. Assessment tools drawing on participants' conceptualisations and natural language were generated, including 'grade descriptors' for common conceptualisations in each domain by mechanism of judgement, matched to grading rubrics of Fail, Borderline, Pass and Very good. Utilising working conceptualisations to develop assessment tools is feasible and potentially useful; further work is needed to test the impact on assessment quality.
Affiliation(s)
- Catherine Hyde
- School of Medicine, Keele University, Keele, Staffordshire, ST5 5BG, UK
- Sarah Yardley
- School of Medicine, Keele University, Keele, Staffordshire, ST5 5BG, UK
- Palliative Care Service, Central and North West London NHS Foundation Trust, St Pancras Hospital, 5th Floor South Wing, 4 St. Pancras Way, London, NW1 0PE, UK
- Janet Lefroy
- School of Medicine, Keele University, Keele, Staffordshire, ST5 5BG, UK
- Simon Gay
- University of Leicester School of Medicine, Leicester, UK
- Robert K McKinley
- School of Medicine, Keele University, Keele, Staffordshire, ST5 5BG, UK
5
Komasawa N, Terasaki F, Nakano T, Kawata R. Relationships between objective structured clinical examination, computer-based testing, and clinical clerkship performance in Japanese medical students. PLoS One 2020; 15:e0230792. PMID: 32214357; PMCID: PMC7098585; DOI: 10.1371/journal.pone.0230792.
Abstract
Background It is unclear how comprehensive evaluations conducted prior to clinical clerkships (CC), such as the objective structured clinical examination (OSCE) and computer-based testing (CBT), reflect the performance of medical students in CC. Here we retrospectively analyzed correlations between OSCE and CBT scores and CC performance. Methods Ethical approval was obtained from our institutional review board. We analyzed correlations between OSCE and CBT scores and CC performance in 94 medical students who took the OSCE and CBT in 2017 as 4th-year students and who participated in the basic CC in 2018 as 5th-year students. Results Total scores for the OSCE and CBT were significantly correlated with CC performance (P < 0.001 for each). More specifically, the medical interview and chest examination components of the OSCE were significantly correlated with CC performance (P = 0.001 for each), while the remaining five components were not. Conclusion Our findings suggest that the OSCE and CBT play important roles in predicting CC performance in the Japanese medical education context. Among OSCE components, the medical interview and chest examination appear most important for predicting CC performance.
Affiliation(s)
- Nobuyasu Komasawa
- Medical Education Center, Osaka Medical College, Takatsuki, Japan
- Fumio Terasaki
- Medical Education Center, Osaka Medical College, Takatsuki, Japan
- Takashi Nakano
- Medical Education Center, Osaka Medical College, Takatsuki, Japan
- Ryo Kawata
- Medical Education Center, Osaka Medical College, Takatsuki, Japan
6
Mortaz Hejri S, Jalili M, Masoomi R, Shirazi M, Nedjat S, Norcini J. The utility of mini-Clinical Evaluation Exercise in undergraduate and postgraduate medical education: A BEME review: BEME Guide No. 59. Med Teach 2020; 42:125-142. PMID: 31524016; DOI: 10.1080/0142159x.2019.1652732.
Abstract
Background: This BEME review explores, analyzes, and synthesizes the evidence on the utility of the mini-CEX for assessing undergraduate and postgraduate medical trainees, specifically its reliability, validity, educational impact, acceptability, and cost. Methods: This registered BEME review applied a systematic search strategy in seven databases to identify studies on the validity, reliability, educational impact, acceptability, or cost of the mini-CEX. Data extraction and quality assessment were carried out by two authors; discrepancies were resolved by a third reviewer. Descriptive synthesis was mainly used to address the review questions, and a meta-analysis was performed for Cronbach's alpha. Results: Fifty-eight papers were included. Only two studies evaluated all five utility criteria. Forty-seven (81%) of the included studies met seven or more of the quality criteria. Cronbach's alpha ranged from 0.58 to 0.97 (weighted mean = 0.90). Reported G coefficients, standard errors of measurement, and confidence intervals were diverse, varying with the number of encounters and the nested or crossed design of the study. The calculated number of encounters needed for a desirable G coefficient also varied greatly. Content coverage was reported as satisfactory in several studies. The mini-CEX discriminated between various levels of competency. Factor analyses revealed a single dimension, and the six competencies correlated highly and significantly with overall competence. Moderate to high correlations between mini-CEX scores and other clinical exams were reported, and the mini-CEX improved students' performance in other examinations. By providing a framework for structured observation and feedback, the mini-CEX exerts a favorable educational impact. Included studies revealed that feedback was provided in most encounters, but its quality was questionable. Completion rates were generally above 50%, and feasibility and high satisfaction were reported. Conclusion: The mini-CEX has reasonable validity, reliability, and educational impact. Acceptability and feasibility should be interpreted in light of the required number of encounters.
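In its simplest form, a pooled alpha like the weighted mean of 0.90 reported in this review is a sample-size-weighted average of study-level coefficients (a formal meta-analysis would usually transform the alphas before pooling). A sketch with invented (alpha, n) pairs, not the review's data:

```python
# Hedged sketch: sample-size-weighted mean of study-level Cronbach's alphas.
# The (alpha, n) pairs are hypothetical, purely to demonstrate the weighting.
studies = [(0.58, 40), (0.85, 120), (0.92, 300), (0.97, 200)]

weighted_mean = sum(a * n for a, n in studies) / sum(n for _, n in studies)
print(round(weighted_mean, 3))  # → 0.902 for these made-up studies
```

Note how the large-n studies dominate: the low alpha of 0.58 barely moves the pooled value because its study contributes only 40 of the 660 weighted observations.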
Affiliation(s)
- Sara Mortaz Hejri
- Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Mohammad Jalili
- Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Department of Emergency Medicine, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Rasoul Masoomi
- Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Mandana Shirazi
- Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Department of Clinical Science and Education at SOS Hospital, Karolinska Institutet, Stockholm, Sweden
- Saharnaz Nedjat
- Department of Epidemiology and Biostatistics, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran
- John Norcini
- Foundation for Advancement of International Medical Education and Research (FAIMER), Philadelphia, PA, USA
7
Lee V, Brain K, Martin J. From opening the 'black box' to looking behind the curtain: cognition and context in assessor-based judgements. Adv Health Sci Educ Theory Pract 2019; 24:85-102. PMID: 30302670; DOI: 10.1007/s10459-018-9851-0.
Abstract
The increasing use of direct observation tools to assess routine performance has resulted in a growing reliance on assessor-based judgements in the workplace. However, we have a limited understanding of how assessors make judgements and formulate ratings in real-world contexts. Current research on assessor cognition has largely focused on the cognitive domain, but contextual factors are equally important, and the two are closely interconnected. This study aimed to explore the perceived cognitive and contextual factors influencing Mini-CEX assessor judgements in the Emergency Department setting. We used a conceptual framework of assessor-based judgement to develop a sequential mixed-methods study, analysing and integrating survey and focus-group results to illustrate self-reported cognitive and contextual factors influencing assessor judgements. Situated cognition theory served as a sensitizing lens to explore the interactions between people and their environment. The major factors highlighted were: clarity of the assessment, reliance on and variable approaches to overall impression (gestalt), role tension especially when giving constructive feedback, prior knowledge of the trainee, and case complexity. We identified prevailing tensions between participants (assessors and trainees), interactions (assessment and feedback) and setting. The two practical implications of our research are the need to broaden assessor training to incorporate both cognitive and contextual domains, and the need to develop a more holistic understanding of assessor-based judgements in real-world contexts to better inform future research and development in workplace-based assessments.
Affiliation(s)
- Victor Lee
- Department of Emergency Medicine, Austin Health, P.O. Box 5555, Heidelberg, VIC, 3084, Australia
- Jenepher Martin
- Eastern Health Clinical School, Monash University and Deakin University, Box Hill, VIC, Australia
8
Duijn CCMA, Dijk EJV, Mandoki M, Bok HGJ, Cate OTJT. Assessment Tools for Feedback and Entrustment Decisions in the Clinical Workplace: A Systematic Review. J Vet Med Educ 2019; 46:340-352. PMID: 31460844; DOI: 10.3138/jvme.0917-123r.
Abstract
BACKGROUND: Entrustable Professional Activities (EPAs) combine feedback and evaluation with permission to act under a specified level of supervision and the possibility of scheduling learners for clinical service. This literature review aims to identify workplace-based assessment tools that indicate progression toward unsupervised practice and are suitable for entrustment decisions and feedback to learners. METHODS: A systematic search was performed in the PubMed, Embase, ERIC, and PsycINFO databases. Articles were selected on title/abstract and full text using predetermined inclusion and exclusion criteria. Information on workplace-based assessment tools was extracted using data coding sheets, and the methodological quality of studies was assessed using the Medical Education Research Study Quality Instrument (MERSQI). RESULTS: The search yielded 6,371 articles (180 were evaluated in full text). In total, 80 articles were included, identifying 67 assessment tools. Only a few studies explicitly mentioned assessment tools used as a resource for entrustment decisions. Validity evidence was frequently reported, and the MERSQI score was 10.0 on average. CONCLUSIONS: Many workplace-based assessment tools were identified that could support learners with feedback on their development and help supervisors provide it. As expected, only a few articles referred to entrustment decisions. Nevertheless, the existing tools, or their principles, could be used to inform entrustment decisions, supervision levels, or autonomy.
9
Burggraf M, Kristin J, Wegner A, Beck S, Herbstreit S, Dudda M, Jäger M, Kauther MD. Willingness of medical students to be examined in a physical examination course. BMC Med Educ 2018; 18:246. PMID: 30373579; PMCID: PMC6206683; DOI: 10.1186/s12909-018-1353-5.
Abstract
BACKGROUND Physical examination courses are an essential part of the education of medical students. The aim of this study was to ascertain the factors influencing students' motivation and willingness to participate in a physical examination course. METHODS Students were asked to complete a questionnaire subdivided into five domains: anthropometric data, religiousness, motivation to take part in physical examination courses, willingness to be physically examined at 11 different body regions by peers or a professional tutor, and a field for free text. RESULTS The questionnaire was completed by 142 medical students. The importance of the examination course was rated 8.7 / 10 points; the score for students' motivation was 7.8 / 10 points. Willingness to be physically examined ranged from 6% to 100% depending on body part and examiner. Female students were significantly less willing to be examined at sensitive body parts (breast, upper body, groin and the hip joint; p = .003 to < .001), depending on group composition and/or examiner. Strictly religious students showed significantly less willingness to undergo examination of any part of the body except the hand (p = .02 to < .001). Willingness to be examined was comparable for normal-weight and under-/overweight students in general (80% vs. 77%). Concerning group composition for physical examination skills courses, students preferred self-assembled groups over mixed-gender and same-gender groups. CONCLUSIONS Peer physical examination is a method to improve students' skills. While motivation to participate in and acceptance of the physical examination course appear to be high, willingness to be examined is low for certain parts of the body, e.g. breast and groin, depending on religiousness, gender and examiner. Examination by a professional medical tutor did not lead to higher acceptance, and most students would prefer to choose their own team for physical examination courses rather than be assigned to a group.
Affiliation(s)
- Manuel Burggraf
- Department of Orthopaedics and Trauma Surgery, University Hospital Essen, University of Duisburg-Essen, Hufelandstr. 55, 45147 Essen, Germany
- Julia Kristin
- Department of Otorhinolaryngology, University Hospital Duesseldorf, Heinrich Heine University Duesseldorf, Duesseldorf, Germany
- Alexander Wegner
- Department of Orthopaedics and Trauma Surgery, University Hospital Essen, University of Duisburg-Essen, Hufelandstr. 55, 45147 Essen, Germany
- Sascha Beck
- Department of Orthopaedics and Trauma Surgery, University Hospital Essen, University of Duisburg-Essen, Hufelandstr. 55, 45147 Essen, Germany
- Stephanie Herbstreit
- Department of Orthopaedics and Trauma Surgery, University Hospital Essen, University of Duisburg-Essen, Hufelandstr. 55, 45147 Essen, Germany
- Marcel Dudda
- Department of Orthopaedics and Trauma Surgery, University Hospital Essen, University of Duisburg-Essen, Hufelandstr. 55, 45147 Essen, Germany
- Marcus Jäger
- Department of Orthopaedics and Trauma Surgery, University Hospital Essen, University of Duisburg-Essen, Hufelandstr. 55, 45147 Essen, Germany
- Max Daniel Kauther
- Department of Orthopaedics and Trauma Surgery, University Hospital Essen, University of Duisburg-Essen, Hufelandstr. 55, 45147 Essen, Germany
10
Schüler IM, Heinrich-Weltzien R, Eiselt M. Effect of individual structured and qualified feedback on improving clinical performance of dental students in clinical courses-randomised controlled study. Eur J Dent Educ 2018; 22:e458-e467. PMID: 29424934; DOI: 10.1111/eje.12325.
Abstract
AIM To analyse the effect of individual structured and qualified feedback (FB) on the development of dental students' practical skills during clinical courses. METHODS Fifty-three final-year dental students at Jena University Hospital participated in this prospective randomised controlled interventional study. Participants were randomly allocated to an intervention group (IG) and a control group (CG). Two calibrated assessors evaluated 128 pre- and post-assessments of 4 different dental treatment steps performed by the students during the integrated clinical course in restorative dentistry and prosthodontics and the clinical course in paediatric dentistry. The assessment included direct observation with graded and non-grading evaluation and was documented with a specific FB assessment tool. Students in the IG received elaborated, structured and qualified FB after the pre-assessment, focussing on individual strengths and weaknesses, providing specific suggestions for improvement and establishing a personal learning goal. RESULTS Students in both groups significantly enhanced their performance, but the improvement was greater in the IG than in the CG. Large effect sizes (ES) were observed for all items; FB had the largest effect on technical skills (ES = 1.6), followed by management (ES = 1.3) and communication skills (ES = 0.8). The factors with the greatest influence on FB's effectiveness in enhancing clinical performance were students' insight into their own mistakes or omissions, the observed dental treatment step, and the duration of FB. CONCLUSION Individual structured and qualified FB is an effective method to enhance dental students' professional performance and to individually guide the learning process.
Affiliation(s)
- I M Schüler
- Department of Preventive and Paediatric Dentistry, Jena University Hospital, Jena, Germany
- R Heinrich-Weltzien
- Department of Preventive and Paediatric Dentistry, Jena University Hospital, Jena, Germany
- M Eiselt
- Deanery, Medical Faculty, Friedrich-Schiller-University Jena, Jena, Germany
11
Berendonk C, Rogausch A, Gemperli A, Himmel W. Variability and dimensionality of students' and supervisors' mini-CEX scores in undergraduate medical clerkships - a multilevel factor analysis. BMC Med Educ 2018; 18:100. PMID: 29739387; PMCID: PMC5941409; DOI: 10.1186/s12909-018-1207-1.
Abstract
BACKGROUND The mini clinical evaluation exercise (mini-CEX), a tool used to assess student-patient encounters, is increasingly applied as a learning device to foster clinical competencies. Although the importance of eliciting self-assessment for learning is widely acknowledged, little is known about the validity of self-assessed mini-CEX scores. The aims of this study were (1) to explore the variability of medical students' self-assessed mini-CEX scores and compare them with the scores obtained from their clinical supervisors, and (2) to ascertain whether learners' self-assessed mini-CEX scores represent a global dimension of clinical competence or discrete clinical skills. METHODS In year 4, medical students conducted one to three mini-CEX per clerkship in gynaecology, internal medicine, paediatrics, psychiatry and surgery. Students and clinical supervisors rated the students' performance on a 10-point scale (1 = great need for improvement; 10 = little need for improvement) in the six domains history taking, physical examination, counselling, clinical judgement, organisation/efficiency and professionalism, as well as overall performance. Correlations between students' self-ratings and supervisors' ratings were calculated (Pearson's correlation coefficient) from averaged scores per domain and overall. To investigate the dimensionality of the mini-CEX domain scores, we performed factor analyses using linear mixed models that accounted for the multilevel structure of the data. RESULTS A total of 1773 mini-CEX from 164 students were analysed. Mean scores for the six domains ranged from 7.5 to 8.3 (student ratings) and from 8.8 to 9.3 (supervisor ratings). Correlations between the ratings of students and supervisors for the different domains varied between r = 0.29 and 0.51 (all p < 0.0001).
Mini-CEX domain scores revealed a single-factor solution for both students' and supervisors' ratings, with high loadings of all six domains, between 0.58 and 0.83 (students) and 0.58 and 0.84 (supervisors). CONCLUSIONS These findings call into question the validity of mini-CEX domain scores for formative purposes, as neither the scores obtained from students nor those from clinical supervisors revealed the specific strengths and weaknesses of individual students' clinical competence.
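The student-supervisor correlations reported here (r = 0.29 to 0.51 per domain) are plain Pearson coefficients over paired scores. A self-contained sketch; the rating pairs below are invented, not study data:

```python
# Hedged sketch: Pearson's r between paired student self-ratings and
# supervisor ratings on one mini-CEX domain. The pairs are hypothetical.
from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

students    = [7, 8, 6, 9, 7, 8, 5, 8]   # self-ratings, 10-point scale
supervisors = [9, 9, 8, 9, 8, 9, 8, 9]   # supervisor ratings
r = pearson_r(students, supervisors)
print(round(r, 2))  # → 0.81 for these made-up pairs
```

Note that r measures only how the two raters co-vary, not whether they agree in level: in this toy example the supervisor is systematically more lenient (as in the study, where supervisor means exceeded student means), which r alone cannot reveal.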
Affiliation(s)
- Christoph Berendonk
- Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Konsumstrasse 13, 3010 Bern, CH Switzerland
- Anja Rogausch
- Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Konsumstrasse 13, 3010 Bern, CH Switzerland
- Armin Gemperli
- Department of Health Sciences and Health Policy, University of Lucerne, Lucerne, Switzerland
- Swiss Paraplegic Research, Nottwil, Switzerland
- Wolfgang Himmel
- Department of General Practice, University Medical Center Göttingen, Göttingen, Germany
12
Humphrey-Murto S, Côté M, Pugh D, Wood TJ. Assessing the Validity of a Multidisciplinary Mini-Clinical Evaluation Exercise. Teaching and Learning in Medicine 2018; 30:152-161. [PMID: 29240463 DOI: 10.1080/10401334.2017.1387553] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/07/2023]
Abstract
UNLABELLED Construct: The purpose of this study was to provide validity evidence for the mini-clinical evaluation exercise (mini-CEX) as an assessment tool for clinical skills in the workplace. BACKGROUND Previous research has demonstrated validity evidence for the mini-CEX, but most studies were carried out in internal medicine or single disciplines, therefore limiting generalizability of the findings. If the mini-CEX is to be used in multidisciplinary contexts, then validity evidence should be gathered in similar settings. The purpose of this study was to gather further validity evidence for the mini-CEX but in a broader context. Specifically we sought to explore the effects of discipline and rater type on mini-CEX scores, internal structure, and the relationship between mini-CEXs and OSCEs in a multidisciplinary context. APPROACH During clerkship, medical students completed eight different rotations (family medicine, internal medicine, surgery, psychiatry, pediatrics, emergency, anesthesiology and obstetrics and gynecology). During each rotation, mini-CEX forms and a written examination were completed. Two multidisciplinary OSCEs (in Clerkship Year 3 and start of Year 4) assessed clinical skills. The reliability of the mini-CEX was assessed using Generalizability analyses. To assess the influence of discipline and rater type, mean scores were analyzed using a factorial analysis of variance. The total mini-CEX score was correlated to scores from the students' respective OSCEs and corresponding written exams. RESULTS Eighty-two students met inclusion criteria for a total of 781 ratings (average of 9.82 mini-CEX forms per student). There was a significant effect of discipline (p < .001, = .16), and faculty provided lower scores than nonfaculty raters (7.12 vs. 7.41; p = .002, = .02). The g-coefficient was .53 when discipline was included as a facet and .23 when rater type was a facet. 
There were low but statistically significant correlations between the mini-CEX and scores for the 4th-year OSCE Total Score and the OSCE communication scores, r(80) = .40, p < .001 and r(80) = .29, p = .009. The mini-CEX was not correlated with the written examination scores for any of the disciplines. CONCLUSIONS Our results provide conflicting findings for validity evidence for the mini-CEX. Mini-CEX ratings were correlated with multidisciplinary OSCEs but not written examinations, supporting the validity argument. However, the reliability of the mini-CEX was low to moderate, and error accounted for the greatest amount of variability in scores. Scores varied by discipline, and resident raters gave higher scores than faculty. These results should be taken into account when deciding whether to use the mini-CEX in different contexts.
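The generalizability analysis reported above rests on estimating variance components from a crossed design. A minimal sketch of a relative G-coefficient for a fully crossed persons x raters design, using hypothetical mini-CEX scores (not the study's data), looks like this:

```python
from statistics import mean

# Hypothetical mini-CEX scores: rows = students (persons), columns = raters,
# fully crossed (every rater scores every student), no replication.
scores = [
    [7, 8, 7],
    [9, 9, 8],
    [6, 7, 6],
    [8, 9, 9],
    [5, 6, 6],
]
n_p, n_r = len(scores), len(scores[0])

grand = mean(v for row in scores for v in row)
p_means = [mean(row) for row in scores]
r_means = [mean(scores[p][r] for p in range(n_p)) for r in range(n_r)]

# Sums of squares for the two-way crossed design.
ss_p = n_r * sum((m - grand) ** 2 for m in p_means)
ss_r = n_p * sum((m - grand) ** 2 for m in r_means)
ss_tot = sum((scores[p][r] - grand) ** 2
             for p in range(n_p) for r in range(n_r))
ss_pr = ss_tot - ss_p - ss_r  # person x rater interaction confounded with error

ms_p = ss_p / (n_p - 1)
ms_pr = ss_pr / ((n_p - 1) * (n_r - 1))

# Variance components from expected mean squares.
var_p = max((ms_p - ms_pr) / n_r, 0.0)  # true between-student variance
var_pr = ms_pr                          # residual/error variance

# Relative G-coefficient for a score averaged over n_r raters:
# the proportion of observed-score variance attributable to students.
g = var_p / (var_p + var_pr / n_r)
print(f"G-coefficient over {n_r} raters: {g:.2f}")
```

A real generalizability study, like the one above with discipline or rater type as additional facets, estimates more variance components, which is why the reported g-coefficients differ so much (.53 vs. .23) depending on which facet is modelled.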
Affiliation(s)
- Mylène Côté
- Department of Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Debra Pugh
- Department of Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Timothy J Wood
- Department of Innovation in Medical Education, University of Ottawa, Ottawa, Ontario, Canada
13
Lee V, Brain K, Martin J. Factors Influencing Mini-CEX Rater Judgments and Their Practical Implications: A Systematic Literature Review. Academic Medicine: Journal of the Association of American Medical Colleges 2017; 92:880-887. [PMID: 28030422 DOI: 10.1097/acm.0000000000001537] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
PURPOSE At present, little is known about how mini-clinical evaluation exercise (mini-CEX) raters translate their observations into judgments and ratings. The authors of this systematic literature review aim both to identify the factors influencing mini-CEX rater judgments in the medical education setting and to translate these findings into practical implications for clinician assessors. METHOD The authors searched for internal and external factors influencing mini-CEX rater judgments in the medical education setting from 1980 to 2015 using the Ovid MEDLINE, PsycINFO, ERIC, PubMed, and Scopus databases. They extracted the following information from each study: country of origin, educational level, study design and setting, type of observation, occurrence of rater training, provision of feedback to the trainee, research question, and identified factors influencing rater judgments. The authors also conducted a quality assessment for each study. RESULTS Seventeen articles met the inclusion criteria. The authors identified both internal and external factors that influence mini-CEX rater judgments. They subcategorized the internal factors into intrinsic rater factors, judgment-making factors (conceptualization, interpretation, attention, and impressions), and scoring factors (scoring integration and domain differentiation). CONCLUSIONS The current theories of rater-based judgment have not helped clinicians resolve the issues of rater idiosyncrasy, bias, gestalt, and conflicting contextual factors; therefore, the authors believe the most important solution is to increase the justification of rater judgments through the use of specific narrative and contextual comments, which are more informative for trainees. Finally, more real-world research is required to bridge the gap between the theory and practice of rater cognition.
Affiliation(s)
- Victor Lee
- V. Lee is codirector of emergency medicine training, Department of Emergency Medicine, Austin Health, Heidelberg, Victoria, Australia. K. Brain is a doctor, South West Healthcare, Warrnambool, Victoria, Australia. J. Martin is associate professor and director, Medical Student Programs, Monash University and Deakin University, Eastern Health Clinical School, Box Hill, Victoria, Australia.