1. Gu Y, Tenenbein M, Korz L, Busse JW, Chiu M. Simulation-based medical education in Canadian anesthesiology academic institutions: a national survey. Can J Anaesth 2024. PMID: 38453798. DOI: 10.1007/s12630-024-02720-6.
Abstract
PURPOSE Simulation-based medical education (SBME) is provided by all anesthesiology residency programs in Canada. The purpose of this study was to characterize SBME in Canadian anesthesiology residency training programs. METHODS We administered a 21-question survey to the simulation director/coordinator for all 17 Canadian academic departments of anesthesiology from October 2019 to January 2020. The survey consisted of questions pertaining to the characteristics of the simulation centres, their faculty, learners, curriculum, and assessment processes. RESULTS All 17 residency training programs participated in the survey and reported large variability in the number and formal training of simulation faculty and in content delivery. Five programs (29%) did not provide faculty recognition for curriculum design and running simulation sessions. Most programs offered one to four simulation sessions per academic year for each year of residency. All programs offered mannequin-based and part-task trainers for teaching technical and nontechnical skills. Fourteen programs (82%) offered interprofessional and interdisciplinary simulation sessions, and ten programs (59%) did not include in situ simulation training. Commonly reported barriers to faculty involvement were lack of protected time (12 programs, 71%), lack of financial compensation (ten programs, 59%), and lack of appreciation for SBME (seven programs, 41%). CONCLUSION Large variability exists in the delivery of SBME in Canadian anesthesiology residency simulation programs, in part because of differences in financial/human resources and educational content. Future studies should explore whether training and patient outcomes differ between SBME programs and, if so, whether additional standardization is warranted.
Affiliation(s)
- Yuqi Gu
- Department of Anesthesiology and Pain Medicine, University of Ottawa and The Ottawa Hospital, 501 Smyth Rd, Critical Care Wing 1401, Ottawa, ON, K1H 8L6, Canada.
- Marshall Tenenbein
- Department of Anesthesiology, Perioperative and Pain Medicine, University of Manitoba, Winnipeg, MB, Canada
- Linda Korz
- Department of Anesthesia, McMaster University, Hamilton, ON, Canada
- Jason W Busse
- Department of Anesthesia, McMaster University, Hamilton, ON, Canada
- Michelle Chiu
- Department of Anesthesiology and Pain Medicine, University of Ottawa and The Ottawa Hospital, Ottawa, ON, Canada
- Department of Innovation in Medical Education, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada
2. Lopez BM, Fahy BG. Preoperative planning conversations (POPCs): A tool for certification success? J Clin Anesth 2023; 88:111148. PMID: 37209543. DOI: 10.1016/j.jclinane.2023.111148.
Affiliation(s)
- Brandon M Lopez
- Department of Anesthesiology, University of Florida College of Medicine, PO Box 100254, Gainesville, FL 32610-0254, United States of America
- Brenda G Fahy
- Department of Anesthesiology, University of Florida College of Medicine, PO Box 100254, Gainesville, FL 32610-0254, United States of America
3. Rochlen LR, Woodrum DT, Zisblatt L. American Board of Anesthesiology Mock Standardized Oral Examination Faculty Development Workshop. MedEdPORTAL 2021; 17:11173. PMID: 34395854. PMCID: PMC8319152. DOI: 10.15766/mep_2374-8265.11173.
Abstract
INTRODUCTION Preparation for the oral board examination is an important part of residency training. Anesthesiology programs provide mock oral exams for their trainees, but faculty often have little guidance on the conduct of these exams. We describe a faculty development workshop for anesthesiology faculty to enhance their familiarity with the American Board of Anesthesiology Standardized Oral Examination (SOE). METHODS We created a faculty development workshop delivered to a live audience. The session consisted of didactic and practical components. A one-page tip sheet was also distributed to all faculty administering mock SOEs, for review and reference prior to administering an exam. Faculty and residents were surveyed before and after the session. RESULTS Eleven faculty participated in the live session. Eighty-two percent of faculty (nine of 11) committed to making a change in the way they delivered mock SOEs as a result of attending the session. Fifty-eight percent of faculty (32 of 55) who responded to the postintervention survey reported that they used the tip sheet prior to administering a subsequent mock SOE. Residents described improvement in the clarity and organization of feedback following the intervention. DISCUSSION Faculty members play a vital role in preparing residents for board certification. It is therefore important that faculty are appropriately oriented to the goals and conduct of the mock SOE. After taking this workshop, faculty members will be more likely to adapt their examiner style to focus on the ABA-defined examinee attributes and to provide feedback after the mock SOE.
Affiliation(s)
- Lauryn R. Rochlen
- Clinical Associate Professor, Department of Anesthesiology, University of Michigan Medical School
- Derek T. Woodrum
- Clinical Associate Professor, Department of Anesthesiology, University of Michigan Medical School
- Lara Zisblatt
- Education Specialist, Department of Anesthesiology, University of Michigan Medical School
4. Residency program directors' perceptions about the impact of the American Board of Anesthesiology's Objective Structured Clinical Examination. J Clin Anesth 2021; 75:110439. PMID: 34293669. DOI: 10.1016/j.jclinane.2021.110439.
Abstract
STUDY OBJECTIVE To describe how the introduction of an Objective Structured Clinical Examination (OSCE) by the American Board of Anesthesiology (ABA) to its initial certification impacted anesthesiology residencies in the United States. DESIGN AND SETTING A sequential mixed-methods design with focus groups and an online survey among program directors of Accreditation Council for Graduate Medical Education-accredited anesthesiology residencies. PATIENTS No patients were included. INTERVENTION None. MEASUREMENTS A convenience sample of 34 program directors were interviewed to understand their perceptions of the ABA OSCE. Subsequently, an online survey, based on major themes identified from the focus groups, was sent to all 156 program directors. MAIN RESULTS Several themes emerged from the focus group discussions: (1) a mock OSCE was the most common method of preparing residents for the ABA OSCE; (2) the ABA OSCE led to changes in residency curricula; and (3) the ABA OSCE was viewed as assessing communication and professionalism skills well, with less agreement on how well it assessed technical skills. Survey results from 87 program directors (response rate = 56%) were mostly consistent with the themes generated by the focus groups. Eighty-one of 87 programs (93%) specifically prepared their residents for the ABA OSCE. Fifty-two of 81 program directors (64%) reported that the introduction of the ABA OSCE led to curricular changes. Of 79 program directors, 45 (57%) agreed that the ABA OSCE assesses skills essential to anesthesiology practice, and 40 (51%) considered that it added value to board certification. CONCLUSIONS The introduction of the OSCE by the ABA for board certification has affected the curriculum of many residencies. Approximately 3 in 5 program directors perceived that the ABA OSCE measures skills essential to anesthesiologists' practice. Future studies should assess residency graduates' perspectives on the usefulness of both mock OSCE preparation and the ABA OSCE, and whether ABA OSCE performance predicts future clinical practice.
5. Martinelli SM, Chen F, Isaak RS, Huffmyer JL, Neves SE, Mitchell JD. Educating Anesthesiologists During the Coronavirus Disease 2019 Pandemic and Beyond. Anesth Analg 2021; 132:585-593. PMID: 33201006. DOI: 10.1213/ane.0000000000005333.
Abstract
The coronavirus disease 2019 (COVID-19) pandemic has altered approaches to anesthesiology education by shifting educational paradigms. This vision article discusses pre-COVID-19 educational methodologies and best evidence, adaptations required under COVID-19, and evidence for these modifications, and suggests future directions for anesthesiology education. Learning management systems provide structure to online learning. They have been increasingly utilized to improve access to didactic materials asynchronously. Despite some historic reservations, the pandemic has necessitated a rapid uptake across programs. Commercially available systems offer a wide range of peer-reviewed curricular options. The flipped classroom promotes learning foundational knowledge before teaching sessions, with a focus on application during structured didactics. There is growing evidence that this approach is preferred by learners and may increase knowledge gain. The flipped classroom works well with learning management systems to disseminate focused preclass work. Care must be taken to keep virtual sessions interactive. Simulation, already used in anesthesiology, has been critical in preparation for the care of COVID-19 patients. Multidisciplinary, in situ simulations allow for rapid dissemination of new team workflows. Physical distancing and reduced availability of providers have required more sessions. Early pandemic decreases in operating volumes have allowed for this; future planning will have to incorporate smaller groups, sanitizing of equipment, and attention to use of personal protective equipment. Effective technical skills training requires instruction to mastery levels, use of deliberate practice, and high-quality feedback. Reduced sizes of skill-training workshops and approaches for feedback that are not in-person will be required. Mock oral and objective structured clinical examinations (OSCEs) allow for training and assessment of competencies often not addressed otherwise. They provide formative and summative data and objective measurements of Accreditation Council for Graduate Medical Education (ACGME) milestones. They also allow for preparation for the American Board of Anesthesiology (ABA) APPLIED examination. Adaptations to teleconferencing or videoconferencing can allow for continued use. Benefits of teaching in this new era include enhanced availability of asynchronous learning and opportunities to apply universal, expert-driven curricula. Burdens include decreased social interactions and a potential need for an increased number of smaller, live sessions. Acquiring learning management systems and holding more frequent simulation and skills sessions with fewer learners may increase cost. With the increasing dependency on multimedia and technology support for teaching and learning, one important focus of educational research is the development and evaluation of strategies that reduce extraneous processing and manage essential and generative processing in virtual learning environments. Collaboration to identify and implement best practices has the potential to improve education for all learners.
Affiliation(s)
- Susan M Martinelli
- Department of Anesthesiology, The University of North Carolina, Chapel Hill, North Carolina
- Fei Chen
- Department of Anesthesiology, The University of North Carolina, Chapel Hill, North Carolina
- Robert S Isaak
- Department of Anesthesiology, The University of North Carolina, Chapel Hill, North Carolina
- Julie L Huffmyer
- Department of Anesthesiology, University of Virginia, Charlottesville, Virginia
- Sara E Neves
- Department of Anesthesia, Critical Care and Pain Medicine, Beth Israel Deaconess Medical Center, Boston, Massachusetts
- John D Mitchell
- Department of Anesthesia, Critical Care and Pain Medicine, Beth Israel Deaconess Medical Center, Boston, Massachusetts
6. Tanaka P, Park YS, Liu L, Varner C, Kumar AH, Sandhu C, Yumul R, McCartney KT, Spilka J, Macario A. Assessment Scores of a Mock Objective Structured Clinical Examination Administered to 99 Anesthesiology Residents at 8 Institutions. Anesth Analg 2020; 131:613-621. PMID: 32149757. DOI: 10.1213/ane.0000000000004705.
Abstract
BACKGROUND Objective Structured Clinical Examinations (OSCEs) are used in a variety of high-stakes examinations. The primary goal of this study was to examine factors influencing the variability of assessment scores for mock OSCEs administered to senior anesthesiology residents. METHODS Using the American Board of Anesthesiology (ABA) OSCE Content Outline as a blueprint, scenarios were developed for 4 of the ABA skill types: (1) informed consent, (2) treatment options, (3) interpretation of echocardiograms, and (4) application of ultrasonography. Eight residency programs administered these 4 OSCEs to CA3 residents during a 1-day formative session. A global score and checklist items were used for scoring by faculty raters. We used a statistical framework called generalizability theory, or G-theory, to estimate the sources of variation (or facets) and the reliability (ie, reproducibility) of the OSCE performance scores. Reliability provides a metric on the consistency or reproducibility of learner performance as measured through the assessment. RESULTS Of the 115 total eligible senior residents, 99 participated in the OSCE; the remaining residents were unavailable. Overall, residents correctly performed 84% (standard deviation [SD] 16%, range 38%-100%) of the 36 total checklist items for the 4 OSCEs. On global scoring, the pass rate was 71% for the informed consent station, 97% for treatment options, 66% for interpretation of echocardiograms, and 72% for application of ultrasound. The estimate of reliability expressing the reproducibility of examinee rankings equaled 0.56 (95% confidence interval [CI], 0.49-0.63), which is reasonable for normative assessments that aim to compare a resident's performance relative to other residents, because over half of the observed variation in total scores is due to variation in examinee ability. Phi coefficient reliability of 0.42 (95% CI, 0.35-0.50) indicates that criterion-based judgments (eg, pass-fail status) cannot be made. Phi expresses the absolute consistency of a score and reflects how closely the assessment is likely to reproduce an examinee's final score. Overall, the greatest variance (14.6%) was due to the person-by-item-by-station (3-way) interaction, indicating that specific residents did well on some items but poorly on others. The variance due to residency programs across case items was also high (11.2%), suggesting moderate variability among residency programs in residents' OSCE performance. CONCLUSIONS Since many residency programs aim to develop their own mock OSCEs, this study provides evidence that it is possible for programs to create a meaningful mock OSCE experience that is statistically reliable for separating resident performance.
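The relative (G) and absolute (Phi) coefficients reported in this abstract come from partitioning score variance into components. As an illustration only, the sketch below works a minimal one-facet persons x raters G-study, not the authors' multi-facet person x item x station model; the function name and data are hypothetical:

```python
import numpy as np

def g_study_p_x_r(scores):
    """One-facet G-study for a fully crossed persons x raters design.

    scores: 2-D array, rows = persons (examinees), columns = raters.
    Returns the variance components and the G (relative) and Phi
    (absolute) coefficients for the observed number of raters.
    """
    n_p, n_r = scores.shape
    grand = scores.mean()
    person_means = scores.mean(axis=1)
    rater_means = scores.mean(axis=0)

    # ANOVA sums of squares for the crossed design
    ss_p = n_r * ((person_means - grand) ** 2).sum()
    ss_r = n_p * ((rater_means - grand) ** 2).sum()
    ss_total = ((scores - grand) ** 2).sum()
    ss_pr = ss_total - ss_p - ss_r  # person x rater interaction + error

    ms_p = ss_p / (n_p - 1)
    ms_r = ss_r / (n_r - 1)
    ms_pr = ss_pr / ((n_p - 1) * (n_r - 1))

    # Expected-mean-square equations give the variance components
    var_pr = ms_pr
    var_p = max((ms_p - ms_pr) / n_r, 0.0)
    var_r = max((ms_r - ms_pr) / n_p, 0.0)

    # Relative (G) and absolute (Phi) reliability for n_r raters;
    # Phi also charges the rater main effect, so Phi <= G.
    g_coef = var_p / (var_p + var_pr / n_r)
    phi_coef = var_p / (var_p + (var_r + var_pr) / n_r)
    return var_p, var_r, var_pr, g_coef, phi_coef
```

Because Phi counts systematic rater (or station) differences as error while G does not, Phi is never larger than G, which matches the abstract's pattern of 0.42 versus 0.56.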
Affiliation(s)
- Pedro Tanaka
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University School of Medicine, Stanford, California
- Yoon Soo Park
- Department of Medical Education, University of Illinois at Chicago College of Medicine, Chicago, Illinois
- Linda Liu
- Department of Anesthesia and Perioperative Care, University of California San Francisco, San Francisco, California
- Chelsia Varner
- Department of Anesthesiology, University of Southern California, Los Angeles, California
- Amanda H Kumar
- Department of Anesthesiology, Duke University School of Medicine, Durham, North Carolina
- Charandip Sandhu
- Department of Anesthesiology, University of California Davis, Davis, California
- Roya Yumul
- Department of Anesthesiology, Cedars Sinai Medical Center, Los Angeles, California
- Kate Tobin McCartney
- Department of Anesthesiology, University of California Irvine, Irvine, California
- Jared Spilka
- Naval Medical Center San Diego, San Diego, California
- Alex Macario
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University School of Medicine, Stanford, California
7. Learners and Luddites in the Twenty-first Century: Bringing Evidence-based Education to Anesthesiology. Anesthesiology 2020; 131:908-928. PMID: 31365369. DOI: 10.1097/aln.0000000000002827.
Abstract
Anesthesiologists are both teachers and learners and alternate between these roles throughout their careers. However, few anesthesiologists have formal training in the methodologies and theories of education. Many anesthesiology educators teach as they were taught and may not be taking advantage of current evidence in education to guide and optimize the way they teach and learn. This review describes the most up-to-date evidence in education for teaching knowledge, procedural skills, and professionalism. Methods such as active learning, spaced learning, interleaving, retrieval practice, e-learning, experiential learning, and the use of cognitive aids are described. We made an effort to illustrate the best available evidence supporting educational practices while recognizing the inherent challenges in medical education research. Just as implementing evidence in clinical practice aims to improve patient outcomes, implementing an evidence-based approach to anesthesiology education may improve learning outcomes.
8. Warner DO, Isaak RS, Peterson-Layne C, Lien CA, Sun H, Menzies AO, Cole DJ, Dainer RJ, Fahy BG, Macario A, Suresh S, Harman AE. Development of an Objective Structured Clinical Examination as a Component of Assessment for Initial Board Certification in Anesthesiology. Anesth Analg 2020; 130:258-264. PMID: 31688077. DOI: 10.1213/ane.0000000000004496.
Abstract
With its first administration of an Objective Structured Clinical Examination (OSCE) in 2018, the American Board of Anesthesiology (ABA) became the first US medical specialty certifying board to incorporate this type of assessment into its high-stakes certification examination system. The fundamental rationale for the ABA's introduction of the OSCE is to include an assessment that allows candidates for board certification to demonstrate what they actually "do" in domains relevant to clinical practice. Inherent in this rationale is that the OSCE will capture competencies not well assessed in the current written and oral examinations, competencies that will allow the ABA to judge more properly whether a candidate meets the standards expected for board certification. This special article describes the ABA's journey from initial conceptualization through the first administration of the OSCE, including the format of the OSCE, the process for scenario development, the standardized patient program that supports OSCE administration, examiner training, scoring, and future assessment of the reliability, validity, and impact of the OSCE. This information will be beneficial both to those involved in the initial certification process, such as residency graduate candidates and program directors, and to others contemplating the use of high-stakes summative OSCE assessments.
Affiliation(s)
- David O Warner
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
- Robert S Isaak
- Department of Anesthesiology, The University of North Carolina at Chapel Hill, Chapel Hill, North Carolina
- Cynthia A Lien
- Department of Anesthesiology, Medical College of Wisconsin, Milwaukee, Wisconsin
- Huaping Sun
- The American Board of Anesthesiology, Raleigh, North Carolina
- Anna O Menzies
- The American Board of Anesthesiology, Raleigh, North Carolina
- Daniel J Cole
- Department of Anesthesiology and Perioperative Medicine, University of California, Los Angeles, Los Angeles, California
- Rupa J Dainer
- Department of Ambulatory Surgery, Pediatric Specialists of Virginia, Fairfax, Virginia
- Brenda G Fahy
- Department of Anesthesiology, University of Florida, Gainesville, Florida
- Alex Macario
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University, Stanford, California
- Santhanam Suresh
- Department of Pediatric Anesthesiology, Ann & Robert H. Lurie Children's Hospital of Chicago, Northwestern University, Chicago, Illinois
- Ann E Harman
- The American Board of Anesthesiology, Raleigh, North Carolina
9. Fleming M, McMullen M, Beesley T, Egan R, Field S. Simulation-based evaluation of anaesthesia residents: optimising resource use in a competency-based assessment framework. BMJ Simul Technol Enhanc Learn 2019; 6:339-343. DOI: 10.1136/bmjstel-2019-000504.
Abstract
INTRODUCTION Simulation training in anaesthesiology bridges the gap between theory and practice by allowing trainees to engage in high-stakes clinical training without jeopardising patient safety. However, implementing simulation-based assessments within an academic programme is highly resource intensive, and the optimal number of scenarios and faculty required for accurate competency-based assessment remains to be determined. Using a generalisability study methodology, we examine the structure of simulation-based assessment in regard to the minimal number of scenarios and faculty assessors required for optimal competency-based assessments. METHODS Seventeen anaesthesiology residents each performed four simulations, which were assessed by two expert raters. Generalisability analysis (G-analysis) was used to estimate the extent of variance attributable to (1) the scenarios, (2) the assessors and (3) the participants. The D-coefficient and the G-coefficient were used to determine accuracy targets and to predict the impact of adjusting the number of scenarios or faculty assessors. RESULTS We showed that multivariate G-analysis can be used to estimate the number of simulations and raters required to optimise assessment. In this study, the optimal balance was obtained when four scenarios were assessed by two simulation experts. CONCLUSION Simulation-based assessment is becoming an increasingly important tool for assessing the competency of medical residents in conjunction with other assessment methods. G-analysis can be used to assist in planning for optimal resource use and cost-efficacy.
10. Rochlen LR, Tarnal V, Vance JL, Alderink E, Bernstein WK. Modules for the Technical Skills Section of the OSCE Component of the American Board of Anesthesiology APPLIED Examination. MedEdPORTAL 2019; 15:10820. PMID: 31139739. PMCID: PMC6507923. DOI: 10.15766/mep_2374-8265.10820.
Abstract
INTRODUCTION To assess communication and professionalism, as well as technical skills related to patient care, the American Board of Anesthesiology (ABA) has begun administering an Objective Structured Clinical Examination (OSCE) portion of the APPLIED Examination in addition to the Standard Oral Examination component. METHODS We created video modules and a curriculum to prepare anesthesiology residents for the Interpretation of Monitors and Interpretation of Echocardiography components of the OSCE. The modules can be used individually by trainees or included as part of an OSCE workshop led by faculty educators, with seven individual stations matching the content of the actual ABA examination. These modules are recommended for all levels of anesthesiology trainees so that they can gain exposure to the format and the fast pace of the examination. RESULTS Sixty-six junior and senior anesthesiology residents, fellows, and junior faculty successfully participated in these modules. Seventy-three percent of the participants agreed that after completing these modules, they had a good understanding of the Interpretation of Monitors and Interpretation of Echocardiography technical skills stations. More than 90% of participants reported that the modules were useful, and more than 70% reported that they now felt prepared for these stations of the OSCE. DISCUSSION Developing technical skills stations for deliberate practice and preparation for the ABA OSCE is resource intensive. Finding time and faculty to facilitate OSCE preparation is also challenging. With the video modules and scripts included in this publication, residents can practice independently or as part of a larger preparation course.
Affiliation(s)
- Lauryn R. Rochlen
- Clinical Associate Professor, Department of Anesthesiology, University of Michigan Medical School
- Vijay Tarnal
- Clinical Assistant Professor, Department of Anesthesiology, University of Michigan Medical School
- Jennifer L. Vance
- Clinical Assistant Professor, Department of Anesthesiology, University of Michigan Medical School
- Erik Alderink
- Technology Manager for Simulation and Experiential Learning, Office of Medical Student Education, University of Michigan Medical School
- Wendy K. Bernstein
- Professor, Department of Anesthesiology and Perioperative Medicine, University of Rochester Medical Center
- Vice Chairman of Education, Department of Anesthesiology and Perioperative Medicine, University of Rochester Medical Center
11. Tanaka P, Adriano A, Ngai L, Park YS, Marty A, Wakatsuki S, Brun C, Harrison K, Bushell E, Thomsen JLD, Wen L, Painter C, Chen M, Macario A. Development of an Objective Structured Clinical Examination Using the American Board of Anesthesiology Content Outline for the Objective Structured Clinical Examination Component of the APPLIED Certification Examination. A A Pract 2018; 11:193-197. DOI: 10.1213/xaa.0000000000000779.
12.
13. Validity of Simulation-Based Assessment for Accreditation Council for Graduate Medical Education Milestone Achievement. Simul Healthc 2018; 13:201-210. DOI: 10.1097/sih.0000000000000285.
14. Isaak R, Stiegler M, Hobbs G, Martinelli SM, Zvara D, Arora H, Chen F. Comparing Real-time Versus Delayed Video Assessments for Evaluating ACGME Sub-competency Milestones in Simulated Patient Care Environments. Cureus 2018; 10:e2267. PMID: 29736352. PMCID: PMC5935426. DOI: 10.7759/cureus.2267.
Abstract
Background Simulation is an effective method for creating objective summative assessments of resident trainees. Real-time assessment (RTA) in simulated patient care environments is logistically challenging, especially when evaluating a large group of residents in multiple simulation scenarios. To date, there is very little data comparing RTA with delayed (hours, days, or weeks later) video-based assessment (DA) for simulation-based assessments of Accreditation Council for Graduate Medical Education (ACGME) sub-competency milestones. We hypothesized that sub-competency milestone evaluation scores obtained from DA, via audio-video recordings, are equivalent to the scores obtained from RTA. Methods Forty-one anesthesiology residents were evaluated in three separate simulated scenarios, representing different ACGME sub-competency milestones. All scenarios had one faculty member perform RTA and two additional faculty members perform DA. Subsequently, the scores generated by RTA were compared with the average scores generated by DA. Variance component analysis was conducted to assess the amount of variation in scores attributable to residents and raters. Results Paired t-tests showed no significant difference in scores between RTA and averaged DA for all cases. Cases 1, 2, and 3 showed an intraclass correlation coefficient (ICC) of 0.67, 0.85, and 0.50 for agreement between RTA scores and averaged DA scores, respectively. Analysis of variance of the scores assigned by the three raters showed a small proportion of variance attributable to raters (4% to 15%). Conclusions The results demonstrate that video-based delayed assessment is as reliable as real-time assessment, as both assessment methods yielded comparable scores. Based on a department’s needs or logistical constraints, our findings support the use of either real-time or delayed video evaluation for assessing milestones in a simulated patient care environment.
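The agreement between real-time and averaged delayed scores reported above is an intraclass correlation. As a generic illustration (not the authors' exact software), a single-measures, absolute-agreement ICC, often written ICC(A,1) or ICC(2,1), can be computed from two-way ANOVA mean squares; the function name and data below are hypothetical:

```python
import numpy as np

def icc_a1(x, y):
    """Single-measures, absolute-agreement ICC for two sets of ratings
    of the same subjects (e.g. real-time vs averaged delayed scores)."""
    scores = np.column_stack([x, y]).astype(float)
    n, k = scores.shape  # subjects x raters (k = 2 here)
    grand = scores.mean()
    # Two-way ANOVA mean squares: subjects (rows), raters (columns), error
    ms_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    ss_err = (((scores - grand) ** 2).sum()
              - (n - 1) * ms_rows - (k - 1) * ms_cols)
    ms_err = ss_err / ((n - 1) * (k - 1))
    # Absolute agreement: systematic rater differences count as error
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
```

Because this form penalises systematic offsets, a rater who scores every resident a constant amount higher lowers the ICC below 1 even though the rankings agree perfectly.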
Affiliation(s)
- Robert Isaak
- Department of Anesthesiology, University of North Carolina School of Medicine
- Marjorie Stiegler
- Department of Anesthesiology, University of North Carolina School of Medicine
- Gene Hobbs
- Department of Neurosurgery, University of North Carolina School of Medicine
- Susan M Martinelli
- Department of Anesthesiology, University of North Carolina School of Medicine
- David Zvara
- Department of Anesthesiology, University of North Carolina School of Medicine
- Harendra Arora
- Department of Anesthesiology, University of North Carolina School of Medicine
- Fei Chen
- Department of Anesthesiology, University of North Carolina School of Medicine