1
Dao AT, Garcia MM, Correa R, Gay LJ, Wininger DA, Sweet M, Luther VP, Chow TM, Harper W, Lai CJ. AAIM Recommendations to Promote Equity and Inclusion in the Internal Medicine Residency Interview Process. Am J Med 2022;135:1509-1516.e1. PMID: 35981650; PMCID: PMC9376147; DOI: 10.1016/j.amjmed.2022.08.001.
Affiliation(s)
- Anthony T Dao: School of Medicine, Washington University, St. Louis, MO
- Maria M Garcia: Chan Medical School, University of Massachusetts, Boston
- Vera P Luther: School of Medicine, Wake Forest University, Winston-Salem, NC
- Timothy M Chow: Morsani College of Medicine, University of South Florida, Tampa
- Cindy J Lai: Morsani College of Medicine, University of South Florida, Tampa; School of Medicine, University of California, San Francisco
2
Hughes RH, Kleinschmidt S, Sheng AY. Using structured interviews to reduce bias in emergency medicine residency recruitment: Worth a second look. AEM Educ Train 2021;5:S130-S134. PMID: 34616987; PMCID: PMC8480396; DOI: 10.1002/aet2.10562.
Affiliation(s)
- Alexander Y Sheng: Department of Emergency Medicine, Boston Medical Center, Boston, MA, USA; Boston University School of Medicine, Boston, MA, USA
3
Schnapp BH, Alvarez A, Bianchi R, Caretta-Weyer H, Jewell C, Kalantari A, Lee E, Miller D, Quinn A. Curated collection for clinician educators: Six key papers on residency recruitment. AEM Educ Train 2021;5:e10597. PMID: 33969251; PMCID: PMC8086575; DOI: 10.1002/aet2.10597.
Abstract
INTRODUCTION: All emergency medicine (EM) residency programs must recruit new medical school graduates each year. The process is often overwhelming, with each program receiving far more applicants than available positions. We searched for evidence-based best practices to guide residency programs in screening, interviewing, and ranking applicants to ensure a high-performing and diverse residency class.
METHODS: A literature search was conducted on the topic of residency recruitment, utilizing a call on social media as well as multiple databases. After identifying relevant articles, we performed a modified Delphi process in three rounds, utilizing junior educators as well as more senior faculty.
RESULTS: We identified 51 relevant articles on the topic of residency recruitment. The Delphi process yielded six articles that were deemed most highly relevant over the three rounds. Transparency with selection criteria, holistic application review, standardized letters of evaluation, and blinding applicant files for interviewers were among noted best practices.
CONCLUSIONS: Well-supported evidence-based practices exist for residency recruitment, and programs may benefit from understanding which common recruitment practices offer the most value. The articles discussed here provide a foundation for faculty looking to improve their program's recruiting practices.
Affiliation(s)
- Al'ai Alvarez: Department of Emergency Medicine, Stanford University, Stanford, California, USA
- Riccardo Bianchi: Department of Physiology and Pharmacology, SUNY Downstate Health Sciences University, New York, New York, USA
- Corlin Jewell: Department of Emergency Medicine, University of Wisconsin, Madison, Wisconsin, USA
- Annahieta Kalantari: Department of Emergency Medicine, Milton S Hershey Medical Center, Penn State Health, Hershey, Pennsylvania, USA
- Eric Lee: Department of Emergency Medicine, Maimonides Medical Center, Brooklyn, New York, USA
- Danielle Miller: Department of Emergency Medicine, Stanford University, Stanford, California, USA
- Antonia Quinn: SUNY Downstate Health Sciences University College of Medicine, New York, New York, USA; Department of Emergency Medicine, SUNY Downstate, Brooklyn, New York, USA
4
Yang A, Gilani C, Saadat S, Murphy L, Toohey S, Boysen-Osborn M. Which Applicant Factors Predict Success in Emergency Medicine Training Programs? A Scoping Review. AEM Educ Train 2020;4:191-201. PMID: 32704588; PMCID: PMC7369487; DOI: 10.1002/aet2.10411.
Abstract
BACKGROUND: Program directors (PDs) in emergency medicine (EM) receive an abundance of applications for very few residency training spots. It is unclear which selection strategies will yield the most successful residents. Many authors have attempted to determine which items in an applicant's file predict future performance in EM.
OBJECTIVES: The purpose of this scoping review is to examine the breadth of evidence related to the predictive value of selection factors for performance in EM residency.
METHODS: The authors systematically searched four databases and websites for peer-reviewed and gray literature related to EM admissions published between 1992 and February 2019. Two reviewers screened titles and abstracts for articles that met the inclusion criteria, according to the scoping review study protocol. The authors included studies if they specifically examined selection factors and whether those factors predicted performance in EM residency training in the United States.
RESULTS: After screening 23,243 records, the authors selected 60 for full review. From these, the authors selected 15 published manuscripts, one unpublished manuscript, and 11 abstracts for inclusion in the review. These studies examined the United States Medical Licensing Examination (USMLE), Standardized Letters of Evaluation, Medical Student Performance Evaluation, medical school attended, clerkship grades, membership in honor societies, and other less common factors and their association with future EM residency training performance.
CONCLUSIONS: The USMLE was the most common factor studied. It unreliably predicts clinical performance, but more reliably predicts performance on licensing examinations. All other factors were less commonly studied and, similar to the USMLE, yielded mixed results.
Affiliation(s)
- Allen Yang: Department of Emergency Medicine, University of California, Irvine, CA
- Chris Gilani: Department of Emergency Medicine, University of California, Irvine, CA
- Soheil Saadat: Department of Emergency Medicine, University of California, Irvine, CA
- Linda Murphy: Health Science Library, University of California, Irvine, Orange, CA
- Shannon Toohey: Department of Emergency Medicine, University of California, Irvine, CA
- Megan Boysen-Osborn: Department of Emergency Medicine, University of California, Irvine, CA; School of Medicine, University of California, Irvine, CA
5
Murphy JA, Pattin AJ, Sarver JG, Seegert ML, Mertz S, Blashford E. Interviewer perceptions during the implementation of the multiple mini-interview model at a school of pharmacy. Curr Pharm Teach Learn 2020;12:864-871. PMID: 32540049; DOI: 10.1016/j.cptl.2020.02.003.
Abstract
INTRODUCTION: Studies reveal positive interviewer perceptions of the multiple mini-interview (MMI) upon MMI completion. No studies evaluate change in interviewer perceptions during MMI implementation. The objective was to evaluate the change in interviewer perceptions during the implementation of the MMI model at the University of Toledo College of Pharmacy and Pharmaceutical Sciences.
METHODS: Interviewers (faculty volunteers, preceptors, student pharmacists) were eligible for inclusion in the prospective cohort. Consenting individuals (1) completed a pre-MMI training survey regarding perceptions of MMI, (2) participated in a 90-minute MMI training program (PowerPoint presentation and review of videos demonstrating MMI practices), (3) completed a post-MMI training survey, and (4) after interviews, completed a post-interview survey. The six Likert-scale MMI perception questions were independently analyzed for changes in the rank response across the three survey time points using Friedman's nonparametric repeated-measures analysis. Each question was evaluated for all respondents together, and for nine different respondent subgroups. The overall criterion for significance was α = 0.05 for each question, with Bonferroni correction for the ten overall comparisons made for each question.
RESULTS: Thirty-two interviewers participated (20 faculty members, five preceptors, and seven student pharmacists). From the pre-MMI training survey through the post-interview survey, interviewers gained confidence in their ability to explain the rationale behind the MMI model, were more likely to agree that six minutes was adequate time to assess an applicant, and believed MMI provides a fair assessment of an applicant's noncognitive attributes.
CONCLUSIONS: After interviewers received training and gained experience with MMI, perceptions of MMI improved.
Affiliation(s)
- Julie A Murphy: University of Toledo College of Pharmacy and Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Anthony J Pattin: University of Toledo College of Pharmacy and Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Jeffrey G Sarver: University of Toledo College of Pharmacy and Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Michelle L Seegert: University of Toledo College of Pharmacy and Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Sean Mertz: PGY-2 Critical Care Pharmacy Resident, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH 44195, United States
- Ethan Blashford: Legacy Health, 1919 NW Lovejoy St, Portland, OR 97209, United States
6
Callwood A, Groothuizen JE, Lemanska A, Allan H. The predictive validity of Multiple Mini Interviews (MMIs) in nursing and midwifery programmes: Year three findings from a cross-discipline cohort study. Nurse Educ Today 2020;88:104320. PMID: 32193067; DOI: 10.1016/j.nedt.2019.104320.
Abstract
BACKGROUND: Education literature worldwide is replete with studies evaluating the effectiveness of Multiple Mini Interviews (MMIs) in admissions to medicine, but <1% of published studies have been conducted in selection to nursing and midwifery programmes.
OBJECTIVES: To examine the predictive validity of MMIs using end of programme clinical and academic performance indicators of pre-registration adult, child, and mental health nursing and midwifery students.
DESIGN AND SETTING: A cross-sectional cohort study at one university in the United Kingdom.
PARTICIPANTS: A non-probability consecutive sampling strategy whereby all applicants to the September 2015 pre-registration adult, child, mental health nursing and midwifery programmes were invited to participate. Of the 354 students who commenced year one, 225 (64%) completed their three-year programme and agreed to take part (adult 120, child 32, mental health nursing 30 and midwifery 43).
METHODS: All applicants were interviewed using MMIs, with six- and seven-station, four-minute models deployed in nursing and midwifery student selection respectively. Associations between MMI scores and the cross-discipline programme performance indicators available for each student at this university at the end of year three (clinical practice, assessed by mentors, and academic attainment, the dissertation mark) were explored using multiple linear regression adjusting for applicant age, academic entry level, discipline and number of MMI stations.
RESULTS: In the adjusted models, students with higher admissions MMI scores (at six and seven stations) performed better in clinical practice (p < 0.001) but not in academic attainment (p = 0.122) at the end of their three-year programme.
CONCLUSION: These findings provide the first report of the predictive validity of MMIs for performance in clinical practice using six- and seven-station models in nursing and midwifery programmes. Further evidence is required from both clinical and academic perspectives from larger, multi-site evaluations.
Affiliation(s)
- Alison Callwood: School of Health Sciences, University of Surrey, Guildford, UK
- Helen Allan: Centre for Critical Research in Nursing and Midwifery, School of Health and Education, Middlesex University, UK
7
Structured Interviewer Training for the Implementation of Standardized Behavioral Questions in Medical School Admissions. Simul Healthc 2020;15:128-132. PMID: 32235263; DOI: 10.1097/sih.0000000000000415.
8
Crawford SB, Monks SM, Wells RN. Virtual Reality as an Interview Technique in Evaluation of Emergency Medicine Applicants. AEM Educ Train 2018;2:328-333. PMID: 30386843; PMCID: PMC6194035; DOI: 10.1002/aet2.10113.
Abstract
NEED FOR INNOVATION: Current interviewing strategies and the standardized letter of evaluation may not provide enough insight into preferred resident characteristics. Emergency medicine (EM) residency programs are challenged with identifying trainees who can problem solve, communicate, and work well with fellow health professionals.
BACKGROUND: Structured interviews have previously been used and can help predict success, but candidates have reported a negative impression of their use.
OBJECTIVE OF INNOVATION: This structured virtual reality (VR) interviewing method was designed so that interviewers can observe the communication abilities, subtle personality traits, and teamwork skills of applicants interviewed at an EM residency program.
DEVELOPMENT PROCESS: A consumer VR headset became available and, in combination with an interactive team game, was incorporated into a standardized team-based interview session. This session was designed to allow observation of candidates' communication, problem solving, and teamwork skills.
IMPLEMENTATION PHASE: Surveys were collected to examine the satisfaction of EM residency applicants who participated in this novel standardized interviewing method using a VR headset. After the submission of rank lists, but prior to Match Day, those who interviewed were e-mailed a voluntary, anonymous, and confidential survey asking about their interview experience, specifically about the VR portion. The survey was sent to 102 applicants with 63 responses, for a 62% response rate at the completion of the 2015 to 2016 interview season.
OUTCOMES: Overall study findings suggested that participants had a highly favorable impression of the VR portion of the interview. Specifically, participants reported that this interview technique was appropriate and worthwhile. Additionally, participants attested that the Oculus portion of the interview gave insight into their work ethic, personality, and communication skills and how they work with others.
REFLECTIVE DISCUSSION: The novel interviewing method used in this study allowed interviewers to gain insight beyond that of the paperwork and brief face-to-face interaction. Study findings suggest that interviewees accepted the use of this novel interview method. It has been incorporated into our interview process for three consecutive years.
Affiliation(s)
- Scott B Crawford: Department of Emergency Medicine, Texas Tech University Health Science Center El Paso, El Paso, TX
- Stormy M Monks: Department of Emergency Medicine, Texas Tech University Health Science Center El Paso, El Paso, TX
- Radosveta N Wells: Department of Emergency Medicine, Texas Tech University Health Science Center El Paso, El Paso, TX
9
Callwood A, Jeevaratnam K, Kotronoulas G, Schneider A, Lewis L, Nadarajah VD. Personal domains assessed in multiple mini interviews (MMIs) for healthcare student selection: A narrative synthesis systematic review. Nurse Educ Today 2018;64:56-64. PMID: 29459193; DOI: 10.1016/j.nedt.2018.01.016.
Abstract
OBJECTIVES: To examine the personal domains multiple mini interviews (MMIs) are being designed to assess, explore how they were determined, and contextualise such domains in current and future healthcare student selection processes.
DESIGN: A systematic review of empirical research reporting on MMI model design was conducted from database inception to November 2017.
DATA SOURCES: Twelve electronic bibliographic databases.
REVIEW METHODS: Evidence was extracted from original studies and integrated in a narrative synthesis guided by the PRISMA statement for reporting systematic reviews. Personal domains were clustered into themes using a modified Delphi technique.
RESULTS: A total of 584 articles were screened; 65 unique studies (80 articles) matched our inclusion criteria, of which seven were conducted within nursing/midwifery faculties. Six in 10 studies featured applicants to medical school. Across selection processes, we identified 32 personal domains assessed by MMIs, the most frequent being: communication skills (84%), teamwork/collaboration (70%), and ethical/moral judgement (65%). Domains capturing ability to cope with stressful situations (14%), make decisions (14%), and resolve conflict in the workplace (13%) featured in fewer than ten studies overall. Intra- and inter-disciplinary inconsistencies in domain profiles were noted, as well as differences by entry level. MMIs deployed in nursing and midwifery assessed compassion and decision-making more frequently than in all other disciplines. Own programme philosophy and professional body guidance were most frequently cited (~50%) as sources for personal domains; a blueprinting process was reported in only 8% of studies.
CONCLUSIONS: Nursing, midwifery and allied healthcare professionals should develop their theoretical frameworks for MMIs to ensure they are evidence-based and fit-for-purpose. We suggest a re-evaluation of domain priorities to ensure that students who are selected not only have the capacity to offer the highest standards of care provision, but are able to maintain these standards when facing clinical practice and organisational pressures.
Affiliation(s)
- Alison Callwood: School of Health Sciences, University of Surrey, Guildford, Surrey GU2 7XH, UK
- Kamalan Jeevaratnam: School of Veterinary Medicine, University of Surrey, Guildford, Surrey GU2 7XH, UK
10
Roberts C, Khanna P, Rigby L, Bartle E, Llewellyn A, Gustavs J, Newton L, Newcombe JP, Davies M, Thistlethwaite J, Lynam J. Utility of selection methods for specialist medical training: A BEME (Best Evidence Medical Education) systematic review: BEME Guide No. 45. Med Teach 2018;40:3-19. PMID: 28847200; DOI: 10.1080/0142159X.2017.1367375.
Abstract
BACKGROUND: Selection into specialty training is a high-stakes and resource-intensive process. While substantial literature exists on selection into medical schools, and there are individual studies in postgraduate settings, there seems to be a paucity of evidence concerning selection systems and the utility of selection tools in postgraduate training environments.
AIM: To explore, analyze and synthesize the evidence related to selection into postgraduate medical specialty training.
METHOD: Core bibliographic databases including PubMed, Ovid Medline, Embase, CINAHL, ERIC and PsycINFO were searched, and a total of 2640 abstracts were retrieved. After removing duplicates and screening against the inclusion criteria, 202 full papers were coded, of which 116 were included.
RESULTS: Gaps in underlying selection frameworks were illuminated. Frameworks defined by locally derived selection criteria and heavily weighted toward academic parameters seem to be giving way to the evidencing of competency-based selection approaches in some settings. Regarding selection tools, we found favorable psychometric evidence for multiple mini-interviews, situational judgment tests and clinical problem-solving tests, although the bulk of evidence was mostly limited to the United Kingdom. The evidence around the robustness of curriculum vitae, letters of recommendation and personal statements was equivocal. The findings on the predictors of past performance were limited to academic criteria, with a paucity of long-term evaluations. The evidence around nonacademic criteria was inadequate to make an informed judgment.
CONCLUSIONS: While much has been gained in understanding the utility of individual selection methods, the evidence around many of them is equivocal, and the underlying theoretical and conceptual frameworks for designing holistic and equitable selection systems are yet to be developed.
Affiliation(s)
- Chris Roberts: Primary Care and Medical Education, Sydney Medical School, University of Sydney, New South Wales, Australia
- Priya Khanna: The Royal Australasian College of Physicians, New South Wales, Australia
- Louise Rigby: Health Education and Training Institute, New South Wales, Australia
- Emma Bartle: School of Dentistry, University of Queensland, Queensland, Australia
- Anthony Llewellyn: Hunter New England Local Health District, New Lambton, Australia; Health Education and Training Institute, University of Newcastle, Newcastle, Australia
- Julie Gustavs: The Royal Australasian College of Physicians, New South Wales, Australia
- Libby Newton: The Royal Australasian College of Physicians, New South Wales, Australia
- Mark Davies: Royal Brisbane and Women's Hospital, Queensland, Australia
- Jill Thistlethwaite: School of Communication, University of Technology Sydney, New South Wales, Australia
- James Lynam: Calvary Mater Newcastle, University of Newcastle, New South Wales, Australia
11
Callwood A, Cooke D, Bolger S, Lemanska A, Allan H. The reliability and validity of multiple mini interviews (MMIs) in values based recruitment to nursing, midwifery and paramedic practice programmes: Findings from an evaluation study. Int J Nurs Stud 2018;77:138-144. DOI: 10.1016/j.ijnurstu.2017.10.003.
12
Heitz CR, Coates W, Farrell SE, Fisher J, Juve AM, Yarris LM. Critical Appraisal of Emergency Medicine Educational Research: The Best Publications of 2015. Acad Emerg Med 2017;24:1212-1225. PMID: 28857348; DOI: 10.1111/acem.13305.
Abstract
OBJECTIVE: The objectives were to critically appraise the medical education research literature of 2015 and review the highest-quality quantitative and qualitative examples.
METHODS: A total of 434 emergency medicine (EM)-related articles were discovered upon a search of ERIC, PsycINFO, PubMed, and Scopus. These were both quantitative and qualitative in nature. All were screened by two of the authors using previously published exclusion criteria, and the remainder were appraised by all authors using a previously published scoring system. The highest scoring articles were then reviewed.
RESULTS: Sixty-one manuscripts were scored; 10 quantitative and two qualitative papers were the highest scoring and are reviewed and summarized in this article.
CONCLUSIONS: This installment in this critical appraisal series reviews 12 of the highest-quality EM-related medical education research manuscripts published in 2015.
Affiliation(s)
- Corey R Heitz: Carilion Clinic/Virginia Tech Carilion School of Medicine, Roanoke, VA
- Wendy Coates: Harbor/University of California Los Angeles Medical Center, Los Angeles, CA
- Jonathan Fisher: Maricopa Medical Center/University of Arizona College of Medicine-Phoenix, Phoenix, AZ
13
Heitz CR, Coates W, Farrell SE, Fisher J, Juve AM, Yarris LM. Critical Appraisal of Emergency Medicine Educational Research: The Best Publications of 2015. AEM Educ Train 2017;1:255-268. PMID: 30051043; PMCID: PMC6001510; DOI: 10.1002/aet2.10063.
14
McArthur TA, Flug JA, Restauri N. Behavioral Interviewing: Integrating ACGME Competency-Based Questions Into the Radiology Resident Selection Process. Curr Probl Diagn Radiol 2017;46:91-94. DOI: 10.1067/j.cpradiol.2016.11.004.
15
Van Dermark JT, Wald DA, Corker JR, Reid DG. Financial Implications of the Emergency Medicine Interview Process. AEM Educ Train 2017;1:60-69. PMID: 30051011; PMCID: PMC6001822; DOI: 10.1002/aet2.10011.
Abstract
BACKGROUND: Emergency medicine (EM) residency interviews are an important, yet costly, process for programs and applicants. The total economic burden of the EM interviewing process is previously unstudied. Graduate medical education funding and student finances are both fragile, shifting sources, which appear to fund most of these economic expenditures.
OBJECTIVES: The total economic impact of the EM interview season is unknown. This study sought to calculate total dollars spent by EM residency programs and senior medical students (M4) during interview season. Potential solutions for reducing this burden are outlined.
METHODS: Institutional review board-approved, piloted e-mail surveys were sent to accredited (Accreditation Council for Graduate Medical Education [ACGME] and American Osteopathic Association [AOA]) EM program directors (PDs) and M4 student members of EMRA. PDs were queried after the 2014-2015 interview season. PD questions included demographics; estimated faculty, resident, and administrative time used; and dollars spent during the 2014-2015 interview season. M4 questions included demographics and dollars spent during the 2015-2016 season. Results were reported using descriptive statistics. Financial data for EM programs were calculated with academic EM faculty, resident, and administrative assistant salaries along with reported hours used during the interview season.
RESULTS: A total of 82 of 223 EM PDs completed the survey, reporting a mean annual cost of $210,649.04 per program to review, screen, and interview applicants based on time spent by faculty, residents, and administrative assistants. A total of 84.6% of EM program costs were due to faculty hours. A total of 180 of 1,425 EM-bound M4 students completed the survey, reporting a mean annual estimate of US$5,065.44 per student to apply and interview. Seventy-two percent of estimated costs were due to airfare and lodging. Loans and credit cards were the top two methods of payment of these interview costs by students. Extrapolating the cost of EM personnel with hours spent, the economic burden of an interview season for EM programs is approximately US$46,974,735.92. M4 students spent US$19,724,823.40 for application fees and interview-related expenses.
CONCLUSIONS: Emergency medicine residency programs and applicants appear to spend over US$66 million per cycle on the interview process. EM residency programs may save resources by reducing faculty hours associated with the interview process and leveraging administrative and resident resources. Creation of regional or national fixed interview locations may also be appropriate. Applicants may reduce travel costs by participating in video interviews, reducing program applications, and attending regionalized interview days. A full conversation among all specialties and organized medicine needs to take place to reform the systems in place to reduce the economic burden on students and residency programs.
Affiliation(s)
- David A Wald: Department of Emergency Medicine, Lewis Katz School of Medicine, Temple University, Philadelphia, PA
- John Robert Corker: Department of Emergency Medicine, University of Texas Southwestern Medical Center/Parkland Healthcare System, Dallas, TX
- David Godley Reid: Department of Emergency Medicine, University of Texas Southwestern Medical Center/Parkland Healthcare System, Dallas, TX
16
|
Hern HG, Trivedi T, Alter HJ, Wills CP. How Prevalent Are Potentially Illegal Questions During Residency Interviews? A Follow-up Study of Applicants to All Specialties in the National Resident Matching Program. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2016; 91:1546-1553. [PMID: 27049540 DOI: 10.1097/acm.0000000000001181] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
PURPOSE To describe the prevalence and effects on applicants of being asked potentially illegal questions during the residency interview process by surveying all residency applicants to all specialties. METHOD The authors surveyed all applicants from U.S. medical schools to residency programs in all specialties in 2012-2013. The survey included questions about the prevalence of potentially illegal questions, applicants' level of comfort with such questions, and whether such questions affected how applicants ranked programs. Descriptive statistics, tests of proportions, t tests, and logistic regression modeling were used to analyze the data. RESULTS Of 21,457 eligible applicants, 10,976 (51.1%) responded to the survey. Overall, 65.9% (7,219/10,967) reported receiving at least one potentially illegal question. More female respondents reported being asked questions about gender (513/5,357 [9.6%] vs. 148/5,098 [2.9%]), marital status (2,895/5,283 [54.8%] vs. 2,592/4,990 [51.9%]), or plans for having children (889/5,241 [17.0%] vs. 521/4,931 [10.6%]) than male respondents (P < .001). Those in surgical specialties were more likely to have received a potentially illegal question than those in nonsurgical specialties (1,908/2,330 [81.9%] vs. 5,311/8,281 [64.1%]). Questions regarding their commitment to the program were reported by 15.5% (1,608/10,378) of respondents. Such potentially illegal questions negatively affected how respondents ranked programs. CONCLUSIONS Two-thirds of applicants reported being asked potentially illegal questions. More women than men reported receiving questions about marital status or family planning. Potentially illegal questions negatively influence how applicants perceive and rank programs. A formal interview code of conduct or interviewer training could help to address these issues.
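The abstract above reports "tests of proportions" without naming the exact test; the gender-question comparison (513/5,357 women vs. 148/5,098 men) can be checked with a standard two-proportion z-test, sketched here with the stdlib only (the pooled-variance form is an assumption, since the authors do not specify their variant):

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled proportion estimate."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability
    return z, p_value

# Women vs. men reporting gender-related questions (counts from the abstract).
z, p = two_proportion_ztest(513, 5357, 148, 5098)
print(f"z = {z:.1f}, p = {p:.2g}")  # z is around 14, so p << .001
```

The resulting z statistic is far beyond the 3.29 threshold for two-sided p < .001, consistent with the reported significance.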
Collapse
Affiliation(s)
- H Gene Hern
- H.G. Hern Jr is vice chair for education, Department of Emergency Medicine, Highland Hospital, Alameda Health System, Oakland, California, and associate clinical professor, Department of Emergency Medicine, University of California, San Francisco, School of Medicine, San Francisco, California. T. Trivedi is an emergency medicine resident, Department of Emergency Medicine, Highland Hospital, Alameda Health System, Oakland, California. H.J. Alter is vice chair for research, Department of Emergency Medicine, Highland Hospital, Alameda Health System, Oakland, California, and associate clinical professor, Department of Emergency Medicine, University of California, San Francisco, School of Medicine, San Francisco, California. C.P. Wills is residency director, Department of Emergency Medicine, Highland Hospital, Alameda Health System, Oakland, California, and associate clinical professor, Department of Emergency Medicine, University of California, San Francisco, School of Medicine, San Francisco, California
| | | | | | | |
Collapse
|
17
|
Hanson MD, Woods NN, Martimianakis MA, Rasasingham R, Kulasegaram K. Multiple independent sampling within medical school admission interviewing: an "intermediate approach". PERSPECTIVES ON MEDICAL EDUCATION 2016; 5:292-299. [PMID: 27638394 PMCID: PMC5035284 DOI: 10.1007/s40037-016-0298-9] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
INTRODUCTION Balancing reliability, resource limitations, and recruitment activities during admission interviews is a challenge for many medical schools. The Modified Personal Interview (MPI) has been shown to have good psychometric properties while being resource efficient for specialized admission interviews. We describe implementation of an MPI adaptation integrating psychometric rigour with resourcing and recruitment goals for larger-scale medical school admission interviewing at the University of Toronto. METHODS The MPI was implemented during the 2013-2014 admission cycle. The MPI uses multiple independent sampling by having applicants interviewed in a circuit of four brief semi-structured interviews. Recruitment is reflected in a longer MPI interviewing time to foster a 'human touch'. Psychometric evaluation included generalizability studies to examine inter-interview reliability and other major sources of error variance. We also evaluated the MPI's impact on applicant recruitment yield and resourcing. RESULTS MPI inter-interview reliability was 0.56. MPI implementation maintained recruitment yield compared with the previous year and required 160 interviewers for 600 applicants, whereas pre-MPI implementation had required 290 interviewers for 587 applicants. MPI score correlated with first-year OSCE performance at 0.30 (p < 0.05). DISCUSSION The MPI achieved a reliability of 0.56 alongside enhanced resource utilization and maintenance of recruitment yield. This 'intermediate approach' may enable broader institutional uptake of integrated multiple independent sampling-based admission interviewing within institution-specific resourcing and recruitment goals.
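The reported inter-interview reliability of 0.56 for a four-station circuit can be related to single-station reliability via the Spearman-Brown prophecy formula, the standard tool for multiple-independent-sampling designs like the MPI. The single-station figure below is back-calculated for illustration, not reported in the abstract:

```python
def spearman_brown(r_single, k):
    """Reliability of the mean of k parallel measurements (Spearman-Brown)."""
    return k * r_single / (1 + (k - 1) * r_single)

# Back out the implied single-station reliability from the 4-station figure of 0.56.
r4 = 0.56
r1 = r4 / (4 - 3 * r4)  # algebraic inversion of Spearman-Brown for k = 4
print(f"Implied single-station reliability: {r1:.2f}")   # about 0.24
print(f"Check, 4 stations: {spearman_brown(r1, 4):.2f}")  # recovers 0.56
```

This also shows why the circuit format matters: a single brief interview at this implied reliability would be far too noisy on its own, while averaging four of them reaches the reported 0.56.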
Collapse
Affiliation(s)
- Mark D Hanson
- Department of Psychiatry, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada.
- Hospital for Sick Children, Toronto, Ontario, Canada.
| | - Nicole N Woods
- The Wilson Centre, University of Toronto, Toronto, Ontario, Canada
| | | | - Raj Rasasingham
- Department of Psychiatry, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada
- Humber River Regional Hospital, Toronto, Ontario, Canada
| | | |
Collapse
|
18
|
Ogunyemi D, Alexander C, Tangchitnob E, Kim DS. Mini Surgical Simulation, Role Play, and Group and Behavioral Interviews in Resident Selection. J Grad Med Educ 2016; 8:410-6. [PMID: 27413446 PMCID: PMC4936861 DOI: 10.4300/jgme-d-15-00203.1] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND A robust selection process is critical to residents' "cultural fit" and success in their program. Traditional selection methods have shortcomings. OBJECTIVE We describe a novel residency interview process for obstetrics-gynecology residents that incorporates behavioral, group, and surgical simulation multiple mini interviews (MMIs). METHODS In 2010, the Cedars-Sinai Medical Center obstetrics-gynecology residency program developed surgical simulation, role play, ethics group interview, and Accreditation Council for Graduate Medical Education competency-based behavioral interview stations. RESULTS From 2010 to 2012, a total of 199 applicants were interviewed, 62 ranked in the top 20, and 18 matched into the program. The MMI scores for interview stations were used in compiling our rank list and were found to adequately differentiate candidates. The mean MMI scores for role play, ethics interview, surgical simulation, and the behavioral interview were statistically significantly higher for the top 20 ranked candidates than for other applicants. Standardized tests minimally correlated with the various interview modalities. Applicants found the interview process acceptable. Implementing these MMI stations increased the total applicant interview time for the day by 15% (from 5.5 to 6.5 hours) and increased face-to-face interview time from 2 to 4 hours. Approximately 42 hours of coordinator time was required for the yearly interview cycle. CONCLUSIONS A multifaceted interview process utilizing behavioral, group, and surgical simulation MMI stations is feasible and acceptable. The approach may decrease subjectivity and reliance on traditional interview methods and facilitate the selection of "compatible" residents into the program.
Collapse
Affiliation(s)
| | | | | | - David Seil Kim
- Corresponding author: David Seil Kim, MD, PhD, MBA, Cedars-Sinai Medical Center, 160 West Tower, 8635 West 3rd Street, Los Angeles, CA 90048, 310.423.2914, fax 310.423.0140,
| |
Collapse
|
19
|
Min AA, Leetch A, Nuño T, Fiorello AB. How well will you FIT? Use of a modified MMI to assess applicants' compatibility with an emergency medicine residency program. MEDICAL EDUCATION ONLINE 2016; 21:29587. [PMID: 26842824 PMCID: PMC4740091 DOI: 10.3402/meo.v21.29587] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/31/2015] [Revised: 12/03/2015] [Accepted: 12/21/2015] [Indexed: 05/11/2023]
Abstract
PURPOSE Emergency medicine residency programs have evaluated the use of Multiple Mini Interviews (MMIs) for applicants. The authors developed an MMI-style method called the Fast Interview Track (FIT) to predict an applicant's 'fit' within an individual residency program. METHODS Applicants meet with up to five residents and are asked one question by each. Residents score the applicant on a Likert scale from 1 to 5 on two questions: 'How well does the applicant think on his/her feet?' and 'How well do you think the applicant will fit in here?' To assess how well these questions predicted a resident's 'fit', current residents scored fellow residents on the same questions, and these scores were compared with the residents' interview FIT scores. A postmatch survey of applicants who did not match at this program solicited their attitudes toward the FIT sessions. RESULTS Among the junior class, the correlation between interview and current scores was significant for both question 1 (rho=0.5192, p=0.03) and question 2 (rho=0.5753, p=0.01). Among seniors, Spearman's rho was statistically significant for question 2 but not for question 1. The chi-square comparison of high scores (4-5) versus low scores (1-3) found a statistically significant association between interview and current scores for interns and juniors. Of the 29 responses to the postmatch survey, 16 (55%) felt FIT sessions provided a good sense of the program's personality and only 6 (21%) disagreed. Nine (31%) felt FIT sessions positively impacted our program's ranking, 11 (38%) were neutral, and only two (7%) reported that FIT sessions negatively impacted their ranking of our program. CONCLUSIONS FIT provided program leadership with a sense of an applicant's 'fit' within this program. Interview day scores correlated with scores received during residency, and most applicants reported a positive experience with FIT sessions. FIT is a useful tool for recruiting applicants who fit with the residency program.
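Spearman's rho, the rank correlation the FIT study reports, is straightforward to compute with the classic 6*sum(d^2) shortcut. A stdlib sketch on hypothetical scores (the data below are illustrative, not the study's; the shortcut assumes no tied ranks, whereas real Likert data would have ties and need tie-corrected ranks as in `scipy.stats.spearmanr`):

```python
def spearman_rho(x, y):
    """Spearman rank correlation via the 6*sum(d^2) shortcut (assumes no ties)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical interview-day vs. in-residency scores (illustrative only).
interview = [1, 2, 3, 4, 5]
current = [2, 1, 4, 3, 5]
print(spearman_rho(interview, current))  # 0.8
```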
Collapse
Affiliation(s)
- Alice A Min
- Department of Emergency Medicine, College of Medicine, The University of Arizona, Tucson, AZ, USA;
| | - Aaron Leetch
- Department of Emergency Medicine and Pediatrics, College of Medicine, The University of Arizona, Tucson, AZ, USA
| | - Tomas Nuño
- Department of Emergency Medicine, College of Medicine, The University of Arizona, Tucson, AZ, USA
| | - Albert B Fiorello
- Department of Emergency Medicine, College of Medicine, The University of Arizona, Tucson, AZ, USA
| |
Collapse
|