1.
Olaf M, Moffett S, Ledford M, Fix M, Smith L. Resource Utilization and Emergency Medicine Advisors' Approach to Video Interview Preparation. Cureus 2021;13:e18504. PMID: 34754664; PMCID: PMC8569643; DOI: 10.7759/cureus.18504.
Abstract
Introduction: The Standardized Video Interview (SVI) was a residency application component, introduced by the Association of American Medical Colleges (AAMC) as a supplement to the existing process, that aimed to measure knowledge of professional behaviors and interpersonal skills. Given its novelty in both aim and execution, little advice or experience was available to inform preparation strategies. We sought to perform a cross-sectional analysis to explore advisors' practices in guiding students' preparation for the SVI.

Methods: An electronic questionnaire was developed and piloted for flow and usability, then distributed via listserv to all members of the Council of Residency Directors in Emergency Medicine (CORD EM), the professional society for emergency medicine educators, whose membership comprises 270 residency programs. Questions were both open- and closed-ended and were therefore analyzed in a mixed-methods fashion.

Results: We received 56 responses from a listserv representing 270 residency programs. Respondents cited personal experience and consensus opinions from national organizations as the primary sources for their advice. The most common resources offered to students were space for completing the SVI (41%) and technical support for completing the SVI (47%). The time committed to student advising specifically for the SVI ranged from zero to 20 hours, and the estimated cost of preparation (time plus resources) ranged from zero to $10,000.

Two individuals reported recommending commercial preparation resources to students.

Conclusion: The SVI was a novel attempt to augment the resident application process. We found variability in the resources and advice offered to students, including broad ranges in time dedicated, the monetary value of resources contributed, and the types of resources utilized. As the global COVID-19 pandemic has inspired a wave of innovation and process change, we present these data for consideration as a snapshot of the variable responses to a single uniform process change.
Affiliation(s)
- Mark Olaf
- Emergency Medicine, Geisinger Commonwealth School of Medicine, Scranton, USA
- Matthew Ledford
- Emergency Medicine, University of Connecticut School of Medicine, Farmington, USA
- Megan Fix
- Emergency Medicine, University of Utah School of Medicine, Salt Lake City, USA
- Liza Smith
- Emergency Medicine, Baystate Medical Center, Springfield, USA
2.
Yang A, Gilani C, Saadat S, Murphy L, Toohey S, Boysen‐Osborn M. Which Applicant Factors Predict Success in Emergency Medicine Training Programs? A Scoping Review. AEM Education and Training 2020;4:191-201. PMID: 32704588; PMCID: PMC7369487; DOI: 10.1002/aet2.10411.
Abstract
Background: Program directors (PDs) in emergency medicine (EM) receive an abundance of applications for very few residency training spots. It is unclear which selection strategies yield the most successful residents. Many authors have attempted to determine which items in an applicant's file predict future performance in EM.

Objectives: The purpose of this scoping review is to examine the breadth of evidence related to the predictive value of selection factors for performance in EM residency.

Methods: The authors systematically searched four databases and websites for peer-reviewed and gray literature related to EM admissions published between 1992 and February 2019. Two reviewers screened titles and abstracts for articles that met the inclusion criteria, according to the scoping review study protocol. Studies were included if they specifically examined selection factors and whether those factors predicted performance in EM residency training in the United States.

Results: After screening 23,243 records, the authors selected 60 for full review. From these, they selected 15 published manuscripts, one unpublished manuscript, and 11 abstracts for inclusion. These studies examined the United States Medical Licensing Examination (USMLE), Standardized Letters of Evaluation, the Medical Student Performance Evaluation, medical school attended, clerkship grades, membership in honor societies, and other less common factors, and their association with future EM residency training performance.

Conclusions: The USMLE was the most commonly studied factor. It unreliably predicts clinical performance but more reliably predicts performance on licensing examinations. All other factors were less commonly studied and, like the USMLE, yielded mixed results.
Affiliation(s)
- Allen Yang
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- Chris Gilani
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- Soheil Saadat
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- Linda Murphy
- Health Science Library Orange, University of California, Irvine, Irvine, CA
- Shannon Toohey
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- Megan Boysen‐Osborn
- Department of Emergency Medicine, University of California, Irvine, Irvine, CA
- School of Medicine, University of California, Irvine, Irvine, CA
3.
Dharssi S, Woreta FA, Boland MV. Ophthalmology Applicant Perceptions of Two Residency Application Services: The San Francisco Match Central Application Service and Electronic Residency Application Service. Journal of Academic Ophthalmology 2020. DOI: 10.1055/s-0040-1717065.
Abstract
Purpose: Given that ophthalmology residency programs are transitioning to include the internship year, either through "joint" or "integrated" 4-year programs, we set out to identify applicant preferences regarding the match and their experiences with two residency application systems: (1) the Central Application Service (CAS) and (2) the Electronic Residency Application Service (ERAS).

Design: Retrospective repeated cross-sectional survey.

Methods: A 15-question online survey was sent to the 196 and 461 applicants to the 2019 and 2020 ophthalmology match cycles, respectively. Questions assessed user experiences with specific components of both application services and evaluated preferences regarding the future of the ophthalmology match.

Results: Responses were received from 208 (32%) applicants. A majority of users had positive experiences with both application services: 162 (78%) applicants had a positive experience with the CAS, compared with 111 (53%) for ERAS. When compared directly, applicants favored the CAS (60%) over ERAS (21%). Furthermore, 108 (52%) respondents stated that they would prefer ophthalmology continue to use both the CAS and ERAS, while 47 (23%) respondents indicated a desire for the CAS to become the only application system for both matches.

Conclusion: Although half of all respondents prefer that both the CAS and ERAS systems continue to be used for the match process, many express a desire for a single matching program. As ophthalmology residency programs move to joint and integrated 4-year programs, the complexity of matching will increase. Further evaluation of applicant preferences during this transition phase is needed, as applicants are required to apply to a variety of different joint and integrated internship and ophthalmology programs.
Affiliation(s)
- Shazia Dharssi
- Department of Ophthalmology, Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Fasika A. Woreta
- Department of Ophthalmology, Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Michael V. Boland
- Department of Ophthalmology, Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, Maryland
4.
Hall MM, Lewis JJ, Joseph JW, Ketterer AR, Rosen CL, Dubosh NM. Standardized Video Interview Scores Correlate Poorly with Faculty and Patient Ratings. West J Emerg Med 2019;21:145-148. PMID: 31913835; PMCID: PMC6948708; DOI: 10.5811/westjem.2019.11.44054.
Abstract
The Standardized Video Interview (SVI) was developed by the Association of American Medical Colleges to assess professionalism, communication, and interpersonal skills of residency applicants. How SVI scores compare with other measures of these competencies is unknown. The goal of this study was to determine whether there is a correlation between the SVI score and both faculty and patient ratings of these competencies in emergency medicine (EM) applicants. This was a retrospective analysis of a prospectively collected dataset of medical students. Students enrolled in the fourth-year EM clerkship at our institution and who applied to the EM residency Match were included. We collected faculty ratings of the students’ professionalism and patient care/communication abilities as well as patient ratings using the Communication Assessment Tool (CAT) from the clerkship evaluation forms. Following completion of the clerkship, students applying to EM were asked to voluntarily provide their SVI score to the study authors for research purposes. We compared SVI scores with the students’ faculty and patient scores using Spearman’s rank correlation. Of the 43 students from the EM clerkship who applied in EM during the 2017–2018 and 2018–2019 application cycles, 36 provided their SVI scores. All 36 had faculty evaluations and 32 had CAT scores available. We found that SVI scores did not correlate with faculty ratings of professionalism (rho = 0.09, p = 0.13), faculty assessment of patient care/communication (rho = 0.12, p = 0.04), or CAT scores (rho = 0.11, p = 0.06). Further studies are needed to validate the SVI and determine whether it is indeed a predictor of these competencies in residency.
Affiliation(s)
- Matthew M Hall
- Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts
- Jason J Lewis
- Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts
- Joshua W Joseph
- Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts
- Andrew R Ketterer
- Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts
- Carlo L Rosen
- Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts
- Nicole M Dubosh
- Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts
5.
Hanson MD, Eva KW. A Reflection Upon the Impact of Early 21st-Century Technological Innovations on Medical School Admissions. Acad Med 2019;94:640-644. PMID: 30640267; DOI: 10.1097/acm.0000000000002590.
Abstract
The authors describe influences associated with the incorporation of modern technologies into medical school admissions processes. Their purpose is not to critique or support specific technologies but, rather, to prompt reflection on the evolution that is afoot. Technology is now integral to the administration of multiple admissions tools, including the Medical College Admission Test, situational judgment tests, and standardized video interviews. Consequently, today's admissions landscape is transforming into an online, globally interconnected marketplace for health professions admissions tools. Academic capitalism and distance-based technologies combine to enable global marketing and dissemination of admissions tests beyond the national jurisdictions in which they are designed. As predicted by disruptive business theory, they are becoming key drivers of transformative change. The seeds of technological disruption are present now rather than something to be wary of in the future. The authors reflect on this transformation and the need for tailoring test modifications to address issues of medical student diversity and social responsibility. They comment on the online assessment of applicants' personal competencies and the potential detriments if this method were to replace admissions methods involving human contact, given the ease with which institutions can implement online methods without cost to themselves and without adequate consideration of measurement utility or contextual appropriateness. The authors advocate for socially responsible academic capitalism within this interconnected admissions marketplace: attending to today's transformative challenges may inform how health professions education responds to tomorrow's admissions technologies and, in turn, how tomorrow's health professionals respond to their patients' needs.
Affiliation(s)
- Mark D Hanson
- Child and adolescent psychiatrist, Hospital for Sick Children, and professor, Department of Psychiatry, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada. ORCID: https://orcid.org/0000-0002-0820-4521
- K.W. Eva
- Associate director and senior scientist, Centre for Health Education Scholarship, and professor and director of education research and scholarship, Department of Medicine, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada. ORCID: https://orcid.org/0000-0002-8672-2500
6.
Taira T, Santen SA, Roberts NK. Defining the "Problem Resident" and the Implications of the Unfixable Problem: The Rationale for a "Front-door" Solution. West J Emerg Med 2019;20:43-49. PMID: 30643600; PMCID: PMC6324719; DOI: 10.5811/westjem.2018.11.39867.
Abstract
Introduction: Problem residents are common in graduate medical education, yet little is known about their characteristics, their deficits, and the consequences for emergency medicine (EM) residencies. The American Board of Internal Medicine (ABIM) defines a problem resident as "a trainee who demonstrates a significant enough problem that requires intervention by someone of authority, usually the program director [PD] or chief resident." Although comprehensive, this definition lacks specificity. Our study seeks to add granularity and nuance to the definition of "problem resident," which can then be used to guide the recruitment, selection, and training of residents.

Methods: We conducted semi-structured interviews with a convenience sample of EM PDs between 2011 and 2012. We performed qualitative thematic analysis of the resulting transcripts based on the principles of grounded theory, reaching thematic sufficiency after 17 interviews. Interviews were coded as a team through consensus.

Results: The analysis identified diversity in the type, severity, fixability, and attribution of problems among problem residents. PDs applied a variety of thresholds in defining a problem resident, with many directly rejecting the ABIM definition. There was consistency in defining academic problems and some medical problems as "fixable"; in contrast, personality problems were consistently defined as "non-fixable." Despite the diversity of definitions, there was consensus that residents who caused "turbulence" were problem residents.

Conclusion: The ABIM definition of the problem resident captures trainees whom many PDs do not consider problem residents. We propose an alternative definition: "a resident with a negative sphere of influence beyond their personal struggle." This definition acknowledges the identified themes of turbulence and the diversity of thresholds. Further, the combination of PDs' unwillingness to terminate trainees and the presence of non-fixable problems implies the need for a "front-door" solution that emphasizes personality issues, potentially at the expense of academic potential. Such a solution depends on the commitment of all stakeholders, including medical schools, the Association of American Medical Colleges, and PDs.
Affiliation(s)
- Taku Taira
- LAC+USC Medical Center, Department of Emergency Medicine, Los Angeles, California; Stony Brook University Medical Center, Department of Emergency Medicine, Stony Brook, New York
- Sally A Santen
- Virginia Commonwealth University School of Medicine, Department of Emergency Medicine, Richmond, Virginia
- Nicole K Roberts
- The City University of New York (CUNY) School of Medicine, Department of Medical Education, New York, New York
7.
Schnapp BH, Ritter D, Kraut AS, Fallon S, Westergaard MC. Assessing Residency Applicants' Communication and Professionalism: Standardized Video Interview Scores Compared to Faculty Gestalt. West J Emerg Med 2018;20:132-137. PMID: 30643616; PMCID: PMC6324715; DOI: 10.5811/westjem.2018.10.39709.
Abstract
Introduction: The Association of American Medical Colleges has introduced the Standardized Video Interview (SVI) to assess residency applicants' communication and professionalism skills, allowing a more holistic view of applicants beyond academic performance. Initial data suggest that SVI scores are not correlated with academic performance and provide a new measure of applicant attributes. It is not currently known how the SVI compares with existing metrics for assessing communication and professionalism during the interview process.

Methods: Applicants to the University of Wisconsin Emergency Medicine Residency program were invited and interviewed without use of SVI scores or videos. All faculty interviewers were blinded to applicants' SVI information and asked to rate each applicant's communication and professionalism on a scale from 1–25 (faculty gestalt score), analogous to the 6–30 scoring used by the SVI. We transformed SVI scores to our 1–25 system (transformed SVI score) for ease of comparison and compared them with faculty gestalt scores as well as with applicants' overall score for all components of their interview day (interview score).

Results: We collected data for 125 residency candidates. Each applicant received a faculty gestalt score from up to four faculty interviewers. There was no significant correlation of SVI scores with faculty gestalt scores (Spearman's rank correlation coefficient [rs](123) = 0.09, p = 0.30) and no correlation with the overall interview score (rs(123) = 0.01, p = 0.93). Faculty gestalt scores correlated positively with interview scores (rs(123) = 0.65, p < 0.01).

Conclusion: SVI scores show no significant correlation with faculty gestalt scores of communication and professionalism. This could relate to bias introduced by knowledge of an applicant's academic performance, different types of questions asked by faculty interviewers, or a lack of uniform criteria by which faculty assess these competencies. Further research is needed to determine whether SVI scores or faculty gestalt correlate with performance during residency.
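The two quantitative steps this abstract describes, rescaling SVI scores from the AAMC 6–30 range onto a 1–25 scale and computing Spearman's rank correlation against interviewer ratings, can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the helper names and all scores are invented, and ties are handled with average ranks, a common convention the paper does not specify.

```python
def rescale(svi_score, old_lo=6, old_hi=30, new_lo=1, new_hi=25):
    """Linearly map a score from [old_lo, old_hi] onto [new_lo, new_hi].
    Both ranges span 24 points, so this reduces to subtracting 5."""
    return new_lo + (svi_score - old_lo) * (new_hi - new_lo) / (old_hi - old_lo)

def _ranks(values):
    """1-based ranks; tied values share the average of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

svi = [18, 22, 25, 19, 28, 21]        # invented SVI scores (6-30 scale)
faculty = [14, 12, 20, 15, 17, 13]    # invented faculty gestalt scores (1-25)
transformed = [rescale(s) for s in svi]  # 18 -> 13.0, 22 -> 17.0, ...
rho = spearman_rho(transformed, faculty)
```

Because the rescaling is monotone, it leaves the ranks, and therefore rho, unchanged; it only puts the two instruments on the same scale for side-by-side reading, which matches the paper's stated motivation of "ease of comparison."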
Affiliation(s)
- Benjamin H Schnapp
- University of Wisconsin, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Daniel Ritter
- University of Wisconsin, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Aaron S Kraut
- University of Wisconsin, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Sarah Fallon
- University of Wisconsin, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
- Mary C Westergaard
- University of Wisconsin, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin