1
Ellison HB, Grabowski CJ, Schmude M, Costa JB, Naemi B, Schmidt M, Patel D, Westervelt M. Evaluating a Situational Judgment Test for Use in Medical School Admissions: Two Years of AAMC PREview Exam Administration Data. Acad Med 2024;99:183-191. [PMID: 37976531] [DOI: 10.1097/acm.0000000000005548]
Abstract
PURPOSE: To examine the relationship between Association of American Medical Colleges (AAMC) Professional Readiness Exam (PREview) scores and other admissions data, group differences in mean PREview scores, and whether adding a new assessment tool affected the volume and composition of applicant pools.
METHOD: Data from the 2020 and 2021 PREview exam administrations were analyzed. Two U.S. schools participated in the PREview pilot in 2020 and 6 U.S. schools participated in 2021. PREview scores were paired with data from the American Medical College Application Service (undergraduate grade point averages [GPAs], Medical College Admission Test [MCAT] scores, race, and ethnicity) and from participating schools (interview ratings).
RESULTS: Data included 19,525 PREview scores from 18,549 unique PREview examinees. Correlations between PREview scores and undergraduate GPAs (r = .16) and MCAT scores (r = .29) were small and positive. Correlations between PREview scores and interview ratings were also small and positive, ranging from .09 to .14 after correcting for range restriction. Small differences in mean PREview scores were observed between White and Black or African American examinees and between White and Hispanic, Latino, or Spanish-origin examinees. The addition of the PREview exam did not substantially change the volume or composition of participating schools' applicant pools.
CONCLUSIONS: Results suggest the PREview exam measures knowledge of competencies distinct from those captured by other measures used in medical school admissions. Observed group differences were smaller than those seen with traditional academic assessments and evaluations. The addition of the PREview exam did not substantially change the overall volume of applications or the proportions of out-of-state, underrepresented-in-medicine, or lower socioeconomic status applicants. While more research is needed, these results suggest the PREview exam may provide unique information to the admissions process without adversely affecting applicant pools.
2
Kennedy AB, Riyad CNY, Ellis R, Fleming PR, Gainey M, Templeton K, Nourse A, Hardaway V, Brown A, Evans P, Natafgi N. Evaluating a Global Assessment Measure Created by Standardized Patients for the Multiple Mini Interview in Medical School Admissions: Mixed Methods Study. J Particip Med 2022;14:e38209. [PMID: 36040776] [PMCID: PMC9472042] [DOI: 10.2196/38209] [Received: 03/23/2022] [Revised: 07/19/2022] [Accepted: 08/03/2022]
Abstract
Background: Standardized patients (SPs) are essential stakeholders in the multiple mini interviews (MMIs) increasingly used to assess medical school applicants' interpersonal skills. However, there is little evidence for their inclusion in the development of assessment instruments.
Objective: This study aimed to describe the process, and evaluate the impact, of having SPs co-design and cocreate a global measurement question that assesses medical school applicants' readiness for medical school and acceptance status.
Methods: This study used an exploratory, sequential, mixed methods design. First, we evaluated the initial MMI program and determined the next quality improvement steps. Second, we held a collaborative workshop with SPs to codevelop the assessment question and response options. Third, we evaluated the created question and the additional MMI rubric items through statistical tests based on data from 1084 applicants across 3 cohorts, starting with the 2018-2019 academic year. The internal reliability of the MMI was measured using Cronbach α, and its prediction of admission status was tested using forward stepwise binary logistic regression.
Results: Program evaluation indicated the need for an additional quantitative question assessing applicant readiness for medical school. In total, 3 simulation specialists, 2 researchers, and 21 SPs participated in a workshop that produced the final global assessment question and response options. Cronbach α was >0.8 overall and in each cohort year. The final stepwise logistic model for all cohorts combined was statistically significant (P<.001), explained 9.2% of the variance in acceptance status (R2), and correctly classified 65.5% (637/972) of cases. The final model consisted of 3 variables: empathy, rank of readiness, and opening the encounter.
Conclusions: The collaborative nature of this project between stakeholders, including nonacademics and researchers, was vital to its success. The SP-created question had a significant impact on the final model predicting acceptance to medical school, indicating that SPs bring a critical perspective that can improve the process of evaluating medical school applicants.
Affiliation(s)
- Ann Blair Kennedy
  - Biomedical Sciences Department, School of Medicine Greenville, University of South Carolina, Greenville, SC, United States
  - Patient Engagement Studio, University of South Carolina, Greenville, SC, United States
  - Family Medicine Department, Prisma Health, Greenville, SC, United States
- Cindy Nessim Youssef Riyad
  - School of Medicine Greenville, University of South Carolina, Greenville, SC, United States
  - Hospital Based Accreditation, Accreditation Council of Graduate Medical Education, Chicago, IL, United States
- Ryan Ellis
  - School of Medicine Greenville, University of South Carolina, Greenville, SC, United States
- Perry R Fleming
  - Patient Engagement Studio, University of South Carolina, Greenville, SC, United States
  - School of Medicine Columbia, University of South Carolina, Columbia, SC, United States
- Mallorie Gainey
  - School of Medicine Greenville, University of South Carolina, Greenville, SC, United States
- Kara Templeton
  - Prisma Health-Upstate Simulation Center, School of Medicine Greenville, University of South Carolina, Greenville, SC, United States
- Anna Nourse
  - Patient Engagement Studio, University of South Carolina, Greenville, SC, United States
- Virginia Hardaway
  - Admissions and Registration, School of Medicine Greenville, University of South Carolina, Greenville, SC, United States
- April Brown
  - Prisma Health-Upstate Simulation Center, School of Medicine Greenville, University of South Carolina, Greenville, SC, United States
- Pam Evans
  - Patient Engagement Studio, University of South Carolina, Greenville, SC, United States
  - Prisma Health-Upstate Simulation Center, School of Medicine Greenville, University of South Carolina, Greenville, SC, United States
- Nabil Natafgi
  - Patient Engagement Studio, University of South Carolina, Greenville, SC, United States
  - Health Services, Policy, Management Department, Arnold School of Public Health, University of South Carolina, Columbia, SC, United States