1
Ghaffari-Rafi A, Lee RE, Fang R, Miles JD. Multivariable analysis of factors associated with USMLE scores across U.S. medical schools. BMC Medical Education 2019; 19:154. [PMID: 31109315 PMCID: PMC6528346 DOI: 10.1186/s12909-019-1605-z]
Abstract
BACKGROUND Gauging medical education quality has always remained challenging. Many studies have examined predictors of standardized exam performance; however, data sets do not distinguish by institution or curriculum. Our objective is to present a summary of variables associated with United States Medical Licensing Examination (USMLE) scores, and thus to identify institutions (and therefore curricula) that deviate from trend lines by producing higher USMLE scores despite lower entrance grade point averages and Medical College Admission Test (MCAT) scores. METHODS Data were obtained from U.S. News and World Report's 2014 evaluation of allopathic U.S. medical schools. A univariate analysis was performed first for each variable, using the two-sample t-test or Wilcoxon rank-sum test for categorical variables and Pearson or Spearman correlation coefficients for continuous variables. A multivariable linear regression model was developed to identify the factors contributing to USMLE scores. All statistical analyses were two-sided and performed using SAS software version 9.4 (SAS Institute Inc., Cary, NC). RESULTS Univariate analysis reveals a significant association of USMLE Step 1 and 2 scores with MCAT scores, grade point averages, school type (private vs. public), full-time faculty-to-student ratio, National Institutes of Health (NIH) funds, residency director assessment score, peer assessment score, and class size. Of these nine variables, MCAT scores and Step 1 scores display the strongest correlation (corr = 0.72, P < .0001). Multivariable analysis also supports a significant association between MCAT scores and Step scores, while NIH funding size demonstrates a negative correlation with USMLE Step 2 scores. Although MCAT scores and NIH funds are significantly associated with USMLE performance, six outlier institutions were identified that produce higher USMLE scores than trend-line predictions. CONCLUSIONS Outlier institutions produce USMLE scores that do not follow expected trend lines. Their performance might be explainable by differences in curriculum. Having identified these institutions, their curricula can be further studied to determine what factors enhance student learning.
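Editor's note: the univariate analysis above pairs Pearson (linear) correlation with Spearman (rank) correlation. A minimal sketch of the distinction, using invented school-level values (not the study's data):

```python
# Pearson vs. Spearman correlation on a small hypothetical dataset
# (illustrative values only; not the study's data).
from statistics import mean

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def ranks(x):
    # average ranks, handling ties
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and x[order[j + 1]] == x[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Spearman is Pearson applied to the ranks
    return pearson(ranks(x), ranks(y))

mcat = [28, 30, 31, 33, 35, 36]         # hypothetical mean MCAT per school
step1 = [215, 222, 228, 231, 240, 243]  # hypothetical mean Step 1 per school
print(round(pearson(mcat, step1), 2))   # 0.99
print(round(spearman(mcat, step1), 2))  # 1.0
```

Because the invented data are strictly monotonic, Spearman is exactly 1 while Pearson reflects the slight departure from linearity.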
Affiliation(s)
- Arash Ghaffari-Rafi
- University of Hawai‘i at Mānoa John A. Burns School of Medicine, Honolulu, Hawai‘i, USA
- Queen Square Institute of Neurology, University College London, London, UK
- Rachel Elizabeth Lee
- University of Hawai‘i at Mānoa John A. Burns School of Medicine, Honolulu, Hawai‘i, USA
- Rui Fang
- University of Hawai‘i at Mānoa John A. Burns School of Medicine, Honolulu, Hawai‘i, USA
- J. Douglas Miles
- University of Hawai‘i at Mānoa John A. Burns School of Medicine, Honolulu, Hawai‘i, USA
2
Linsenmeyer M, Ridpath L. Nelson-Denny Reading Test Scores as a Predictor of Student Success in Osteopathic Medical Education. Journal of the American Osteopathic Association 2019; 119:189-197. [PMID: 30801115 DOI: 10.7556/jaoa.2019.030]
Abstract
Background Reading skills are crucial in medical school, where students are expected to absorb an onslaught of new and complex material. Studies on reading assessment in osteopathic medical education are lacking. Objective To address gaps in the literature related to reading assessment and to investigate the correlation of the Nelson-Denny Reading Test with various performance indicators in osteopathic medical education. Methods The West Virginia School of Osteopathic Medicine administered the Nelson-Denny Reading Test to first- and second-year students between 2015 and 2017. Raw scores were translated into the percentile rank, scale score, grade equivalent score, and stanine score based on guidelines supplied with the Nelson-Denny Reading Test. These translated scores were compared with Medical College Admission Test (MCAT) scores, first- and second-year performance on course examinations, Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA) Level 1 scores, and scores provided in a 2002 study by Haught and Walls. Results A total of 623 students took the first-year Nelson-Denny Reading Test, and 408 took both the first- and second-year Nelson-Denny Reading Test. Findings showed a large correlation between the Nelson-Denny Reading Test and the verbal reasoning section (r=0.56 for the class of 2020 and 0.46 for the class of 2021) of the old MCAT (before 2015) and the reasoning skills section (r=0.42 for the class of 2020 and 0.49 for the class of 2021) of the new MCAT (released in 2015). There were no correlations with first- and second-year course examination scores or COMLEX-USA Level 1 scores. The Nelson-Denny Reading Test scores reported by Haught and Walls for medical students and health professional students were slightly higher than those found for osteopathic medical students in this study. Conclusion The reasoning skills section of the new MCAT could serve as a good proxy for a reading test. There were no correlations between the Nelson-Denny Reading Test and performance in the first 2 years of medical school or COMLEX-USA Level 1 performance. Further research can strengthen the findings and determine whether correlations exist with clinical performance.
3
Gauer JL, Wolff JM, Jackson JB. Do MCAT scores predict USMLE scores? An analysis on 5 years of medical student data. Medical Education Online 2016; 21:31795. [PMID: 27702431 PMCID: PMC5045966 DOI: 10.3402/meo.v21.31795]
Abstract
INTRODUCTION The purpose of this study was to determine the associations and predictive values of Medical College Admission Test (MCAT) component and composite scores prior to 2015 with U.S. Medical Licensure Exam (USMLE) Step 1 and Step 2 Clinical Knowledge (CK) scores, with a focus on whether students scoring low on the MCAT were particularly likely to continue to score low on the USMLE exams. METHOD Multiple linear regression, correlation, and chi-square analyses were performed to determine the relationship between MCAT component and composite scores and USMLE Step 1 and Step 2 CK scores from five graduating classes (2011-2015) at the University of Minnesota Medical School (N=1,065). RESULTS The multiple linear regression analyses were both significant (p<0.001). The three MCAT component scores together explained 17.7% of the variance in Step 1 scores (p<0.001) and 12.0% of the variance in Step 2 CK scores (p<0.001). In the chi-square analyses, significant, albeit weak, associations were observed between almost all MCAT component scores and USMLE scores (Cramer's V ranged from 0.05 to 0.24). DISCUSSION Each of the MCAT component scores was significantly associated with USMLE Step 1 and Step 2 CK scores, although the effect size was small. Being in the top or bottom scoring range of the MCAT exam was predictive of being in the top or bottom scoring range of the USMLE exams, although the strengths of the associations were weak to moderate. These results indicate that MCAT scores are predictive of student performance on the USMLE exams, but, given the small effect sizes, should be considered as part of a holistic view of the student.
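Editor's note: the chi-square effect size reported above is Cramér's V, which rescales the chi-square statistic to a 0-1 range. A minimal sketch on a hypothetical 2×2 table (the bands and counts are invented, not the study's data):

```python
# Cramér's V from a contingency table (hypothetical counts, not study data).
def chi_square(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / n  # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    return chi2, n

def cramers_v(table):
    chi2, n = chi_square(table)
    k = min(len(table), len(table[0]))  # smaller table dimension
    return (chi2 / (n * (k - 1))) ** 0.5

# rows: bottom/top MCAT band; cols: bottom/top Step 1 band (invented)
table = [[60, 40],
         [35, 65]]
print(round(cramers_v(table), 2))  # 0.25
```

On this invented table V is about 0.25, i.e. at the upper end of the "weak" 0.05-0.24 range the abstract describes.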
4
Hu Y, Martindale JR, LeGallo RD, White CB, McGahren ED, Schroen AT. Relationships between preclinical course grades and standardized exam performance. Advances in Health Sciences Education: Theory and Practice 2016; 21:389-399. [PMID: 26363626 DOI: 10.1007/s10459-015-9637-6]
Abstract
Success in residency matching is largely contingent upon standardized exam scores. Identifying predictors of standardized exam performance could promote primary intervention and lead to design insights for preclinical courses. We hypothesized that clinically relevant courses with an emphasis on higher-order cognitive understanding are most strongly associated with performance on United States Medical Licensing Examination Step exams and National Board of Medical Examiners clinical subject exams. Academic data from students between 2007 and 2012 were collected. Preclinical course scores and standardized exam scores were used for statistical modeling with multiple linear regression. Preclinical courses were categorized as having either a basic science or a clinical knowledge focus. Medical College Admission Test scores were included as an additional predictive variable. The study sample comprised 795 graduating medical students. Median score on Step 1 was 234 (interquartile range 219-245.5), and 10.2% (81/795) scored lower than one standard deviation below the national average (205). Pathology course score was the strongest predictor of performance on all clinical subject exams and Step exams, outperforming the Medical College Admission Test in strength of association. Using Pathology score <75 as a screening metric for Step 1 score <205 results in sensitivity and specificity of 37% and 97%, respectively, and a likelihood ratio of 11.9. Performance in Pathology, a clinically relevant course with case-based learning, is significantly related to subsequent performance on standardized exams. Multiple linear regression is useful for identifying courses that have potential as risk stratifiers.
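Editor's note: the screening arithmetic above can be checked directly, since the positive likelihood ratio is sensitivity/(1 − specificity). Using the rounded percentages reported:

```python
# Positive likelihood ratio for the Pathology-score screen.
# Sensitivity and specificity are as reported in the abstract; the paper's
# 11.9 presumably comes from unrounded raw counts rather than 37%/97%.
sensitivity = 0.37
specificity = 0.97
lr_pos = sensitivity / (1 - specificity)
print(round(lr_pos, 1))  # 12.3
```

The rounded inputs give ≈12.3 rather than the reported 11.9, a discrepancy consistent with rounding of the published sensitivity and specificity.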
Affiliation(s)
- Yinin Hu
- Department of Surgery, University of Virginia School of Medicine, P.O. Box 800679, Charlottesville, VA, 22908-0709, USA.
- James R Martindale
- Undergraduate Medical Education, University of Virginia School of Medicine, Charlottesville, VA, USA
- Robin D LeGallo
- Department of Pathology, University of Virginia School of Medicine, Charlottesville, VA, USA
- Casey B White
- Undergraduate Medical Education, University of Virginia School of Medicine, Charlottesville, VA, USA
- Eugene D McGahren
- Department of Surgery, University of Virginia School of Medicine, P.O. Box 800679, Charlottesville, VA, 22908-0709, USA
- Anneke T Schroen
- Department of Surgery, University of Virginia School of Medicine, P.O. Box 800679, Charlottesville, VA, 22908-0709, USA
5
Bills JL, VanHouten J, Grundy MM, Chalkley R, Dermody TS. Validity of the Medical College Admission Test for predicting MD-PhD student outcomes. Advances in Health Sciences Education: Theory and Practice 2016; 21:33-49. [PMID: 25952644 PMCID: PMC4749640 DOI: 10.1007/s10459-015-9609-x]
Abstract
The Medical College Admission Test (MCAT) is a quantitative metric used by MD and MD-PhD programs to evaluate applicants for admission. This study assessed the validity of the MCAT in predicting training performance measures and career outcomes for MD-PhD students at a single institution. The study population consisted of 153 graduates of the Vanderbilt Medical Scientist Training Program (combined MD-PhD program) who matriculated between 1963 and 2003 and completed dual-degree training. This population was divided into three cohorts corresponding to the version of the MCAT taken at the time of application. Multivariable regression (logistic for binary outcomes and linear for continuous outcomes) was used to analyze factors associated with outcome measures. The MCAT score and undergraduate GPA (uGPA) were treated as independent variables; medical and graduate school grades, time-to-PhD defense, USMLE scores, publication number, and career outcome were dependent variables. For cohort 1 (1963-1977), MCAT score was not associated with any assessed outcome, although uGPA was associated with medical school preclinical GPA (mspGPA) and graduate school GPA (gsGPA). For cohort 2 (1978-1991), MCAT score was associated with USMLE Step 2 score and inversely correlated with publication number, and uGPA was associated with mspGPA and medical school clinical GPA (mscGPA). For cohort 3 (1992-2003), the MCAT score was associated with mscGPA, and uGPA was associated with gsGPA. Overall, MCAT score and uGPA were inconsistent or weak predictors of training metrics and career outcomes for this population of MD-PhD students.
Affiliation(s)
- James L Bills
- Medical Scientist Training Program, Vanderbilt University School of Medicine, D7235 Medical Center North, 1161 21st Avenue South, Nashville, TN, 37232-2581, USA
- Department of Medical Education and Administration, Vanderbilt University School of Medicine, Nashville, TN, 37232, USA
- Jacob VanHouten
- Medical Scientist Training Program, Vanderbilt University School of Medicine, D7235 Medical Center North, 1161 21st Avenue South, Nashville, TN, 37232-2581, USA
- Department of Biomedical Informatics, Vanderbilt University School of Medicine, Nashville, TN, 37232, USA
- Department of Biostatistics, Vanderbilt University School of Medicine, Nashville, TN, 37232, USA
- Michelle M Grundy
- Medical Scientist Training Program, Vanderbilt University School of Medicine, D7235 Medical Center North, 1161 21st Avenue South, Nashville, TN, 37232-2581, USA
- Department of Medical Education and Administration, Vanderbilt University School of Medicine, Nashville, TN, 37232, USA
- Roger Chalkley
- Department of Medical Education and Administration, Vanderbilt University School of Medicine, Nashville, TN, 37232, USA
- Terence S Dermody
- Medical Scientist Training Program, Vanderbilt University School of Medicine, D7235 Medical Center North, 1161 21st Avenue South, Nashville, TN, 37232-2581, USA
- Department of Pathology, Microbiology, and Immunology, Vanderbilt University School of Medicine, Nashville, TN, 37232, USA
- Department of Pediatrics, Vanderbilt University School of Medicine, Nashville, TN, 37232, USA
- Elizabeth B. Lamb Center for Pediatric Research, Vanderbilt University School of Medicine, Nashville, TN, 37232, USA
6
Affiliation(s)
- Jules L Dienstag
- Office of the Dean for Medical Education, Harvard Medical School, Boston, USA
7
Callahan CA, Hojat M, Veloski J, Erdmann JB, Gonnella JS. The predictive validity of three versions of the MCAT in relation to performance in medical school, residency, and licensing examinations: a longitudinal study of 36 classes of Jefferson Medical College. Academic Medicine: Journal of the Association of American Medical Colleges 2010; 85:980-7. [PMID: 20068426 DOI: 10.1097/acm.0b013e3181cece3d]
Abstract
PURPOSE The Medical College Admission Test (MCAT) has undergone several revisions for content and validity since its inception. With another comprehensive review pending, this study examines changes in the predictive validity of the MCAT's three recent versions. METHOD Study participants were 7,859 matriculants in 36 classes entering Jefferson Medical College between 1970 and 2005; 1,728 took the pre-1978 version of the MCAT; 3,032 took the 1978-1991 version, and 3,099 took the post-1991 version. MCAT subtest scores were the predictors, and performance in medical school, attrition, scores on the medical licensing examinations, and ratings of clinical competence in the first year of residency were the criterion measures. RESULTS No significant improvement in validity coefficients was observed for performance in medical school or residency. Validity coefficients for all three versions of the MCAT in predicting Part I/Step 1 remained stable (in the mid-0.40s, P < .01). A systematic decline was observed in the validity coefficients of the MCAT versions in predicting Part II/Step 2: it started at 0.47 for the pre-1978 version, decreased to between 0.42 and 0.40 for the 1978-1991 version, and fell to 0.37 for the post-1991 version. Validity coefficients for the MCAT versions in predicting Part III/Step 3 remained near 0.30. These were generally larger for women than for men. CONCLUSIONS Although the findings support the short- and long-term predictive validity of the MCAT, opportunities to strengthen it remain. Subsequent revisions should increase the test's ability to predict performance on United States Medical Licensing Examination Step 2 and must minimize the differential validity for gender.
8
Collin VT, Violato C, Hecker K. Aptitude, achievement and competence in medicine: a latent variable path model. Advances in Health Sciences Education: Theory and Practice 2009; 14:355-366. [PMID: 18481186 DOI: 10.1007/s10459-008-9121-7]
Abstract
To develop and test a latent variable path model of general achievement, aptitude for medicine, and competence in medicine, employing data from the Medical College Admission Test (MCAT), pre-medical undergraduate grade point average (UGPA), demographic characteristics, and measures of competence (United States Medical Licensing Examination [USMLE] Steps 1, 2, and 3). Data were gathered on 839,710 participants from 1991 to 2000 on demographic and school variables, UGPA, MCAT subtest scores, and USMLE Steps 1, 2, and 3. Subsets of the total 839,710 participants included in the database were used for the various analyses and the testing of the latent variable path model. A number of preliminary descriptive and inferential techniques were used to confirm previously hypothesized relationships among the variables of interest to the present study. Through development and testing of a latent variable path model, three latent variables measured by UGPA (general achievement), subscales of the MCAT (aptitude for medicine), and Steps 1, 2, and 3 of the USMLE (competence in medicine) were identified, resulting in a comparative fit index of .932 for the model in a large sample (n = 20,714). In a confirmatory latent variable path model we were able to identify the theoretical constructs aptitude for medicine, general achievement, and competence in medicine, and their interrelationships. These are distinct but interrelated latent variables.
Affiliation(s)
- V Terri Collin
- Department of Surgery, University of Pittsburgh Medical Center, 200 Lothrop St. Room F677PUH, Pittsburgh, PA 15213, USA.
9
Emery JL, Bell JF. The predictive validity of the BioMedical Admissions Test for pre-clinical examination performance. Medical Education 2009; 43:557-64. [PMID: 19493180 DOI: 10.1111/j.1365-2923.2009.03367.x]
Abstract
CONTEXT Some medical courses in the UK have many more applicants than places and almost all applicants have the highest possible previous and predicted examination grades. The BioMedical Admissions Test (BMAT) was designed to assist in the student selection process specifically for a number of 'traditional' medical courses with clear pre-clinical and clinical phases and a strong focus on science teaching in the early years. It is intended to supplement the information provided by examination results, interviews and personal statements. This paper reports on the predictive validity of the BMAT and its predecessor, the Medical and Veterinary Admissions Test. METHODS Results from the earliest 4 years of the test (2000-2003) were matched to the pre-clinical examination results of those accepted onto the medical course at the University of Cambridge. Correlation and logistic regression analyses were performed for each cohort. RESULTS Section 2 of the test ('Scientific Knowledge') correlated more strongly with examination marks than did Section 1 ('Aptitude and Skills'). It also had a stronger relationship with the probability of achieving the highest examination class. CONCLUSIONS The BMAT and its predecessor demonstrate predictive validity for the pre-clinical years of the medical course at the University of Cambridge. The test identifies important differences in skills and knowledge between candidates, not shown by their previous attainment, which predict their examination performance. It is thus a valid source of additional admissions information for medical courses with a strong scientific emphasis when previous attainment is very high.
10
Peskun C, Detsky A, Shandling M. Effectiveness of medical school admissions criteria in predicting residency ranking four years later. Medical Education 2007; 41:57-64. [PMID: 17209893 DOI: 10.1111/j.1365-2929.2006.02647.x]
Abstract
BACKGROUND Medical schools across Canada expend great effort in selecting students from a large pool of qualified applicants. Non-cognitive assessments are conducted by most schools in an effort to ensure that medical students have the personal characteristics of importance in the practice of Medicine. We reviewed the ability of University of Toronto academic and non-academic admission assessments to predict ranking by Internal Medicine and Family Medicine residency programmes. METHODS The study sample consisted of students who had entered the University of Toronto between 1994 and 1998 inclusive, and had then applied through the Canadian resident matching programme to positions in Family or Internal Medicine at the University of Toronto in their graduating year. The value of admissions variables in predicting medical school performance and residency ranking was assessed. RESULTS Ranking in Internal Medicine correlated significantly with undergraduate grade point average (GPA) and the admissions non-cognitive assessment. It also correlated with 2nd-year objective structured clinical examination (OSCE) score, clerkship grade in Internal Medicine, and final grade in medical school. Ranking in Family Medicine correlated with the admissions interview score. It also correlated with 2nd-year OSCE score, clerkship grade in Family Medicine, clerkship ward evaluation in Internal Medicine and final grade in medical school. DISCUSSION The results of this study suggest that cognitive as well as non-cognitive factors evaluated during medical school admission are important in predicting future success in Medicine. The non-cognitive assessment provides additional value to standard academic criteria in predicting ranking by 2 residency programmes, and justifies its use as part of the admissions process.
Affiliation(s)
- Christopher Peskun
- University of Toronto, Toronto, Canada
11
Donnon T, Paolucci EO, Violato C. The predictive validity of the MCAT for medical school performance and medical board licensing examinations: a meta-analysis of the published research. Academic Medicine: Journal of the Association of American Medical Colleges 2007; 82:100-6. [PMID: 17198300 DOI: 10.1097/01.acm.0000249878.25186.b7]
Abstract
PURPOSE To conduct a meta-analysis of published studies to determine the predictive validity of the MCAT on medical school performance and medical board licensing examinations. METHOD The authors included all peer-reviewed published studies reporting empirical data on the relationship between MCAT scores and medical school performance or medical board licensing exam measures. Moderator variables, participant characteristics, and medical school performance/medical board licensing exam measures were extracted and reviewed separately by three reviewers using a standardized protocol. RESULTS Medical school performance measures from 11 studies and medical board licensing examinations from 18 studies, for a total of 23 studies, were selected. A random-effects model meta-analysis of weighted effect sizes (r) resulted in (1) a predictive validity coefficient for the MCAT in the preclinical years of r = 0.39 (95% confidence interval [CI], 0.21-0.54) and on the USMLE Step 1 of r = 0.60 (95% CI, 0.50-0.67); and (2) the biological sciences subtest as the best predictor of medical school performance in the preclinical years (r = 0.32; 95% CI, 0.21-0.42) and on the USMLE Step 1 (r = 0.48; 95% CI, 0.41-0.54). CONCLUSIONS The predictive validity of the MCAT ranges from small to medium for both medical school performance and medical board licensing exam measures. The medical profession is challenged to develop screening and selection criteria with improved validity that can supplement the MCAT as an important criterion for admission to medical schools.
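Editor's note: random-effects pooling of correlations, as used above, is commonly done on the Fisher z scale with a DerSimonian-Laird heterogeneity estimate. A minimal sketch under those assumptions, with invented per-study correlations and sample sizes (not the meta-analysis's data):

```python
# Random-effects pooling of correlations via Fisher z
# (DerSimonian-Laird tau^2; hypothetical inputs, not the published data).
import math

def pool_correlations(rs, ns):
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]  # Fisher z transform
    vs = [1 / (n - 3) for n in ns]                        # within-study variance of z
    w = [1 / v for v in vs]
    z_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, zs))  # heterogeneity Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)              # DerSimonian-Laird tau^2
    w_re = [1 / (v + tau2) for v in vs]                   # random-effects weights
    z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
    return math.tanh(z_re)                                # back-transform to r

rs = [0.55, 0.62, 0.48, 0.66]   # hypothetical per-study correlations
ns = [120, 300, 85, 210]        # hypothetical sample sizes
print(round(pool_correlations(rs, ns), 2))
```

Because the pooled z is a weighted average, the back-transformed estimate always lands between the smallest and largest input correlation; the random-effects weights simply pull it toward an unweighted mean when between-study heterogeneity (tau^2) is large.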
Affiliation(s)
- Tyrone Donnon
- Medical Education and Research Unit, Department of Community Health Sciences, Faculty of Medicine, University of Calgary, Calgary, Canada.
12
Julian ER. Validity of the Medical College Admission Test for predicting medical school performance. Academic Medicine: Journal of the Association of American Medical Colleges 2005; 80:910-7. [PMID: 16186610 DOI: 10.1097/00001888-200510000-00010]
Abstract
PURPOSE Since the introduction of the revised Medical College Admission Test (MCAT®) in 1991, the Association of American Medical Colleges has been investigating the extent to which MCAT scores supplement the power of undergraduate grade point averages (uGPAs) to predict success in medical school. This report is a comprehensive summary of the relationships between MCAT scores and (1) medical school grades, (2) United States Medical Licensing Examination (USMLE) Step scores, and (3) academic distinction or difficulty. METHOD This study followed two cohorts from entrance to medical school through residency. Students from 14 medical schools' 1992 and 1993 entering classes provided data for predicting medical school grades and academic difficulty/distinction, while their peers from all of the U.S. medical schools were used to predict performance on USMLE Steps 1, 2, and 3. Regression analyses assessed the predictive power of combinations of uGPAs, MCAT scores, and undergraduate-institution selectivity. RESULTS Grades were best predicted by a combination of MCAT scores and uGPAs, with MCAT scores providing a substantial increment over uGPAs. MCAT scores were better predictors of USMLE Step scores than were uGPAs, and the combination did little better than MCAT scores alone. The probability of experiencing academic difficulty or distinction tended to vary with MCAT scores. MCAT scores were strong predictors of scores for all three Step examinations, particularly Step 1. CONCLUSIONS MCAT scores almost double the proportion of variance in medical school grades explained by uGPAs, and essentially replace the need for uGPAs in their impressive prediction of Step scores. The MCAT performs well as an indicator of academic preparation for medical school, independent of the school-specific handicaps of uGPAs.
Affiliation(s)
- Ellen R Julian
- Medical College Admission Test, Association of American Medical Colleges, 2450 N Street, N.W., Washington, DC 20037-1127, USA
13
Cunningham KA, DesJardins SL, Christensen MG. Predictive efficacy of Chiropractic College Assessment Test scores in basic science chiropractic education. Journal of Manipulative and Physiological Therapeutics 2005; 28:175-8. [PMID: 15855905 DOI: 10.1016/j.jmpt.2005.02.012]
Abstract
OBJECTIVE To evaluate the ability of the Chiropractic College Assessment Test (CCAT) to explain academic success within a chiropractic basic science curriculum. METHODS The CCAT examination was administered to 202 subjects from 1 chiropractic college on the first day of classes. Zero-order Pearson correlations were used to examine associations among prechiropractic grade point average (GPA), CCAT scores, and basic science GPA. Multiple regression techniques were applied to determine the predictive efficacy of CCAT scores on basic science GPA. RESULTS Prechiropractic GPA correlated with CCAT scores (r = 0.348, P < .001) and with basic science GPA (r = 0.559, P < .001). Correlation was also noted between CCAT scores and basic science GPA (r = 0.537, P < .001). Using multiple regression, together the variables (age, postsecondary education, prechiropractic GPA, and CCAT scores) accounted for a significant portion (R2 = 0.483, P < .001) of the total variance in basic science GPA. Furthermore, the CCAT scores accounted for significant unique explanation (R2 change = 0.081, P < .001) beyond that offered by the traditionally used prechiropractic GPA. CONCLUSION The CCAT examination provides a valuable a priori indicator of success within the basic science curriculum of this particular chiropractic program. Consideration should be given to adopting the CCAT examination as one of a number of heuristic guides students and college officials use in making enrollment decisions.
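Editor's note: the "unique explanation" reported above is an incremental R2 from hierarchical regression: fit the baseline predictor, then add the new one and take the change in R2. For two predictors this can be sketched from pairwise correlations alone, using the standard identity R2 = (r1² + r2² − 2·r1·r2·r12)/(1 − r12²). The data below are invented for illustration, not the study's:

```python
# Incremental R^2 of a second predictor over a single baseline predictor,
# via the two-predictor multiple-correlation identity.
# (Hypothetical data; variable names are illustrative only.)
from statistics import mean

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def r2_two_predictors(y, x1, x2):
    # multiple R^2 for y on (x1, x2), from pairwise correlations
    r1, r2, r12 = pearson(x1, y), pearson(x2, y), pearson(x1, x2)
    return (r1**2 + r2**2 - 2 * r1 * r2 * r12) / (1 - r12**2)

gpa = [3.1, 3.4, 3.0, 3.6, 3.8, 3.2, 3.5, 3.7]    # hypothetical entering GPA
ccat = [48, 55, 50, 60, 66, 47, 58, 63]           # hypothetical CCAT scores
bsgpa = [2.9, 3.3, 3.1, 3.5, 3.9, 3.0, 3.4, 3.8]  # hypothetical basic science GPA

baseline = pearson(gpa, bsgpa) ** 2               # R^2 from GPA alone
full = r2_two_predictors(bsgpa, gpa, ccat)        # R^2 from GPA + CCAT
print(round(full - baseline, 3))                  # unique variance added by CCAT
```

Adding a predictor can never lower the ordinary-least-squares R2, so the printed increment is always non-negative; whether it is statistically significant is a separate F-test not shown here.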
Affiliation(s)
- Kevin A Cunningham
- Academic Affairs, Palmer College of Chiropractic, Davenport, IA 52803, USA.