1
Alomari S, Lubelski D, Feghali J, Brem H, Witham T, Huang J. Impact of virtual vs. in-person interviews among neurosurgery residency applicants. J Clin Neurosci 2022; 101:63-66. [PMID: 35561432] [DOI: 10.1016/j.jocn.2022.05.005]
Abstract
BACKGROUND The interview is considered a key factor in selecting residents across medical and surgical specialties. However, the reliability of the interview process in selecting neurosurgery training program applicants remains largely under-investigated. OBJECTIVE To investigate the reliability of the interview process for neurosurgery residency applicants and to evaluate the impact of virtual interviews on this process. METHODS We analyzed the records of neurosurgery residency applicant interviews at our institution between 2016 and 2021. An average of 20 neurosurgery faculty members (clinical and research) interviewed each applicant and graded them from 1 (best) to 4 (worst). The intraclass correlation coefficient (ICC) and Levene's test were used to assess inter-rater and intra-rater reliability, respectively. RESULTS A total of 214 neurosurgery residency applicants were interviewed at a single institution between 2016 and 2021. The mean applicant rating each year ranged from 1.77 to 1.92. Inter-rater agreement was relatively poor in each year (ICC < 0.5, P < 0.05). For 60% of the raters, the variability of scores changed significantly from year to year (P < 0.05). When the scores submitted during the virtual interview process (2021) were compared with those submitted in previous years (2016-2020), 2 interviewers (10%) showed less variability with the virtual process. CONCLUSION Our analysis found that the current process for selecting neurosurgery residency applicants suffers from poor inter- and intra-rater reliability. Virtual interviews may be part of a cost-effective strategy to improve the reliability of the interview process. Further validation is needed, as well as identification of novel strategies to maximize the reliability of the selection process.
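The inter-rater reliability measure in this abstract can be sketched in a few lines. The one-way ICC(1,1) model, the `icc1` helper name, and the rating matrices below are illustrative assumptions; the paper does not specify which ICC variant it used.

```python
def icc1(ratings):
    """One-way random-effects ICC(1,1) from a subjects-by-raters matrix.

    ratings: list of rows, one row per applicant, one column per rater.
    """
    n = len(ratings)        # number of subjects (applicants)
    k = len(ratings[0])     # number of raters per subject
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]

    # Between-subject and within-subject mean squares (one-way ANOVA)
    ssb = k * sum((m - grand) ** 2 for m in row_means)
    ssw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means) for x in row)
    msb = ssb / (n - 1)
    msw = ssw / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

With perfectly consistent raters the coefficient equals 1; when raters systematically disagree it falls toward 0 or below, the pattern the abstract summarizes as ICC < 0.5.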
Affiliation(s)
- Safwan Alomari
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Daniel Lubelski
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- James Feghali
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Henry Brem
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Timothy Witham
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Judy Huang
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
2
Peeters MJ. Moving beyond Cronbach's Alpha and Inter-Rater Reliability: A Primer on Generalizability Theory for Pharmacy Education. Innov Pharm 2021; 12(1). [PMID: 34007684] [PMCID: PMC8102977] [DOI: 10.24926/iip.v12i1.2131]
Abstract
BACKGROUND When available, empirical evidence should help guide decision-making. Following each administration of a learning assessment, data become available for analysis. For learning assessments, Kane's Framework for Validation can helpfully categorize evidence by inference (i.e., scoring, generalization, extrapolation, implications). Especially for test scores used in high-stakes settings, generalization evidence is critical. While reporting Cronbach's alpha, inter-rater reliability, and other reliability coefficients for a single source of measurement error is somewhat common in pharmacy education, dealing with multiple concurrent sources of measurement error within complex learning assessments is not. Performance-based assessments (e.g., OSCEs) that use raters are inherently complex learning assessments. PRIMER Generalizability Theory (G-Theory) can account for multiple sources of measurement error. G-Theory is a powerful tool that can provide a composite reliability (i.e., generalization evidence) for more complex learning assessments, including performance-based assessments. It can also help educators explore ways to make a learning assessment more rigorous if needed, as well as suggest ways to better allocate resources (e.g., staffing, space, budget). A brief review of G-Theory, focused on pharmacy education, is provided herein. MOVING FORWARD G-Theory has been common and useful in medical education, though it has rarely been used in pharmacy education. Given the similarities in assessment methods among the health professions, G-Theory should prove helpful in pharmacy education as well. Within this Journal, and accompanying this Idea Paper, are multiple reports that demonstrate the use of G-Theory in pharmacy education.
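As a baseline for the single-error-source coefficients this primer moves beyond, Cronbach's alpha can be computed directly from an item-score matrix. The `cronbach_alpha` helper and the data in the test are illustrative, not taken from the paper.

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a respondents-by-items score matrix.

    items: list of rows, one row per respondent, one column per item.
    """
    k = len(items[0])   # number of items
    n = len(items)      # number of respondents

    def var(xs):        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Variance of each item's column, and of the row (total) scores
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

Alpha captures only one source of error (item sampling); the primer's point is that rater- and station-based assessments need a G-Theory variance decomposition instead of a single coefficient like this one.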
3
Reed BN, Noel ZR, Heil EL, Shipper AG, Gardner AK. Surveying the selection landscape: A systematic review of processes for selecting postgraduate year 1 pharmacy residents and key implications. J Am Coll Clin Pharm 2020. [DOI: 10.1002/jac5.1334]
Affiliation(s)
- Brent N. Reed
- Department of Pharmacy Practice and Science, University of Maryland School of Pharmacy, Baltimore, Maryland, USA
- Zachary R. Noel
- Department of Pharmacy Practice and Science, University of Maryland School of Pharmacy, Baltimore, Maryland, USA
- Emily L. Heil
- Department of Pharmacy Practice and Science, University of Maryland School of Pharmacy, Baltimore, Maryland, USA
- Andrea G. Shipper
- Health Sciences and Human Services Library, University of Maryland Baltimore, Baltimore, Maryland, USA
- Aimee K. Gardner
- Health Professions and Surgery, Baylor College of Medicine, Houston, Texas, USA
4
Henneman A, Haines S. Implementation of a modified multiple mini-interview method to assess non-cognitive qualities during resident candidate interviews. Curr Pharm Teach Learn 2020; 12:585-589. [PMID: 32336457] [DOI: 10.1016/j.cptl.2020.01.010]
Abstract
BACKGROUND AND PURPOSE Conventional onsite interview methods often make comparing applicants difficult. The literature has noted that conventional interviews leave room for bias and high inter-rater variability, making non-cognitive attributes difficult to ascertain. In 2016, the residency committee of a small, multi-site, academic-based postgraduate year one residency program implemented a modified multiple mini-interview (MMI) approach as a component of the residency interview process to better characterize candidate attributes. EDUCATIONAL ACTIVITY AND SETTING A modified MMI was developed to assess the non-cognitive attributes of ethical reasoning, communication, and professionalism. Scenarios, scripts, questions, and rubrics were developed by residency committee members. The author of each case was assigned to role-play that scenario with candidates while the other committee members silently observed. Candidates and residency committee members were surveyed to explore their perceptions of the MMI as a component of the residency interview process. FINDINGS Thirty-one candidates have been interviewed since the incorporation of the modified MMI; of those, 20 completed the post-interview survey. A majority of the resident candidates completing the survey (55%) felt they were able to portray their strengths and abilities more effectively than in a conventional interview. All five residency committee members (100%) completed the survey, and all perceived that the modified MMI provided increased confidence in determining candidate ranking. SUMMARY Implementation of a modified MMI approach in an onsite residency interview process assisted residency committee members in assessing non-cognitive attributes and contributed to greater confidence in determining resident candidate ranking.
Affiliation(s)
- Amy Henneman
- Lloyd L. Gregory School of Pharmacy, Palm Beach Atlantic University, 901 S. Flagler Dr., West Palm Beach, FL 33416, United States
- Seena Haines
- Department of Pharmacy Practice, The University of Mississippi School of Pharmacy, 2500 North State Street, Jackson, MS 39216, United States
5
Austin Szwak J, Bondi DS, Knoebel R, Soni HP. Utility of clinical-focused multiple mini interviews during postgraduate year one pharmacy residency interviews. J Am Coll Clin Pharm 2020. [DOI: 10.1002/jac5.1235]
Affiliation(s)
- Jennifer Austin Szwak
- Department of Pharmacy, The University of Chicago Medicine, Chicago, Illinois, United States
- Deborah S. Bondi
- Department of Pharmacy, The University of Chicago Medicine, Chicago, Illinois, United States
- Randall Knoebel
- Department of Pharmacy, The University of Chicago Medicine, Chicago, Illinois, United States
- Hailey P. Soni
- Department of Pharmacy, The University of Chicago Medicine, Chicago, Illinois, United States
6
McLaughlin MM, Borchert JS, Wilson C, Jensen AO, Gettig JP. Effect of application score strategy on interviews offered to postgraduate year 1 pharmacy residency applicants. J Am Pharm Assoc (2003) 2018; 58:84-88. [DOI: 10.1016/j.japh.2017.10.008]
7
Peeters MJ, Martin BA. Validation of learning assessments: A primer. Curr Pharm Teach Learn 2017; 9:925-933. [PMID: 29233326] [DOI: 10.1016/j.cptl.2017.06.001]
Abstract
The Accreditation Council for Pharmacy Education's Standards 2016 has placed greater emphasis on validating educational assessments. In this paper, we describe validity, reliability, and validation principles, drawing attention to the conceptual change that highlights one validity supported by multiple sources of evidence; to this end, we recommend abandoning the historical (and confusing) terminology associated with the term validity. Further, we describe and apply Kane's framework (scoring, generalization, extrapolation, and implications) for the process of validation, with its inferences and conclusions drawn from varied uses of assessment instruments by different colleges and schools of pharmacy. We then offer five practical recommendations that can improve the reporting of validation evidence in the pharmacy education literature, and describe their application with examples of validation evidence in the context of pharmacy education. After reading this article, the reader should be able to understand the current concept of validation and use a framework to validate and communicate their own institution's learning assessments.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Beth A Martin
- University of Wisconsin-Madison School of Pharmacy, 777 Highland Ave, Madison, WI 53705-2222, United States
8
Sando KR, Skoy E, Bradley C, Frenzel J, Kirwin J, Urteaga E. Assessment of SOAP note evaluation tools in colleges and schools of pharmacy. Curr Pharm Teach Learn 2017; 9:576-584. [PMID: 29233430] [DOI: 10.1016/j.cptl.2017.03.010]
Abstract
INTRODUCTION To describe current methods used to assess SOAP notes in colleges and schools of pharmacy. METHODS Members of the American Association of Colleges of Pharmacy Laboratory Instructors Special Interest Group were invited to share their assessment tools for SOAP notes. The content of the submissions was evaluated to characterize the overall qualities of the tools and how they assessed subjective, objective, assessment, and plan information. RESULTS Thirty-nine assessment tools from 25 schools were evaluated. Twenty-nine (74%) of the tools were rubrics and ten (26%) were checklists. All rubrics included analytic scoring elements, while two (7%) mixed holistic and analytic scoring elements. A plurality of the rubrics (35%) used a four-item rating scale. Substantial variability existed in how the tools evaluated the subjective and objective sections. All tools included problem identification in the assessment section; other assessment items included goals (82%) and rationale (69%). Seventy-seven percent assessed drug therapy, but only 33% assessed non-drug therapy; other plan items included education (59%) and follow-up (90%). DISCUSSION AND CONCLUSIONS There is a great deal of variation in the specific elements used to evaluate SOAP notes in colleges and schools of pharmacy. Improved consistency in assessment methods for evaluating SOAP notes may better prepare students to produce standardized documentation when entering practice.
Affiliation(s)
- Karen R Sando
- Department of Pharmacotherapy & Translational Research, University of Florida College of Pharmacy, Gainesville, FL, United States
- Elizabeth Skoy
- Department of Pharmacy Practice, North Dakota State University School of Pharmacy, Fargo, ND, United States
- Courtney Bradley
- Fred Wilson School of Pharmacy, High Point University, High Point, NC, United States
- Jeanne Frenzel
- Department of Pharmacy Practice, North Dakota State University School of Pharmacy, Fargo, ND, United States
- Jennifer Kirwin
- Department of Pharmacy and Health Systems Sciences, School of Pharmacy, Northeastern University, Boston, MA, United States
- Elizabeth Urteaga
- Department of Pharmacy Practice, Feik School of Pharmacy, University of the Incarnate Word, San Antonio, TX, United States
9
Peeters MJ, Hayes LM. A Plea for Psychometric Rigor. Am J Pharm Educ 2017; 81:79. [PMID: 28630520] [PMCID: PMC5468717] [DOI: 10.5688/ajpe81479]
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
- Lisa M Hayes
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
10
Martin M, Salzberg L. Resident characteristics to evaluate during recruitment and interview: a Delphi study. Educ Prim Care 2016; 28:81-85. [PMID: 27966391] [DOI: 10.1080/14739879.2016.1266696]
Abstract
BACKGROUND AND OBJECTIVES The recruitment and interview process for medical residency programmes is a time- and resource-intensive effort, yet there is very little research to guide programmes in evaluating residency candidates. This study represents one step toward identifying candidate characteristics to assess during the recruitment and interview process. METHODS Sixteen expert interviewers from 14 family medicine residency programmes in North Carolina participated in a three-round Delphi study to build consensus around a ranked list of successful resident candidate characteristics. An inter-rater reliability analysis produced average pair-wise agreement and Krippendorff's alpha coefficients. RESULTS Clinical skills, medical knowledge, interpersonal and communication skills, critical thinking, and professional and ethical behaviour were the highest-ranked characteristics. Average pair-wise agreement for rounds two and three was 6.30% and 11.04%, respectively. CONCLUSIONS Residency programmes may benefit from using an empirically studied list of characteristics to evaluate candidate applications and interviews. Future research should include national surveys of expert interviewers from a variety of residency programmes and a longitudinal study correlating interview evaluations based on the ranked list with measures of residency success.
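Average pair-wise agreement, one of the two consensus metrics this abstract reports, is simply the fraction of rater pairs giving identical ratings, averaged over all items. The sketch below is a hypothetical illustration of that calculation; the `pairwise_agreement` helper and the data are not from the study.

```python
from itertools import combinations

def pairwise_agreement(ratings):
    """Fraction of agreeing rater pairs across all rated items.

    ratings: list of rows, one row per item, one column per rater.
    """
    agree = total = 0
    for row in ratings:
        # Compare every pair of raters on this item
        for a, b in combinations(row, 2):
            agree += (a == b)
            total += 1
    return agree / total
```

Multiplying the result by 100 gives the percentage form used in the abstract (e.g., 6.30%); such low values reflect how rarely raters assigned identical ranks.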
Affiliation(s)
- Matthew Martin
- Duke/Southern Regional Area Health Education Centre (AHEC) Family Medicine Residency Programme, Fayetteville, NC, USA
- Lenard Salzberg
- Duke/Southern Regional Area Health Education Centre (AHEC) Family Medicine Residency Programme, Fayetteville, NC, USA
11
Austin JH, Mieure KD, Weber LM, McCarthy BC. National characteristics of residency program interview processes. Am J Health Syst Pharm 2015; 72:1085-6. [DOI: 10.2146/ajhp150052]
Affiliation(s)
- Jennifer H. Austin
- Department of Pharmacy Services, University of Chicago Medicine, Chicago, IL
12
Peeters MJ, Kelly CP, Cor MK. Summative Evaluations When Using an Objective Structured Teaching Exercise. Am J Pharm Educ 2015; 79:60. [PMID: 26089569] [PMCID: PMC4469026] [DOI: 10.5688/ajpe79460]
Affiliation(s)
- M Kenneth Cor
- University of Alberta Faculty of Pharmacy and Pharmaceutical Sciences, Edmonton, Alberta, Canada
13
Peeters MJ, Beltyukova SA, Martin BA. Educational testing and validity of conclusions in the scholarship of teaching and learning. Am J Pharm Educ 2013; 77:186. [PMID: 24249848] [PMCID: PMC3831397] [DOI: 10.5688/ajpe779186]
Abstract
Validity, and its integral evidence of reliability, is fundamental to educational and psychological measurement and to the standards of educational testing. Herein, we describe these standards of educational testing, along with their subtypes, including internal consistency, inter-rater reliability, and inter-rater agreement. Next, the related issues of measurement error and effect size are discussed. The article concludes with a call for future authors to improve the reporting of psychometrics and practical significance in educational testing within the pharmacy education literature. By increasing the scientific rigor of educational research and reporting, the overall quality and meaningfulness of the scholarship of teaching and learning (SoTL) will be improved.
Affiliation(s)
- Michael J. Peeters
- College of Pharmacy and Pharmaceutical Sciences, University of Toledo, Toledo, Ohio
- Beth A. Martin
- School of Pharmacy, University of Wisconsin-Madison, Madison, Wisconsin