1.
Khadka S, Holt K, Peeters MJ. Academic conference posters: Describing visual impression in pharmacy education. Explor Res Clin Soc Pharm 2024;13:100423. PMID: 38420611. PMCID: PMC10899018. DOI: 10.1016/j.rcsop.2024.100423.
Abstract
Background Academic conference posters are a key communication channel before journal articles appear. Attention to visual attributes can enhance academic poster communication. Objective This investigation's purpose was to create an instrument for measuring visual impression, and then to describe and compare visual impression among scientific posters from an academic conference. Methods A mixed-approach rubric was created to quickly measure the visual impression of academic posters. Then, posters from a pharmacy education conference were retrospectively reviewed and scored. Visual impression was compared for traditional versus contemporary poster-formats. Various poster characteristics that might have impacted visual communication (poster-format, summary statement presence, abstract presence, wordiness, QR-code presence, logical sequencing, visuals) were coded. These characteristics were regressed onto visual impression scores. Results Three hundred seventy-eight posters were scored with sound inter-rater reliability. The contemporary poster-format scored significantly higher than the traditional format. Poster-format, abstract absence, lack of wordiness, QR-code presence, logical sequencing, and number of visuals were significant predictors in the regression. Conclusion Posters at one academic conference varied in visual impression. While the contemporary poster-format appeared more helpful, it was not a panacea; variation from poor through exemplary was seen with both poster-formats. Posters are not text-filled articles; displaying a combination of visuals and text clearly and concisely can help make communication with academic posters effective.
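The regression described in the Methods can be illustrated with a toy sketch. For simplicity this uses a single coded characteristic (number of visuals) and fabricated scores, not the study's data, which regressed several characteristics at once.

```python
# Simple linear regression of visual-impression score on one coded
# poster characteristic. Data are fabricated for illustration only.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx  # (slope, intercept)

visuals = [1, 2, 3, 4, 5]           # visuals per poster (toy data)
scores = [2.0, 2.5, 3.1, 3.4, 4.0]  # rubric scores (toy data)
slope, intercept = fit_line(visuals, scores)
print(round(slope, 2), round(intercept, 2))
```

A positive slope here would correspond to the abstract's finding that the number of visuals predicted higher visual impression scores.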
Affiliation(s)
- Sheela Khadka
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH, United States of America
- Katlyn Holt
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH, United States of America
- Michael J Peeters
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH, United States of America
2.
Peeters MJ, D'Amico A, Khadka S, Cleary HM, Singh S. Interacting within an asynchronous online interprofessional education workshop focused on social determinants of health. Curr Pharm Teach Learn 2024;16:196-201. PMID: 38171978. DOI: 10.1016/j.cptl.2023.12.031.
Abstract
BACKGROUND Meaningful interprofessional education (IPE) involves students from at least two professions interacting to learn with, about, and from one another. Our objective was to describe a novel online approach used to create meaningful IPE within a social determinants of health (SDoH) workshop. INTERPROFESSIONAL EDUCATION ACTIVITY This online workshop integrated four professions' perspectives on SDoH (social work, public health, nursing, and pharmacy). Each six-student interprofessional team was assigned a local neighborhood. This week-long workshop had numerous activities (pre- and post-workshop quizzes, an SDoH-primer video, a video self-introduction to teammates, a windshield questionnaire with two subsequent clinical cases, a post-workshop reflection, and a post-workshop evaluation). For discussion, asynchronous video-based responses were used instead of traditional text-based discussion-boards. DISCUSSION Quantitative comparison of quiz scores showed that students' SDoH knowledge increased with this workshop. Qualitatively, from evaluations, most students found this workshop helpful and meaningful. Supporting the use of video-based responses, many students' favorite aspect was interacting and collaborating within their interprofessional teams, although some students desired synchronous activities instead. Faculty facilitators confirmed that meaningful IPE interactions occurred. IMPLICATIONS In short, students from multiple health professions learned SDoH content and, using video-based responses, interacted asynchronously during this online workshop. This report demonstrates one tool available to help facilitate meaningful IPE asynchronously. This asynchronous, online IPE workshop appears to be a promising format to integrate with other in-person IPE sessions.
Affiliation(s)
- Michael J Peeters
- College of Pharmacy and Pharmaceutical Sciences, University of Toledo, Toledo, OH, United States
- Alina D'Amico
- College of Pharmacy and Pharmaceutical Sciences, University of Toledo, Toledo, OH, United States
- Safalta Khadka
- College of Pharmacy and Pharmaceutical Sciences, University of Toledo, Toledo, OH, United States
- Heather M Cleary
- School of Social Justice (Social Work), University of Toledo, Toledo, OH, United States
- Shipra Singh
- College of Health and Human Services (Public Health), University of Toledo, Toledo, OH, United States
3.
Howard MS, Bodi SM, Peeters MJ. Interprofessional education workshop for prescription drug monitoring programs. Med Educ 2023;57:1161-1162. PMID: 37721163. DOI: 10.1111/medu.15204.
4.
Castleberry A, Peeters MJ. Identifying and analyzing Likert scales. Curr Pharm Teach Learn 2023:S1877-1297(23)00209-5. PMID: 37620207. DOI: 10.1016/j.cptl.2023.07.021.
Affiliation(s)
- Ashley Castleberry
- Interprofessional Education, University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Michael J Peeters
- Interprofessional Education, University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
5.
Peeters MJ, Augustine JM. Using Rasch measurement for instrument rating scale refinement. Curr Pharm Teach Learn 2023;15:110-118. PMID: 36898895. DOI: 10.1016/j.cptl.2023.02.015.
Abstract
OUR SITUATION Rasch measurement is an analysis tool that can provide validity evidence for instruments that attempt to measure student learning or other psychosocial behaviors, regardless of whether the instrument is newly created, modified, or previously developed. Rating scales are exceedingly common among psychosocial instruments, and properly functioning rating scales are critical to effective measurement; Rasch measurement can help investigate this. METHODOLOGICAL LITERATURE REVIEW Aside from using Rasch measurement from the beginning to help create rigorous new measurement instruments, researchers can also benefit from applying Rasch measurement to previously developed instruments that did not include it during development. This article focuses on Rasch measurement's unique analysis of rating scales. That is, Rasch measurement can uniquely help examine whether and how an instrument's rating scale functions among newly studied respondents (who will likely differ from the originally researched sample). OUR RECOMMENDATIONS AND THEIR APPLICATION After reviewing this article, the reader should be able to describe Rasch measurement, including how it is focused on fundamental measurement and how it differs from classical test theory and item-response theory, and reflect on situations in their own research where a Rasch measurement analysis might help generate validation evidence with a previously developed instrument. POTENTIAL IMPACT In the end, Rasch measurement offers a helpful, unique, and rigorous approach to further developing instruments that measure accurately and precisely.
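For readers unfamiliar with the model behind such rating-scale diagnostics, a minimal sketch of Andrich's rating scale model (a Rasch model for rating scales) follows. The person measure, item difficulty, and thresholds are invented for illustration, not taken from the article.

```python
# Category probabilities under Andrich's rating scale model.
# All parameter values below are made up for illustration.
import math

def category_probs(theta, item_difficulty, thresholds):
    """P(response = k) for k = 0..len(thresholds)."""
    # Cumulative sums of (theta - difficulty - tau_k), starting at 0.
    logits = [0.0]
    for tau in thresholds:
        logits.append(logits[-1] + (theta - item_difficulty - tau))
    exps = [math.exp(l) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Ordered thresholds; disordered thresholds (tau values not increasing)
# are one common sign that adjacent categories may need collapsing.
probs = category_probs(theta=0.5, item_difficulty=0.0,
                       thresholds=[-1.0, 0.2, 1.1])
print([round(p, 2) for p in probs])
```

Examining how these category probabilities behave across the ability range is the kind of rating-scale functioning check the abstract describes.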
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Jill M Augustine
- Mercer University College of Pharmacy, 3001 Mercer University Drive, Atlanta, GA 30341, United States
6.
Janke KK, Covvey JR, Mospan CM, Smith KJ, Smith MD, Peeters MJ. But Scholarship Can Be Hard in Many Ways. Am J Pharm Educ 2021;85:8620. PMID: 34965918. PMCID: PMC8715962. DOI: 10.5688/ajpe8620.
Affiliation(s)
- Kristin K Janke
- University of Minnesota, College of Pharmacy-Twin Cities, Minneapolis, Minnesota
- Jordan R Covvey
- Duquesne University, Mylan School of Pharmacy, Pittsburgh, Pennsylvania
- Kathryn J Smith
- University of Oklahoma, College of Pharmacy, Oklahoma City, Oklahoma
- Michael J Peeters
- University of Toledo, College of Pharmacy & Pharmaceutical Sciences, Toledo, Ohio
7.
Peeters MJ, Harpe SE. Last Matter: Introducing infographics to Methodology Matters. Curr Pharm Teach Learn 2021;13:1259-1260. PMID: 34521516. DOI: 10.1016/j.cptl.2021.03.027.
Abstract
Visual summaries are gaining momentum in the health sciences literature. The Journal is introducing a new article type, Last Matter (LM). These will consist of infographics that quickly summarize and visually describe topics typically addressed in more detail within Methodology Matters reviews. The primary goal is to provide readers with clear guidance related to one or two common issues, pitfalls, or points of confusion when conducting pharmacy education scholarship. In addition to a graphical summary, a key element of each LM is a list of recommended resources for readers interested in more detailed information. The first Last Matter, published in this issue, summarizes key concepts related to quality in qualitative research. The Journal hopes these infographics will be helpful for readers to comprehend and share, as well as influence future contributions to the pharmacy education literature.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Spencer E Harpe
- Midwestern University College of Pharmacy, 555 31st Street, Downers Grove, IL, United States
8.
Castleberry AN, Peeters MJ. Last matter: How do I build quality into my qualitative study. Curr Pharm Teach Learn 2021;13:1386-1387. PMID: 34521536. DOI: 10.1016/j.cptl.2021.07.021.
Affiliation(s)
- Ashley N Castleberry
- Division Head and Clinical Associate Professor, Pharmacy Practice, University of Texas at Austin College of Pharmacy, United States
- Michael J Peeters
- Clinical Associate Professor, University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
9.
Peeters MJ. A Teachable Moment Matters for: Striving for inter-rater reliability with a patient counseling assessment rubric: Lessons learned. Curr Pharm Teach Learn 2021;13:1073-1077. PMID: 34294250. DOI: 10.1016/j.cptl.2021.03.028.
Abstract
The Teachable Moments Matter (TMM) category of articles is designed to offer readers insight into a methodological issue identified within a companion article. Written in collaboration with one of the companion article's authors, these articles provide an opportunity to focus on a challenge experienced by the authors and, in the process, provide one or more perspectives on how to successfully navigate this issue. The current TMM focuses on issues and pitfalls in validation. The Journal hopes this case-based approach will help highlight the nuance of a topic in context, something that might get "lost" in the entirety of a full-length article.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
10.
Peeters MJ, Zavod RM. Introducing Teachable Moments Matters. Curr Pharm Teach Learn 2021;13:903-904. PMID: 34294252. DOI: 10.1016/j.cptl.2021.03.026.
Abstract
The Teachable Moments Matter (TMM) category of articles is designed to offer readers insight into a methodological issue identified within a companion article. Written in collaboration with one of the companion article's authors, these articles provide an opportunity to focus on a challenge experienced by the authors and, in the process, provide one or more perspectives on how to successfully navigate this issue. The Journal hopes this case-based approach will help to highlight an issue's nuance in context, something that might get "lost" in the entirety of a full-length article.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Robin M Zavod
- Pharmaceutical Sciences, Midwestern University College of Pharmacy Downers Grove, 555 31st Street, Downers Grove, IL, United States
11.
Khadka S, Peeters MJ. Comparing Data Submitted by Public and Private Pharmacy Schools to the Opioid-Related Activities Database. Am J Pharm Educ 2021;85:8328. PMID: 34315703. PMCID: PMC8341228. DOI: 10.5688/ajpe8328.
Abstract
Objective. This investigation compared similarities and differences in education on opioids and opioid abuse between public and private US schools and colleges of pharmacy. Methods. The American Association of Colleges of Pharmacy (AACP) created and maintains an Opioid-Related Activities database for schools and colleges of pharmacy in the United States. Using data from 2019, a mixed-methods design triangulated a quantitative analysis with a concurrent qualitative analysis. After descriptive analysis, the data were compared with national statistics on schools and colleges of pharmacy (ie, number, type of school, and program structure). Opioid activity types from the database (ie, education, service, practice, research, and advocacy) were compared between private and public institutions, both quantitatively and qualitatively. The quantitative analysis used odds ratios (for effect size) and chi-square tests (for statistical significance), while the qualitative analysis employed word clouds to explore descriptors of opioid-related activities. Results. One hundred seven of 144 US schools and colleges of pharmacy (74% response rate) provided their opioid-related activities information to AACP. These institutions (55 private, 52 public) had entered 436 unique opioid-related activities in the AACP database. The quantitative and qualitative analyses together indicated that private institutions focused more on education-type opioid-related activities, while public institutions offered more activities involving research. Within education-type activities, faculty from private institutions often focused narrowly on an education event alone, while faculty from public institutions often focused more broadly on education along with other aspects such as funding, research, and published articles. Conclusion. Overall, private and public US schools and colleges of pharmacy widely engaged in combatting the US opioid epidemic by training student pharmacists in this important area.
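The odds ratio and chi-square comparison described in the Methods can be sketched on a 2x2 table. The counts below are hypothetical, invented for illustration; they are not the study's data.

```python
# Hypothetical 2x2 table: education-type vs other opioid-related
# activities at private vs public institutions (illustrative only).
table = [[60, 40],   # private: education, other
         [35, 55]]   # public:  education, other

(a, b), (c, d) = table

# Odds ratio as an effect size for the 2x2 table.
odds_ratio = (a * d) / (b * c)

# Pearson chi-square statistic from observed vs expected counts.
n = a + b + c + d
row1, row2 = a + b, c + d
col1, col2 = a + c, b + d
expected = [[row1 * col1 / n, row1 * col2 / n],
            [row2 * col1 / n, row2 * col2 / n]]
chi2 = sum((obs - exp) ** 2 / exp
           for row, erow in zip(table, expected)
           for obs, exp in zip(row, erow))

print(round(odds_ratio, 2), round(chi2, 2))
```

In practice a library routine (e.g., a chi-square contingency test) would also supply the p-value; the point here is only the pairing of an effect size with a significance test.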
Affiliation(s)
- Safalta Khadka
- University of Toledo, College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
- Michael J Peeters
- University of Toledo, College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
12.
Peeters MJ, Cor MK, Petite SE, Schroeder MN. Validation Evidence using Generalizability Theory for an Objective Structured Clinical Examination. Innov Pharm 2021;12:10.24926/iip.v12i1.2110. PMID: 34007675. PMCID: PMC8102968. DOI: 10.24926/iip.v12i1.2110.
Abstract
OBJECTIVES Performance-based assessments, including objective structured clinical examinations (OSCEs), are essential learning assessments within pharmacy education. Because important educational decisions can follow from performance-based assessment results, pharmacy colleges/schools should demonstrate acceptable rigor in validating their learning assessments. Though G-Theory has rarely been reported in pharmacy education, it would behoove pharmacy educators to use G-Theory to produce evidence demonstrating reliability as part of their OSCE validation process. This investigation demonstrates the use of G-Theory to describe reliability for an OSCE, as well as methods for enhancing the OSCE's reliability. INNOVATION To evaluate practice-readiness in the semester before final-year rotations, third-year PharmD students took an OSCE. This OSCE included 14 stations over three weeks. Each week had four or five stations; one or two stations were scored by faculty-raters, while three stations required students' written responses. All stations were scored on a 1-4 scale. For G-Theory analyses, we used G_String and then mGENOVA. CRITICAL ANALYSIS Ninety-seven students completed the OSCE; stations were scored independently. First, a univariate G-Theory design of students crossed with stations nested in weeks (p × s:w) was used. The total-score g-coefficient (reliability) for this OSCE was 0.72. Variance components for test parameters were identified. Of note, students accounted for only some of the OSCE score variation. Second, a multivariate G-Theory design of students crossed with stations (p• × s°) was used. This further analysis revealed which weeks were weakest for the reliability of test-scores from this learning assessment. Moreover, decision-studies showed how reliability could change depending on the number of stations each week. For a g-coefficient >0.80, seven stations per week were needed. Additionally, targets for improvements were identified. IMPLICATIONS In test validation, evidence of reliability is vital for the inference of generalization; G-Theory provided this for our OSCE. Results indicated that the reliability of scores was mediocre and could be improved with more stations. Revision of problematic stations could help reliability as well. Within this need for more stations, one practical insight was to administer those stations over multiple weeks/occasions (instead of all stations on one occasion).
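The decision-study logic in this abstract can be sketched numerically. The variance components below are assumed values for illustration (the article's estimates are not reproduced here); the formula is the standard relative g-coefficient for a one-facet person-by-station design.

```python
# G-Theory decision study sketch: how the g-coefficient changes with
# the number of OSCE stations. Variance components are assumed, not
# those estimated in the article.
def g_coefficient(var_person, var_residual, n_stations):
    # Relative g-coefficient: person variance over person variance
    # plus station-related error averaged across stations.
    return var_person / (var_person + var_residual / n_stations)

var_person, var_residual = 0.20, 0.35  # assumed variance components

for n in (4, 5, 7, 10):
    print(n, round(g_coefficient(var_person, var_residual, n), 2))
```

With these assumed components, seven stations reach 0.80, echoing the abstract's finding that seven stations per week were needed; that agreement is by construction of the example, not a re-analysis.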
Affiliation(s)
- Michael J. Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, Toledo, OH
- M. Kenneth Cor
- University of Alberta Faculty of Pharmacy & Pharmaceutical Sciences, Edmonton, AB
- Sarah E. Petite
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, Toledo, OH
13.
Peeters MJ, Cor MK, Maki ED. Providing Validation Evidence for a Clinical-Science Module: Improving Testing Reliability with Quizzes. Innov Pharm 2021;12:10.24926/iip.v12i1.2235. PMID: 34007668. PMCID: PMC8102960. DOI: 10.24926/iip.v12i1.2235.
Abstract
DESCRIPTION OF THE PROBLEM High-stakes decision-making should rest on sound validation evidence; reliability is vital to this. A short exam may not be very reliable on its own within a didactic course, so supplementing it with quizzes might help. But by how much? This study's objective was to understand how much reliability (for overall module-grades) could be gained by adding quiz data to traditional exam data in a clinical-science module. THE INNOVATION In didactic coursework, quizzes are a common instructional strategy; however, individual contexts/instructors can vary in using quizzes formatively and/or summatively. Second-year PharmD students took a clinical-science course, wherein a 5-week module focused on cardiovascular therapeutics. Generalizability Theory (G-Theory) combined seven quizzes and the subsequent exam into one module-level reliability, based on a model where students were crossed with items nested in eight fixed testing occasions (using mGENOVA). Furthermore, G-Theory decision-studies were planned to illustrate changes in module-grade reliability as the number of quiz-items and the relative weighting of quizzes were altered. CRITICAL ANALYSIS One hundred students took seven quizzes and one exam. Individually, the exam had 32 multiple-choice questions (MCQs; KR-20 reliability = 0.67), while the quizzes had a total of 50 MCQs (5-9 MCQs each), with most individual quiz KR-20s at or below 0.54. After combining the quizzes and exam using G-Theory, the estimated reliability of module-grades was 0.73, improved from the exam alone. Doubling the quiz-weight from the syllabus' 18% quizzes and 82% exam increased the composite-reliability of module-grades to 0.77. A reliability of 0.80 was achieved with equal weighting of quizzes and exam. NEXT STEPS As expected, more items led to higher reliability. However, using quizzes predominantly formatively had little impact on reliability, while using quizzes more summatively (i.e., increasing their relative weight in the module-grade) improved reliability further. Thus, depending on use, quizzes can add to a course's rigor.
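A rough classical-test-theory check on "more items, higher reliability" is the Spearman-Brown prophecy formula. This is not the G-Theory composite the article computed; it assumes the added quiz items are parallel to the exam items and fully weighted, which is why it overshoots the reported 0.73.

```python
# Spearman-Brown projection: reliability if a test is lengthened by
# factor k with parallel items (a classical approximation, not the
# article's G-Theory analysis).
def spearman_brown(r, k):
    return k * r / (1 + (k - 1) * r)

r_exam = 0.67        # KR-20 of the 32-item exam (from the abstract)
k = (32 + 50) / 32   # lengthening factor if all 50 quiz items counted fully
projected = spearman_brown(r_exam, k)
print(round(projected, 2))
```

The gap between this idealized projection and the G-Theory composite illustrates why weighting (18% quizzes vs 82% exam) and non-parallel items matter.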
Affiliation(s)
- M. Kenneth Cor
- University of Alberta Faculty of Pharmacy & Pharmaceutical Sciences
14.
Abstract
OBJECTIVE There is a paucity of validation evidence for assessing clinical case-presentations by Doctor of Pharmacy (PharmD) students. Within Kane's Framework for Validation, evidence for the inferences of scoring and generalization should be generated first. Thus, our objectives were to characterize and improve scoring, and to build initial generalization evidence, in order to provide validation evidence for performance-based assessment of clinical case-presentations. DESIGN Third-year PharmD students worked up patient-cases from a local hospital. Students orally presented and defended their therapeutic care-plan to pharmacist preceptors (evaluators) and fellow students. Evaluators scored each presentation using an 11-item instrument with a 6-point rating-scale. In addition, evaluators scored a global-item with a 4-point rating-scale. Rasch Measurement was used for the scoring analysis, while Generalizability Theory was used for the generalization analysis. FINDINGS Thirty students each presented five cases that were evaluated by 15 preceptors using the 11-item instrument. Using Rasch Measurement, the 11-item instrument's 6-point rating-scale did not function; it functioned only once collapsed to a 4-point rating-scale. This revised 11-item instrument also showed redundancy. Alternatively, the global-item performed reasonably on its own. Using multivariate Generalizability Theory, the g-coefficient (reliability) for the series of five case-presentations was 0.76 with the 11-item instrument and 0.78 with the global-item. Reliability was largely dependent on multiple case-presentations and, to a lesser extent, the number of evaluators per case-presentation. CONCLUSIONS Our pilot results confirm that scoring should be simple (scale and instrument). More specifically, the longer 11-item instrument measured but had redundancy, whereas the single global-item provided measurement over multiple case-presentations. Further, acceptable reliability can be balanced between more/fewer case-presentations and more/fewer evaluators.
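The trade-off between more case-presentations and more evaluators can be sketched with a two-facet decision study. The design and variance components below are invented for illustration (a persons-by-raters-nested-in-cases layout); they are not the article's estimates.

```python
# Two-facet G-Theory decision study sketch for case-presentations:
# persons crossed with raters nested in cases, p x (r:c). Variance
# components are invented for illustration.
def g_coef(var_p, var_pc, var_residual, n_cases, n_raters):
    # Case-related error shrinks with more cases; rater-related error
    # shrinks with more raters per case and with more cases.
    return var_p / (var_p
                    + var_pc / n_cases
                    + var_residual / (n_cases * n_raters))

var_p, var_pc, var_residual = 0.30, 0.25, 0.20  # assumed components

for n_cases, n_raters in ((5, 1), (5, 2), (3, 3)):
    print(n_cases, n_raters,
          round(g_coef(var_p, var_pc, var_residual, n_cases, n_raters), 2))
```

Under these assumed components, five cases with one rater each outperforms three cases with three raters each, mirroring the abstract's point that reliability depended mostly on the number of case-presentations.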
15.
Abstract
BACKGROUND When available, empirical evidence should help guide decision-making. Following each administration of a learning assessment, data become available for analysis. For learning assessments, Kane's Framework for Validation can helpfully categorize evidence by inference (i.e., scoring, generalization, extrapolation, implications). Especially for test-scores used within a high-stakes setting, generalization evidence is critical. While reporting Cronbach's alpha, inter-rater reliability, and other reliability coefficients for a single source of measurement error is somewhat common in pharmacy education, dealing with multiple concurrent sources of measurement error within complex learning assessments is not. Performance-based assessments (e.g., OSCEs) that use raters are inherently complex learning assessments. PRIMER Generalizability Theory (G-Theory) can account for multiple sources of measurement error. G-Theory is a powerful tool that can provide a composite reliability (i.e., generalization evidence) for more complex learning assessments, including performance-based assessments. It can also help educators explore ways to make a learning assessment more rigorous if needed, as well as suggest ways to better allocate resources (e.g., staffing, space, fiscal). A brief review of G-Theory, focused on pharmacy education, is provided herein. MOVING FORWARD G-Theory has been common and useful in medical education, though it has rarely been used in pharmacy education. Given the similarities in assessment methods among health-professions, G-Theory should prove helpful in pharmacy education as well. Within this Journal and accompanying this Idea Paper, there are multiple reports that demonstrate the use of G-Theory in pharmacy education.
16.
Peeters MJ, Cor MK, Boddu SHS, Nesamony J. Validation Evidence from using Generalizability Theory in a Basic-Science Course: Reliability of Course-Grades from Multiple Examinations. Innov Pharm 2021;12:10.24926/iip.v12i1.2925. PMID: 34007682. PMCID: PMC8102975. DOI: 10.24926/iip.v12i1.2925.
Abstract
DESCRIPTION OF THE PROBLEM Reliability is critical validation evidence on which to base high-stakes decision-making. Often, one exam in a didactic course may not be acceptably reliable on its own. But how much might multiple exams add when combined? THE INNOVATION To improve validation evidence for high-stakes decision-making, Generalizability Theory (G-Theory) can combine reliabilities from multiple exams into one composite-reliability (G_String IV software). Further, G-Theory decision-studies can illustrate changes in course-grade reliability depending on the number of exams and exam-items. CRITICAL ANALYSIS 101 first-year PharmD students took two midterm-exams and one final-exam in a pharmaceutics course. Individually, Exam1 had 50 MCQs (KR-20 = 0.69), Exam2 had 43 MCQs (KR-20 = 0.65), and Exam3 had 67 MCQs (KR-20 = 0.67). After combining exam occasions using G-Theory, the composite-reliability was 0.71 for overall course-grades, better than any exam alone. Remarkably, with more exam occasions, fewer items per exam (and fewer items across all exams) were needed to obtain an acceptable composite-reliability. Acceptable reliability could be achieved with different combinations of the number of MCQs on each exam and the number of exam occasions. IMPLICATIONS G-Theory provided reliability evidence critical for validation in high-stakes decision-making. Final course-grades appeared quite reliable after combining multiple course exams, though this reliability could and should be improved. Notably, more exam occasions allowed fewer items per exam and fewer items over all exams. Thus, one added benefit of more exam occasions for educators is developing fewer items per exam and fewer items over all the exams.
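One classical way to see how combining exams raises reliability is a stratified-alpha-style composite. This is an approximation, not the G-Theory analysis the article performed; the score variances below are assumed (chosen so the result lands near the reported 0.71), while the KR-20s come from the abstract.

```python
# Stratified-alpha-style composite reliability: one minus the summed
# error variance of the parts over the variance of the composite.
# Score variances are assumed for illustration; this classical
# approximation is not the article's G-Theory analysis.
def composite_reliability(variances, reliabilities, var_composite):
    error = sum(v * (1 - r) for v, r in zip(variances, reliabilities))
    return 1 - error / var_composite

variances = [16.0, 14.0, 20.0]       # assumed score variances, Exams 1-3
reliabilities = [0.69, 0.65, 0.67]   # KR-20s reported in the abstract
var_composite = 57.0                 # assumed variance of summed scores

result = composite_reliability(variances, reliabilities, var_composite)
print(round(result, 2))
```

The composite exceeds each single exam's KR-20 because errors across occasions partly average out, which is the abstract's central point.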
Affiliation(s)
- Michael J. Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, Toledo, OH
- M. Kenneth Cor
- University of Alberta Faculty of Pharmacy & Pharmaceutical Sciences, Edmonton, AB
- Sai HS Boddu
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, Toledo, OH
- Department of Pharmaceutical Sciences, College of Pharmacy and Health Sciences, Ajman University, Ajman, United Arab Emirates
- Jerry Nesamony
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, Toledo, OH
17.
Peeters MJ, Schmude KA. Learning Assessment vs Program Evaluation. Am J Pharm Educ 2020;84:ajpe7938. PMID: 33012800. PMCID: PMC7523661. DOI: 10.5688/ajpe7938.
Affiliation(s)
- Michael J Peeters
- University of Toledo, College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
- Kimberly A Schmude
- University of Toledo, College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
18
|
Peeters MJ, Harpe SE. Updating conceptions of validity and reliability. Res Social Adm Pharm 2020; 16:1127-1130. [PMID: 31806566] [DOI: 10.1016/j.sapharm.2019.11.017]
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, MS1013, Toledo, OH, 43614, United States
- Spencer E Harpe
- Midwestern University Chicago College of Pharmacy, Department of Pharmacy Practice, 555 31st Street, Downers Grove, IL, 60515, United States

19
Peeters MJ, Cor MK. Guidance for high-stakes testing within pharmacy educational assessment. Curr Pharm Teach Learn 2020; 12:1-4. [PMID: 31843158] [DOI: 10.1016/j.cptl.2019.10.001]
Abstract
BACKGROUND The term "high-stakes testing" is widely used among pharmacy educators, but it often seems misused or used incompletely. This Teachable Moments Matter (TMM) focuses on the importance of scientific rigor when assessing learners' abilities. This article discusses high-stakes testing: what it is and what it is not. This TMM is not meant as an extensive review of the topic. IMPACT Because it is imperative for ethically fair high-stakes testing, we focus on defining and explaining high-stakes testing, including: evidence for validation, development of cut-scores, magnitudes of reliability coefficients, and other reliability measurement tools such as Generalizability Theory and Item-Response Theory. TEACHABLE MOMENT From our perspectives as educational psychometricians, we hope that this discussion will help foster scientifically rigorous use and reporting of high-stakes testing in pharmacy education and research.
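Among the measurement tools this abstract names is Item-Response Theory. As a purely illustrative sketch (not the authors' analysis), the simplest member of that family, the one-parameter Rasch model, reduces to a one-line logistic:

```python
import math

def rasch_probability(theta, difficulty):
    """Rasch (one-parameter IRT) model: probability that a person with
    ability theta answers an item of the given difficulty correctly.
    Both parameters live on the same logit scale."""
    return 1 / (1 + math.exp(-(theta - difficulty)))
```

When ability equals item difficulty, the probability of success is exactly 0.5; harder items (higher difficulty) shift the curve right.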
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- M Ken Cor
- Faculty of Pharmacy and Pharmaceutical Sciences, 3-209 Edmonton Clinic Health Academy, University of Alberta, 11405-87 Ave, Edmonton T6G1C9, Alberta, Canada

20
Abstract
Measurement validity is important when conducting research. This is as true for sociobehavioral research as for clinical research. Although the importance of validity is not new, its conceptualization has changed substantially in the past few decades. In the literature, there is a lack of consistency in how validity is presented. This may stem from a lack of awareness of the relatively recent changes in the conceptualization of validity, and from the continued use of a historical framework in some educational texts and/or training programs. This article presents a brief history of the conceptualization of validity, including the progression from related concepts of reliability and validity, to multiple types of validity, to a view of validity as a unitary concept supported by different types of evidence. It closes by raising some important considerations about promoting use of a contemporary validity framework and associated terminology in current research, as well as in the education of future health-sciences researchers.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, MS1013, Toledo, OH, 43614, United States
- Spencer E Harpe
- Midwestern University Chicago College of Pharmacy, Department of Pharmacy Practice, 555 31st Street, Downers Grove, IL, 60515, United States

21
Peeters MJ. Evidence for validity in assessing the affective domain? Curr Pharm Teach Learn 2019; 11:961-962. [PMID: 31570136] [DOI: 10.1016/j.cptl.2019.05.002]
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States

22
Baki G, Peeters MJ. Exploring the Impact of Technology Use with Cosmetic Science Guest-Speakers: A Qualitative Study. Innov Pharm 2019; 10. [PMID: 34007548] [PMCID: PMC7592864] [DOI: 10.24926/iip.v10i2.1604]
Abstract
Improvements in current classroom technology such as video conferencing have allowed geographically-distant guest-speakers to participate in teaching. However, is the time and effort that faculty spend coordinating guest-speakers helpful for their students' learning? Relevance is key to motivation and learning, and therefore professionals who can share industry applications and their experiences should help promote relevance. During the core application-based cosmetic science coursework in an undergraduate cosmetic science and formulation design degree at the University of Toledo, multiple industry experts serve as guest-speakers; the majority join remotely via a real-time video conferencing tool. The purpose of this study was to explore the impact of guest-speakers on these students' learning, and to understand how guest-speakers might also value these experiences. Twenty-two students and sixteen guest-speakers participated. Using a qualitative approach, the authors conducted an inductive thematic analysis of transcripts from student focus-groups and guest-speaker interviews. Twenty-one codes were identified, and five themes were constructed for each of the student and guest-speaker groups. Themes from both groups were integrated and distilled into an essence related to teaching and learning. Our results indicated that students greatly appreciated the relevance guest-speakers added to their introductory/foundational instruction from faculty. From guest-speakers' perspectives, teaching students was formative towards developing informed future coworkers for the cosmetic industry. Technology enabled much of this. Overall, we believe that professional, experienced guest-speakers can make an impact on students. We hope that other higher education institutions might consider technology to foster use of guest-speakers within their programs.
Affiliation(s)
- Gabriella Baki
- University of Toledo College of Pharmacy and Pharmaceutical Sciences
- Michael J Peeters
- University of Toledo College of Pharmacy and Pharmaceutical Sciences

23
Peeters MJ, Beltyukova SA. Using Rasch Measurement to Revisit Survey Checklist Items. Acad Med 2018; 93:1603. [PMID: 30376523] [DOI: 10.1097/acm.0000000000002421]
Affiliation(s)
- Michael J Peeters
- Clinical senior lecturer, University of Toledo, Toledo, Ohio; Professor, Judith Herb College of Education, University of Toledo, Toledo, Ohio

24
Reale MC, Riche DM, Witt BA, Baker WL, Peeters MJ. Development of critical thinking in health professions education: A meta-analysis of longitudinal studies. Curr Pharm Teach Learn 2018; 10:826-833. [PMID: 30236420] [DOI: 10.1016/j.cptl.2018.04.019]
Abstract
INTRODUCTION While reports of critical thinking exist in the health professions literature, development of critical thinking across a broad range of health-professions students has not been systematically reviewed. METHODS In this meta-analysis, multiple databases and journals were searched through February 2016 to identify longitudinal studies using standardized tests of critical thinking [California Critical Thinking Skills Test (CCTST), Health Science Reasoning Test (HSRT), and Defining Issues Test (DIT)] in any language. Two reviewers independently extracted information regarding primary author, publishing journal, health profession, critical thinking test, and time1/time2 means and standard deviations. Standardized mean differences (SMD) with 95% confidence intervals (CI) were reported using a random-effects model. RESULTS Four hundred sixty-two studies were screened, and 79 studies (representing 6884 students) were included; these comprised 37 CCTST, 22 DIT, and 20 HSRT studies. Health professions comprised nursing, pharmacy, physical therapy, occupational therapy, dentistry, medicine, veterinary medicine, dental hygiene, clinical laboratory sciences, and allied health. Cohen's kappa was strong (0.82) for inter-reviewer agreement. Both the CCTST (SMD = 0.37, 95% CI = 0.23 to 0.52) and DIT (SMD = 0.28, 95% CI = 0.18 to 0.39) demonstrated significant increases in total scores, but the HSRT (SMD = 0.03, 95% CI = -0.05 to 0.12) did not show improvement. DISCUSSION/CONCLUSIONS In this meta-analysis, students from the majority of health professions consistently showed improvement in development of critical thinking. In this diverse population, only the CCTST and DIT appeared responsive to change.
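The pooled SMDs with 95% CIs reported here follow the standard random-effects computation. A minimal DerSimonian-Laird sketch, taking hypothetical per-study effect sizes and variances rather than the authors' data:

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of standardized mean
    differences. Returns (pooled SMD, 95% CI tuple, I^2 percentage)."""
    w = [1 / v for v in variances]           # inverse-variance weights
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    # Cochran's Q and between-study variance tau^2
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with tau^2 added to each study's variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * ei for wi, ei in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2
```

With homogeneous inputs, tau-squared collapses to zero and the result matches fixed-effect inverse-variance pooling; heterogeneous inputs widen the CI and raise I-squared.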
Affiliation(s)
- Daniel M Riche
- University of Mississippi School of Pharmacy, Jackson, MS, United States
- Benjamin A Witt
- The University of Utah Hospitals and Clinics, Salt Lake City, UT, United States
- William L Baker
- University of Connecticut School of Pharmacy, Storrs, CT, United States
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, Toledo, OH, United States

25
Peeters MJ, Sexton M, Metz AE, Hasbrouck CS. A team-based interprofessional education course for first-year health professions students. Curr Pharm Teach Learn 2017; 9:1099-1110. [PMID: 29233378] [DOI: 10.1016/j.cptl.2017.07.006]
Abstract
BACKGROUND AND PURPOSE Interprofessional education (IPE) is required within pharmacy education, and should include classroom-based education along with experiential interprofessional collaboration. For classroom-based education, small-group learning environments may create a better platform for engaging students in the essential domain of interprofessional collaboration towards meaningful learning within IPE sub-domains (interprofessional communication, teams and teamwork, roles and responsibilities, and values and ethics). Faculty envisioned creating a small-group learning environment that was inviting, interactive, and flexible using situated learning theory. This report describes an introductory, team-based, IPE course for first-year health-professions students; it used small-group methods for health-professions students' learning of interprofessional collaboration. EDUCATIONAL ACTIVITY AND SETTING The University of Toledo implemented a 14-week required course involving 554 first-year health-sciences students from eight professions. The course focused on the Interprofessional Education Collaborative's (IPEC) Core Competencies for Interprofessional Collaboration. Students were placed within interprofessional teams of 11-12 students each and engaged in simulations, standardized-patient interviews, case-based communications exercises, vital signs training, and patient safety rotations. Outcomes measured were students' self-ratings of attaining learning objectives, perceptions of other professions (from word cloud), and satisfaction through end-of-course evaluations. FINDINGS This introductory, team-based IPE course with 554 students improved students' self-assessed competency in learning objectives (p < 0.01, Cohen's d = 0.9), changed students' perceptions of other professions (via word clouds), and met students' satisfaction through course evaluations. 
DISCUSSION AND SUMMARY Through triangulation of our various assessment methods, we considered this course offering a success. This interprofessional, team-based, small-group strategy to teaching and learning IPE appeared helpful within this interactive, classroom-based course.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Martha Sexton
- University of Toledo College of Nursing, 3000 Arlington Ave, MS 1026, Toledo, OH 43614, United States
- Alexia E Metz
- Occupational Therapy Doctoral Program, University of Toledo, 2801 W. Bancroft, MS 119, Toledo, OH 43606, United States
- Carol S Hasbrouck
- University of Toledo, Interprofessional Immersive Simulation Center, 1214B Center for Creative Education Building, 3000 Arlington Ave, MS 1030, Toledo, OH 43614, United States

26
Abstract
Formative assessment is critical for deliberate improvement, development, and growth. While not entirely synonymous, assessment for learning (AFL) is an approach using formative assessment specifically to improve students' learning. While using formative assessments, AFL can also have summative programmatic-assessment implications. For each learning assessment, summative and formative uses can be leveraged; an assessment can scaffold learning (formative), foster students' growth (formative), and document students' development in a competency/standard (summative). For example, using a developmental portfolio with iterative reflective-writings (formative), PharmD students showed qualitative development in the "professionalism" competency (summative; ACPE Standard 4.4). (In parallel, this development in professionalism was confirmed quantitatively.) An AFL approach can complement other assessments; it can be integrated with other summative assessments into a multi-method assessment program, wherein developmental portfolio sections could be used for a few specific competencies. While AFL is not a one-size-fits-all silver bullet for programmatic assessment, it is one notably robust tool to employ.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio

27
Haines SL, Summa MA, Peeters MJ, Dy-Boarman EA, Boyle JA, Clifford KM, Willson MN. Toolkit for US colleges/schools of pharmacy to prepare learners for careers in academia. Curr Pharm Teach Learn 2017; 9:750-762. [PMID: 29233301] [DOI: 10.1016/j.cptl.2017.05.002]
Abstract
INTRODUCTION The objective of this article is to provide an academic toolkit for use by colleges/schools of pharmacy to prepare student pharmacists/residents for academic careers. METHODS Through the American Association of Colleges of Pharmacy (AACP) Section of Pharmacy Practice, the Student Resident Engagement Task Force (SRETF) collated teaching materials used by colleges/schools of pharmacy from a previously reported national survey. The SRETF developed a toolkit for student pharmacists/residents interested in academic pharmacy. RESULTS Eighteen institutions provided materials: five provided materials describing didactic coursework; over fifteen provided materials for academia-focused Advanced Pharmacy Practice Experiences (APPEs); and one provided materials for an APPE teaching-research elective. SRETF members created a syllabus template and sample lesson plan by integrating submitted resources. Submissions still needed to complete the toolkit include examples of curricular tracks and certificate programs. DISCUSSION AND CONCLUSIONS Faculty vacancies still exist in pharmacy education. Engaging student pharmacists/residents with the academic pillars of teaching, scholarship, and service is critical for the future success of the academy.
Affiliation(s)
- Seena L Haines
- Department of Pharmacy Practice, University of Mississippi School of Pharmacy, 2500 North State Street, Jackson, MS 39216, United States
- Maria A Summa
- Pharmacy Practice and Administration, University of Saint Joseph, Hartford, CT, United States
- Jaclyn A Boyle
- Community Pharmacy Innovation, Northeast Ohio Medical University, Rootstown, OH, United States
- Kalin M Clifford
- Texas Tech University Health Sciences Center, Dallas, TX, United States
- Megan N Willson
- Washington State University College of Pharmacy, Spokane, WA, United States

28
Peeters MJ, Martin BA. Validation of learning assessments: A primer. Curr Pharm Teach Learn 2017; 9:925-933. [PMID: 29233326] [DOI: 10.1016/j.cptl.2017.06.001]
Abstract
The Accreditation Council for Pharmacy Education's Standards 2016 has placed greater emphasis on validating educational assessments. In this paper, we describe validity, reliability, and validation principles, drawing attention to the conceptual change that highlights one validity with multiple evidence sources; to this end, we recommend abandoning historical (confusing) terminology associated with the term validity. Further, we describe and apply Kane's framework (scoring, generalization, extrapolation, and implications) for the process of validation, with its inferences and conclusions from varied uses of assessment instruments by different colleges and schools of pharmacy. We then offer five practical recommendations that can improve reporting of validation evidence in pharmacy education literature. We describe application of these recommendations, including examples of validation evidence in the context of pharmacy education. After reading this article, the reader should be able to understand the current concept of validation, and use a framework as they validate and communicate their own institution's learning assessments.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, 3000 Arlington Ave, Mail Stop 1013, Toledo, OH 43614, United States
- Beth A Martin
- University of Wisconsin-Madison School of Pharmacy, 777 Highland Ave, Madison, WI 53705-2222, United States

29
Peeters MJ, Hayes LM. A Plea for Psychometric Rigor. Am J Pharm Educ 2017; 81:79. [PMID: 28630520] [PMCID: PMC5468717] [DOI: 10.5688/ajpe81479]
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
- Lisa M Hayes
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio

30
Peeters MJ, Garavalia LS. Teachable Moments Matter for: An analysis of the use of Pharmacy Curriculum Outcomes Assessment (PCOA) scores within one professional program. Curr Pharm Teach Learn 2017; 9:175-177. [PMID: 29233400] [DOI: 10.1016/j.cptl.2016.12.001]
Abstract
Editor's Note: The Teachable Moments Matter category of articles is designed to offer readers insight into a methodological issue identified within a companion article. Written in collaboration with one of the companion article's authors, these articles provide an opportunity to focus on a challenge experienced by the authors and, in the process, provide one or more perspectives on how to successfully navigate this issue. Notably, this "issue" is not necessarily a problem (as this first paper in the series demonstrates). The Journal hopes this case-based approach will help highlight an issue's nuance in context, something that might get "lost" in the entirety of a full-length article. This article discusses the importance of communicating a conceptual framework (i.e., theory) as a basis for scholarly articles; a specific example in the companion article is its use of validity theory. In our community of researchers, we need to better communicate conceptual frameworks as a basis to allow others to build on and grow our knowledge in pharmacy education.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy & Pharmaceutical Sciences, Toledo, OH 43614
- Linda S Garavalia
- Western University of Health Sciences College of Pharmacy, Pomona, CA 91766-1854

31
Haines SL, Dy-Boarman EA, Clifford KM, Summa MA, Willson MN, Boyle JA, Peeters MJ. Methods Used by Colleges and Schools of Pharmacy to Prepare Student Pharmacists for Careers in Academia. Am J Pharm Educ 2017; 81:6. [PMID: 28289296] [PMCID: PMC5339592] [DOI: 10.5688/ajpe8116]
Abstract
Objective. To identify the methods used by US colleges and schools of pharmacy to prepare student pharmacists for academic careers. Methods. An 18-item survey instrument was developed and distributed to US colleges and schools of pharmacy. Representatives were asked about faculty responsibilities, experiences in academia currently offered to student pharmacists, and representatives' perception of their student pharmacists' preparedness for careers in academia, including barriers in current programming. Results. Representatives from 96 colleges/schools responded. The vast majority (96%) provided academia-focused advanced pharmacy practice experiences (APPEs), 40% provided didactic coursework in academia, 28% offered a longitudinal research track, and 42% offered academia-focused independent studies. Teaching methods and creating learning objectives were the most common pedagogical content, while assessment activities were diverse. Time was the most prevalent barrier to providing training for academic careers; however, degree of student pharmacist interest, faculty inexperience, and lack of leadership support were also commonly reported. Conclusions. Colleges and schools of pharmacy vary in the extent to which they prepare student pharmacists for careers in academia. Advanced pharmacy practice experiences were the most common method of training offered. Standardization of training for academia may better promote this career path to student pharmacists.
32
Baki G, Borden MJ, Peeters MJ. Introducing an Undergraduate Degree of Cosmetic Science and Formulation Design within a College of Pharmacy. Innov Pharm 2017. [DOI: 10.24926/iip.v8i1.489]
Abstract
As a unique and versatile undergraduate degree program, a Bachelor of Science in Pharmaceutical Sciences (BSPS) is offered by a number of colleges/schools of pharmacy. These provide a bachelor's degree concentrated in pharmaceutical sciences, and can be a non-Doctor of Pharmacy option, possibly before progressing to graduate degree studies. Recently implemented at the University of Toledo College of Pharmacy and Pharmaceutical Sciences (UTCPPS), one such BSPS major is Cosmetic Science and Formulation Design. This new undergraduate major was created to serve the needs of the cosmetic and personal care industry, which has a great need for well-trained new professionals with basic knowledge in the sciences and business. This Cosmetic Science and Formulation Design major was added to four other BSPS majors at UTCPPS. Introduced in 2013, this major is the only functioning undergraduate degree in Cosmetic Science and Formulation Design in the United States. Preliminary job placement data provide promising evidence that this undergraduate major has helped graduates launch careers in the cosmetic and personal care, or pharmaceutical, industries. Based on our experience from the past three years, we believe that this cosmetic science major has been worth its resource investment. We hope others designing new undergraduate pharmaceutical sciences programs might integrate advice from this experience into their forthcoming programs.
Type: Idea Paper
33
Peeters MJ, Vaidya VA. A Mixed-Methods Analysis in Assessing Students' Professional Development by Applying an Assessment for Learning Approach. Am J Pharm Educ 2016; 80:77. [PMID: 27402980] [PMCID: PMC4937972] [DOI: 10.5688/ajpe80577]
Abstract
Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT)-an assessment of pharmacists' professional development. Results. Qualitatively, students' development reflections described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% had a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements of students and also showed further substantial development among students thereafter. Conclusion. Applying an assessment for learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH
- Varun A Vaidya
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH

34
Peeters MJ, Boddu SHS. Assessing development in critical thinking: One institution's experience. Curr Pharm Teach Learn 2016; 8:271-278. [PMID: 30070235] [DOI: 10.1016/j.cptl.2016.02.007]
Abstract
OBJECTIVE Enhancing critical and moral thinking are goals of higher education. We sought to examine thinking development within a Doctor of Pharmacy (Pharm.D.) program. METHODS The California Critical Thinking Skills Test (CCTST), Health Sciences Reasoning Test (HSRT), and the Defining Issues Test (DIT2) were administered to Pharm.D. students over four sessions throughout their didactic studies: P1 Fall, P1 Spring, P2 Spring, and P3 Spring. While the CCTST and HSRT are similar measures of foundational critical thinking, the DIT2 assesses complex moral thinking. Each thinking test was correlated with academic success by undergraduate and graduate grade-point averages (GPAs). RESULTS The CCTST was administered in P1 Fall (20.1 ± 5.0). For the HSRT, means ± SD were P1 Spring: 22.7 ± 3.5, P2 Spring: 22.6 ± 4.8, and P3 Spring: 23.8 ± 4.5. After converting P1-CCTST and P2-HSRT scores using user-manual interpretations, there was no difference on paired comparison (P = 0.22, 0.1 Cohen's d). There was a small difference between P1-HSRT and P3-HSRT (P < 0.01, 0.2 Cohen's d). Also administered each time, the DIT2 was P1 Fall: 40.4 ± 12.6, P1 Spring: 36.3 ± 13.7, P2 Spring: 44.9 ± 13.6, and P3 Spring: 43.4 ± 15.4. For the DIT2, both P1 Fall to P2 Spring and P1 Spring to P3 Spring were significant with small and medium effect-sizes (both P < 0.01; 0.4 and 0.5 Cohen's d, respectively). Importantly, multiple HSRT and DIT2 assessments correlated with undergraduate and graduate GPAs. CONCLUSIONS During a Pharm.D. program of study, students developed substantially in moral reasoning though minimally in foundational critical thinking. Both foundational and moral reasoning correlated with academic success. Showing responsiveness to change, the DIT2 appears helpful as a measure of cognitive development for pharmacy education.
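The effect sizes in this abstract are Cohen's d values. A minimal sketch with hypothetical scores; the pooled-SD form is shown, though the abstract does not state which variant of d the authors computed:

```python
from statistics import mean, stdev

def cohens_d(before, after):
    """Cohen's d between two score lists, using the pooled sample SD.

    Positive values indicate higher scores at the second time-point.
    """
    n1, n2 = len(before), len(after)
    s1, s2 = stdev(before), stdev(after)
    pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(after) - mean(before)) / pooled_sd
```

By common convention, d around 0.2 is a small effect, 0.5 medium, and 0.8 large, which is how the abstract's 0.1 to 0.5 values are characterized.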
Affiliation(s)
- Michael J Peeters
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH
- Sai H S Boddu
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH

35
Peeters MJ, Zitko KL, Vaidya VA. Critical Thinking Development in Pharmacy Education: A Meta-Analysis. Innov Pharm 2016. [DOI: 10.24926/iip.v7i1.420]
Abstract
Objective: The investigators aimed to summarize prior studies of critical thinking development among pharmacy students, using the California Critical Thinking Skills Test (CCTST), Health Sciences Reasoning Test (HSRT), and Defining Issues Test (DIT).
Methods: Two investigators (KLZ, MJP) independently and systematically searched the available literature using PubMed, Google Scholar, ERIC, and PsycINFO, as well as pharmacy education conference abstracts in the American Journal of Pharmaceutical Education. Search terms were 'pharmacy' combined with 'critical thinking', 'HSRT', 'CCTST', and 'DIT'. Studies were included if they investigated pharmacy students, used one of the tests (CCTST, HSRT, DIT), and used a longitudinal design with test administration at two or more time-points for the same subjects (i.e., development). On review, the CCTST and HSRT seem more foundational to analytical/critical thinking, while the DIT appears to measure moral/complex thinking. Studies were summarized by meta-analysis using Cohen's d and random-effects modeling.
Results: Five studies of thinking development, comprising 10 separate cohorts, were included for meta-analysis (8 cohorts for the CCTST, 2 for the DIT, and 0 for the HSRT). Across 5 institutions, 407 and 1148 students were included (CCTST and DIT, respectively). For the CCTST, the overall effect was 0.33 (95% CI 0.19 to 0.47) with some heterogeneity among study cohorts (I2=52%). For the DIT, the overall effect was -0.23 (95% CI -0.83 to 0.37) with considerable heterogeneity among study cohorts (I2=95%). For both the CCTST and DIT, some studies showed effect-sizes greater than 0.5. Meta-analysis of the HSRT could not be conducted (i.e., 0 studies found).
Implications: While measuring different aspects of “critical thinking”, the CCTST and DIT showed responsiveness to change and appear to be promising measures of cognitive development. These tests should be used in further well-designed research studies that explore strategies for improving cognitive development in pharmacy education.
Conflict of Interest
We declare no conflicts of interest or financial interests that we or members of our immediate families have in any product or service discussed in the manuscript, including grants (pending or received), employment, gifts, stock holdings or options, honoraria, consultancies, expert testimony, patents, and royalties.
Type: Original Research
36
Abstract
The concept of development is ubiquitous throughout higher education. Development of critical thinking, problem-solving, and clinical reasoning are noted as important outcomes in higher education, including health professions education. In this era of widening scrutiny, demonstration of this outcome within programmatic assessment is becoming increasingly important. Programmatic assessment of critical thinking is complicated because of its multiple definitions, array of theoretical frameworks, and variety of measurement instruments. Additionally, recent guidelines and standards for pharmacy education have affirmed “habits of mind,” which are not new to education and encompass analytical critical thinking. In this paper, we sought to provide: 1) an overview of various critical thinking measurement instruments with their different associated critical thinking definitions, 2) a background and framework for thinking using the Dimensions of Learning model, 3) implications and applications for assessing cognitive development (critical and complex thinking) within the context of pharmacy education, and 4) specific suggestions for assessment in pharmacy education.
Type: Idea Paper
37
Peeters MJ, Kelly CP, Cor MK. Summative Evaluations When Using an Objective Structured Teaching Exercise. Am J Pharm Educ 2015; 79:60. [PMID: 26089569 PMCID: PMC4469026 DOI: 10.5688/ajpe79460]
Affiliation(s)
- M Kenneth Cor
- University of Alberta Faculty of Pharmacy and Pharmaceutical Sciences, Edmonton, Alberta, Canada
38
Hoover MJ, Peeters MJ. Discussion of learning assessments in postgraduate teaching and learning curricula. Am J Health Syst Pharm 2015; 72:687-8. [DOI: 10.2146/ajhp140631]
Affiliation(s)
- Matthew J. Hoover
- Northeast Ohio Medical University College of Pharmacy, Rootstown, OH
- Clinical Specialist, Cleveland Clinic Marymount Hospital, Garfield Heights, OH
- Michael J. Peeters
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH
39
Type: Letter
40
Peeters MJ, Gundrum TE, Murphy JA. Comparing Vertical and Horizontal Screening Methods for Pharmacy Resident Candidates Before Interviews. Innov Pharm 2015. [DOI: 10.24926/iip.v6i3.388]
Abstract
Purpose: Prior studies have examined autobiographical screening methods among medical student applicants and demonstrated halo bias with single-rater scoring, though others have questioned its practical significance. Compared with a traditional vertical screening method, we evaluated a horizontal method for the initial screening of Post-Graduate Year-1 (PGY-1) pharmacy practice resident candidate applications prior to interviews.
Methods: Our screening rubric for PGY-1 pharmacy residency candidates consisted of eight criteria, each scored using a 5-point Likert scale. During the 2014 residency recruitment season, two single-evaluators (A&B) scored all eight criteria and their scores were summed into total application scores (vertical method). Meanwhile two other evaluators (C&D) each evaluated only two criteria for all applications. The four combined-evaluators (A-D) scores, on two criteria each, were summed together into total application scores (horizontal method). For statistical comparison of single-evaluator and combined-evaluators, inter-component reliabilities were analyzed for each evaluator, while inter-rater consistency was also examined. For practical significance, actual selection differences were reviewed.
Results: Forty-six applications were evaluated to determine 24 invitations for on-site interviews. Inter-component reliability differed among evaluator A, evaluator B, and the combined evaluators A-D (Cronbach's alpha of 0.74, 0.73, and 0.58, respectively; lower is better). Among raters, inter-rater consistency was excellent (0.86 by intraclass correlation, p
Conclusion: Halo bias was seen with the single-evaluators (vertical method); two interview invitations were negatively impacted. For pharmacy resident screening, a horizontal screening method appears to be rigorous in promoting fairness for applicants. As pharmacy residency applications continue to grow, a fair and time-efficient method of screening seems imperative.
Type: Original Research
41
Hoover MJ, Jung R, Jacobs DM, Peeters MJ. Educational testing validity and reliability in pharmacy and medical education literature. Am J Pharm Educ 2013; 77:213. [PMID: 24371337 PMCID: PMC3872932 DOI: 10.5688/ajpe7710213]
Abstract
OBJECTIVES To evaluate and compare the reliability and validity of educational testing reported in pharmacy education journals with that in the medical education literature. METHODS Descriptions of validity evidence sources (content, construct, criterion, and reliability) were extracted from articles that reported educational testing of learners' knowledge, skills, and/or abilities. The findings of 108 pharmacy education articles were compared with those of 198 medical education articles. RESULTS For pharmacy educational testing, 14 articles (13%) reported more than 1 validity evidence source, while 83 articles (77%) reported 1 validity evidence source and 11 articles (10%) reported none. Among validity evidence sources, content validity was reported most frequently. Compared with the pharmacy education literature, more medical education articles reported both validity and reliability (59%; p<0.001). CONCLUSION While there were more scholarship of teaching and learning (SoTL) articles in pharmacy education than in medical education, validity and reliability reporting were limited in the pharmacy education literature.
Affiliation(s)
- Matthew J. Hoover
- College of Pharmacy, Northeast Ohio Medical University, Rootstown, Ohio
- Cleveland Clinic Marymount Hospital, Garfield Heights, Ohio
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
- Rose Jung
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
- David M. Jacobs
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
- University of Houston, Houston, Texas
- Michael J. Peeters
- University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, Ohio
42
Peeters MJ, Beltyukova SA, Martin BA. Educational testing and validity of conclusions in the scholarship of teaching and learning. Am J Pharm Educ 2013; 77:186. [PMID: 24249848 PMCID: PMC3831397 DOI: 10.5688/ajpe779186]
Abstract
Validity, and its integral evidence of reliability, are fundamental to educational and psychological measurement and to the standards of educational testing. Herein, we describe these standards of educational testing, along with their subtypes, including internal consistency, inter-rater reliability, and inter-rater agreement. Next, the related issues of measurement error and effect size are discussed. The article concludes with a call for future authors to improve the reporting of psychometrics and practical significance with educational testing in the pharmacy education literature. By increasing the scientific rigor of educational research and reporting, the overall quality and meaningfulness of the scholarship of teaching and learning (SoTL) will be improved.
Affiliation(s)
- Michael J. Peeters
- College of Pharmacy and Pharmaceutical Sciences, University of Toledo, Toledo, Ohio
- Beth A. Martin
- School of Pharmacy, University of Wisconsin-Madison, Madison, Wisconsin
43
Peeters MJ, Serres ML, Gundrum TE. Improving reliability of a residency interview process. Am J Pharm Educ 2013; 77:168. [PMID: 24159209 PMCID: PMC3806952 DOI: 10.5688/ajpe778168]
Abstract
OBJECTIVE To improve the reliability and discrimination of a pharmacy resident interview evaluation form, and thereby improve the reliability of the interview process. METHODS In phase 1 of the study, the authors used a Many-Facet Rasch Measurement model to optimize an existing evaluation form for reliability and discrimination. In phase 2, interviewer pairs used the modified evaluation form within 4 separate interview stations. In phase 3, 8 interviewers individually evaluated each candidate in one-on-one interviews. RESULTS In phase 1, the evaluation form had a reliability of 0.98 with a person separation of 6.56; reproducibly, the form separated applicants into 6 distinct groups. Using that form in phases 2 and 3, the largest variation source was candidates, while content specificity was the next largest variation source. The phase 2 g-coefficient was 0.787, while the confirmatory phase 3 g-coefficient was 0.922. Process reliability improved with more stations despite fewer interviewers per station; the impact of content specificity was greatly reduced with more interview stations. CONCLUSION A more reliable, discriminating evaluation form was developed to evaluate candidates during resident interviews, and a process was designed that reduced the impact of content specificity.
44
45
Vaidya V, Peeters MJ, Partha G, Potnis P. Evaluating the association between type of prescription drug plan and asthma patients' use of controller medications. Journal of Pharmaceutical Health Services Research 2012. [DOI: 10.1111/j.1759-8893.2012.00082.x]
Abstract
Objectives
To explore the association between the type of prescription drug plan (PDP) and taking preventive daily asthma medication (controller medication) in patients with poor control of their asthma (defined as taking more than three canisters of short-acting β-agonists each month).
Methods
A retrospective, cross-sectional study using the 2008 Medical Expenditure Panel Survey, a nationally representative sample of the non-institutionalized, civilian US population. Asthma patients were identified by International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis code 493. Only those patients who reported use of more than three canisters of rescue inhalers in a 3-month period were included. Based on patients' self-reported use of preventive medications, they were classified as controller drug users and non-users. Descriptive statistics were used to describe the population. A multiple logistic regression model was used to determine the odds of controller usage based on type of PDP, using demographic characteristics (age, gender, race, ethnicity, income, perceived health status) as confounders. All analyses were performed using SAS version 9.1.
Key findings
Asthma controller drug use was 67% among the study population. The logistic regression analysis showed that patients having Medicare as their PDP were more likely to use controller medications compared with patients with no PDP (odds ratio (OR) 4.58, 95% confidence interval (CI) 1.33–15.77). Higher odds were also seen for Medicaid (OR 2.09, CI 0.96–4.54) and Veterans Affairs (OR 1.66, CI 0.16–17.05) prescription beneficiaries, but these effects were not significant.
Conclusions
Type of PDP was found to have an effect on the utilization of controller drugs among asthma patients. Future research should explore viable plan options that encourage guideline-recommended medication use in asthma patients.
Affiliation(s)
- Varun Vaidya
- Pharmacy Health Care Administration, Department of Pharmacy Practice, University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH, USA
- Michael J Peeters
- Pharmacy Health Care Administration, Department of Pharmacy Practice, University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH, USA
- Gautam Partha
- Pharmacy Health Care Administration, Department of Pharmacy Practice, University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH, USA
- Priyanka Potnis
- Pharmacy Health Care Administration, Department of Pharmacy Practice, University of Toledo College of Pharmacy and Pharmaceutical Sciences, Toledo, OH, USA
46
Gleason BL, Peeters MJ, Resman-Targoff BH, Karr S, McBane S, Kelley K, Thomas T, Denetclaw TH. An active-learning strategies primer for achieving ability-based educational outcomes. Am J Pharm Educ 2011; 75:186. [PMID: 22171114 PMCID: PMC3230347 DOI: 10.5688/ajpe759186]
Abstract
Active learning is an important component of pharmacy education. Engaging students in the learning process helps them better apply the knowledge they gain. This paper describes evidence supporting the use of active-learning strategies in pharmacy education and offers strategies for implementing active learning in pharmacy curricula, both in the classroom and during pharmacy practice experiences.
47
48
Peeters MJ, Sahloff EG, Stone GE. A standardized rubric to evaluate student presentations. Am J Pharm Educ 2010; 74:171. [PMID: 21301605 PMCID: PMC2996761 DOI: 10.5688/aj7409171]
Abstract
OBJECTIVE To design, implement, and assess a rubric to evaluate student presentations in a capstone doctor of pharmacy (PharmD) course. DESIGN A 20-item rubric was designed and used to evaluate student presentations in a capstone fourth-year course in 2007-2008, and then revised and expanded to 25 items and used to evaluate student presentations for the same course in 2008-2009. Two faculty members evaluated each presentation. ASSESSMENT The Many-Facets Rasch Model (MFRM) was used to determine the rubric's reliability, quantify the contribution of evaluator harshness/leniency in scoring, and assess grading validity by comparing the current grading method with a criterion-referenced grading scheme. In 2007-2008, rubric reliability was 0.98, with a separation of 7.1 and 4 rating scale categories. In 2008-2009, MFRM analysis suggested 2 of 98 grades be adjusted to eliminate evaluator leniency, while a further criterion-referenced MFRM analysis suggested 10 of 98 grades should be adjusted. CONCLUSION The evaluation rubric was reliable and evaluator leniency appeared minimal. However, a criterion-referenced re-analysis suggested a need for further revisions to the rubric and evaluation process.
49
Peeters MJ, Gallegos PJ. Wikis and open-access education? Am J Pharm Educ 2010; 74:16d. [PMID: 20221370 PMCID: PMC2829147]
50
Peeters MJ. Book Review: Preceptor's Handbook for Pharmacists, 2nd Edition. Ann Pharmacother 2009. [DOI: 10.1345/aph.1m438]