1
Boyanovsky BB, Belghasem M, White BA, Kadavakollu S. Incorporating Augmented Reality Into Anatomy Education in a Contemporary Medical School Curriculum. Cureus 2024; 16:e57443. [PMID: 38699098] [PMCID: PMC11064471] [DOI: 10.7759/cureus.57443]
Abstract
Anatomy education in the medical school curriculum has encountered considerable challenges during the last decade. The exponential growth of medical science has necessitated a review of the classical ways to teach anatomy to shorten the time students spend dissecting, allowing them to acquire critical, new knowledge in other disciplines. Augmented and mixed reality technologies have developed tremendously during the last few years, offering a wide variety of possibilities to deliver anatomy education to medical students. Here, we provide a methodology to develop, deliver, and assess an anatomy laboratory course using augmented reality applications. We suggest a novel approach, based on Microsoft® HoloLens II, to develop systematic sequences of holograms to reproduce human dissection. The laboratory sessions are prepared before classes and include a series of holograms revealing sequential layers of the human body, isolated structures, or a combination of structures forming a system or a functional unit. The in-class activities are conducted either as one group of students (n = 8-9) with a leading facilitator or small groups of students (n = 4) with facilitators (n = 4) joining the groups for discussion. The same or different sessions may be used for the assessment of students' knowledge. Although currently in its infancy, the use of holograms will soon become a substantial part of medical education. Currently, several companies are offering a range of useful learning platforms, from anatomy education to patient encounters. By describing the holographic program at our institution, we hope to provide a roadmap for other institutions looking to implement a systematic approach to teaching anatomy through holographic dissection. 
This approach has several benefits, including a sequential three-dimensional (3D) presentation of the human body with varying layers of dissection, demonstrations of facilitator-selected 3D anatomical regions or specific body units, and the option for classroom or remote facilitation, with the ability for students to review each session individually.
Affiliation(s)
- Mostafa Belghasem
- Department of Biomedical Sciences, Kaiser Permanente Bernard J. Tyson School of Medicine, Pasadena, USA
- Brett A White
- Department of Clinical Science, Kaiser Permanente Bernard J. Tyson School of Medicine, Pasadena, USA
- Samuel Kadavakollu
- Department of Biomedical Education, College of Osteopathic Medicine, California Health Sciences University, Clovis, USA
2
Bond AP, Butts T, Tierney CM. Spot(ters) the difference: Bringing traditional anatomical examinations online. Clin Anat 2024; 37:284-293. [PMID: 37409502] [DOI: 10.1002/ca.24092]
Abstract
The COVID-19 pandemic caused a shift in anatomy education, forcing institutions to find innovative ways to teach and assess online. This study details the development of an online spotter examination, delivered across multiple modules, that allowed students to sit the assessment at home while still maintaining its integrity. The online spotter consisted of individual Zoom calls between students and examiners, during which slides with images and questions were screen-shared. To examine the viability of this spotter in non-lockdown scenarios, several parameters were considered. Mean marks were compared with those of traditional versions, and Pearson's r correlation coefficients were calculated between online and traditional spotters and between online spotters and overall performance in anatomy modules. A survey was carried out to determine the students' view of the assessment. Pearson's r was between 0.33 and 0.49 when comparing online spotters to the traditional format, and between 0.65 and 0.75 (p < 0.01) when compared to a calculated anatomy score. The survey indicated overall student satisfaction: 82.5% reported that it was a fair way to test their knowledge, and 55% reported the same or lower levels of anxiety compared with traditional spotters. However, there was nothing to indicate that the students preferred this format over laboratory-based spotters. These results indicate that this new exam format would be useful for small cohorts undertaking online or hybrid courses, or in circumstances when running a full spotter is too costly, and represents a fair and robust way to assess practical anatomical knowledge online.
Affiliation(s)
- Alistair P Bond
- Human Anatomy Resource Centre, Education Directorate, Faculty of Health and Life Science, University of Liverpool, Liverpool, UK
- Thomas Butts
- School of Medicine, University of Sunderland, Sunderland, UK
- Claire M Tierney
- Human Anatomy Resource Centre, Education Directorate, Faculty of Health and Life Science, University of Liverpool, Liverpool, UK
3
Merzougui WH, Myers MA, Hall S, Elmansouri A, Parker R, Robson AD, Kurn O, Parrott R, Geoghegan K, Harrison CH, Anbu D, Dean O, Border S. Multiple-Choice versus Open-Ended Questions in Advanced Clinical Neuroanatomy: Using a National Neuroanatomy Assessment to Investigate Variability in Performance Using Different Question Types. Anat Sci Educ 2021; 14:296-305. [PMID: 33420758] [DOI: 10.1002/ase.2053]
Abstract
Methods of assessment in anatomy vary across medical schools in the United Kingdom (UK) and beyond; common methods include written, spotter, and oral assessment. However, there is limited research evaluating these methods with regard to student performance and perception. The National Undergraduate Neuroanatomy Competition (NUNC) is held annually for medical students throughout the UK. Prior to 2017, the competition asked open-ended questions (OEQs) in the anatomy spotter examination, and in subsequent years also asked single best answer (SBA) questions. The aim of this study was to assess medical students' performance on, and perception of, SBA and OEQ methods of assessment in a spotter-style anatomy examination. Student examination performance was compared between OEQs (2013-2016) and SBAs (2017-2020) for overall score and for each neuroanatomical subtopic. Additionally, a questionnaire explored students' perceptions of SBAs. A total of 631 students attended the NUNC in the studied period. The average mark was significantly higher for SBAs than for OEQs (60.6% vs. 43.1%, P < 0.0001); this was true for all neuroanatomical subtopics except the cerebellum. Students felt that they performed better on SBAs than OEQs, and the diencephalon was felt to be the most difficult neuroanatomical subtopic (n = 38, 34.8%). Students perceived SBA questions to be easier than OEQs and performed significantly better on them in a neuroanatomical spotter examination. Further work is needed to ascertain whether this result is replicable throughout anatomy education.
Affiliation(s)
- Wassim H Merzougui
- Center for Learning Anatomical Sciences, Faculty of Medicine, University of Southampton, Southampton, United Kingdom
- Department of Trauma and Orthopedics, Pilgrim Hospital, Boston, United Kingdom
- Matthew A Myers
- Center for Learning Anatomical Sciences, Faculty of Medicine, University of Southampton, Southampton, United Kingdom
- Department of Neurosurgery, Salford Royal NHS Foundation Trust, Salford, United Kingdom
- Samuel Hall
- Center for Learning Anatomical Sciences, Faculty of Medicine, University of Southampton, Southampton, United Kingdom
- Department of Neurosurgery, Wessex Neurological Centre, Southampton, United Kingdom
- Ahmad Elmansouri
- Department of Medical Education, Brighton and Sussex Medical School, University of Sussex, Brighton, United Kingdom
- Rob Parker
- University of Southampton, School of Medicine, Southampton, United Kingdom
- Alistair D Robson
- University of Southampton, School of Medicine, Southampton, United Kingdom
- Octavia Kurn
- University of Southampton, School of Medicine, Southampton, United Kingdom
- Rachel Parrott
- Department of Anatomy, St Andrews University, St Andrews, Scotland
- Kate Geoghegan
- Department of Cardiology, Royal United Hospital, Bath, United Kingdom
- Charlotte H Harrison
- Department of Emergency Medicine, Oxford University Hospitals NHS Foundation Trust, Oxford, United Kingdom
- Deepika Anbu
- University of Southampton, School of Medicine, Southampton, United Kingdom
- Oliver Dean
- University of Southampton, School of Medicine, Southampton, United Kingdom
- Scott Border
- Center for Learning Anatomical Sciences, Faculty of Medicine, University of Southampton, Southampton, United Kingdom
4
Guimarães B, Ribeiro J, Cruz B, Ferreira A, Alves H, Cruz-Correia R, Madeira MD, Ferreira MA. Performance equivalency between computer-based and traditional pen-and-paper assessment: A case study in clinical anatomy. Anat Sci Educ 2018; 11:124-136. [PMID: 28817229] [DOI: 10.1002/ase.1720]
Abstract
The time-, material-, and staff-consuming nature of anatomy's traditional pen-and-paper assessment system, the increase in the number of students enrolling in medical schools, and the ever-escalating workload of academic staff have made the use of computer-based assessment (CBA) an attractive proposition. To understand the impact of such a shift in the assessment method, an experimental study evaluating its effect on students' performance was designed. Additionally, students' opinions toward CBA were gathered. Second-year medical students attending a Clinical Anatomy course were randomized by clusters into two groups. The pen-and-paper group attended two sessions, each consisting of a traditional sectional anatomy steeplechase followed by a theoretical examination, while the computer group was involved in two similar sessions conducted in a computerized environment. At the end of each computer session, students in this group completed an anonymous questionnaire. In the first session, pen-and-paper group students scored significantly better than computer-group students in both the steeplechase (mean ± standard deviation: 66.00 ± 14.15% vs. 43.50 ± 19.10%; P < 0.001) and the theoretical examination (52.50 ± 12.70% vs. 39.00 ± 21.10%; P < 0.001). In the second session, no statistically significant differences were found for either the steeplechase (59.50 ± 17.30% vs. 54.50 ± 17.00%; P = 0.085) or the theoretical examination (57.50 ± 13.70% vs. 54.00 ± 14.30%; P = 0.161). In addition, an intersession improvement in students' perceptions toward CBA was registered. These results suggest that, after a familiarization period, CBA might be a performance-equivalent and student-accepted alternative to clinical anatomy pen-and-paper theoretical and practical examinations.
Affiliation(s)
- Bruno Guimarães
- Department of Biomedicine, Faculty of Medicine, University of Porto, Porto, Portugal
- Department of Public Health, Forensic Sciences and Medical Education, Faculty of Medicine, University of Porto, Porto, Portugal
- Center for Research in Health Technologies and Information Systems (CINTESIS), Faculty of Medicine, University of Porto, Porto, Portugal
- Department of Physical and Rehabilitation Medicine, Centro Hospitalar de Entre o Douro e Vouga, Santa Maria da Feira, Portugal
- José Ribeiro
- Department of Biomedicine, Faculty of Medicine, University of Porto, Porto, Portugal
- Department of Public Health, Forensic Sciences and Medical Education, Faculty of Medicine, University of Porto, Porto, Portugal
- Bernardo Cruz
- Department of Biomedicine, Faculty of Medicine, University of Porto, Porto, Portugal
- Department of Public Health, Forensic Sciences and Medical Education, Faculty of Medicine, University of Porto, Porto, Portugal
- André Ferreira
- Department of Biomedicine, Faculty of Medicine, University of Porto, Porto, Portugal
- Center for Research in Health Technologies and Information Systems (CINTESIS), Faculty of Medicine, University of Porto, Porto, Portugal
- Hélio Alves
- Department of Biomedicine, Faculty of Medicine, University of Porto, Porto, Portugal
- Department of Public Health, Forensic Sciences and Medical Education, Faculty of Medicine, University of Porto, Porto, Portugal
- Ricardo Cruz-Correia
- Center for Research in Health Technologies and Information Systems (CINTESIS), Faculty of Medicine, University of Porto, Porto, Portugal
- Maria Dulce Madeira
- Department of Biomedicine, Faculty of Medicine, University of Porto, Porto, Portugal
- Center for Research in Health Technologies and Information Systems (CINTESIS), Faculty of Medicine, University of Porto, Porto, Portugal
- Maria Amélia Ferreira
- Department of Public Health, Forensic Sciences and Medical Education, Faculty of Medicine, University of Porto, Porto, Portugal
5
Pickering JD, Bickerdike SR. Medical student use of Facebook to support preparation for anatomy assessments. Anat Sci Educ 2017; 10:205-214. [PMID: 27806192] [DOI: 10.1002/ase.1663]
Abstract
The use of Facebook to support students is an emerging area of educational research. This study explored how a Facebook Page could support Year 2 medical (MBChB) students in preparation for summative anatomy assessments and alleviate test anxiety. Overall, Facebook analytics revealed that 49 students (19.8% of the entire cohort) posted a comment in preparation for either the first (33 students) or second (34 students) summative anatomy assessment; 18 students commented in preparation for both. In total, 155 comments were posted: 83 for the first assessment and 72 for the second. Of the 83 comments, 45 related to checking anatomical information, 30 requested assessment information, and 8 sought general course information; for the second assessment these numbers were 52, 14, and 6, respectively. Student perceptions of usage, and of the impact on learning and assessment preparation, were obtained via a five-point Likert-style questionnaire, with 119 students confirming they accessed the Page. Generally, students believed the Page was an effective way to support their learning and that it provided information which supported their preparation, with increases in perceived confidence and reductions in anxiety. There was no difference between genders, except that males appeared significantly less likely to ask a question lest they be perceived to lack knowledge (P < 0.05). This study suggests that Facebook can play an important role in supporting students in preparation for anatomy assessments.
Affiliation(s)
- James D Pickering
- Division of Anatomy, Leeds Institute of Medical Education, School of Medicine, University of Leeds, Leeds, United Kingdom
- Suzanne R Bickerdike
- Leeds Institute of Medical Education, School of Medicine, University of Leeds, Leeds, United Kingdom
6
Byram JN, Seifert MF, Brooks WS, Fraser-Cotlin L, Thorp LE, Williams JM, Wilson AB. Using generalizability analysis to estimate parameters for anatomy assessments: A multi-institutional study. Anat Sci Educ 2017; 10:109-119. [PMID: 27458988] [DOI: 10.1002/ase.1631]
Abstract
With integrated curricula and multidisciplinary assessments becoming more prevalent in medical education, there is a continued need for educational research to explore the advantages, consequences, and challenges of integration practices. This retrospective analysis investigated the number of items needed to reliably assess anatomical knowledge in the context of gross anatomy and histology. A generalizability analysis was conducted on gross anatomy and histology written and practical examination items that were administered in a discipline-based format at Indiana University School of Medicine and in an integrated fashion at the University of Alabama School of Medicine and Rush University Medical College. Examination items were analyzed using a partially nested design s×(i:o) in which items were nested within occasions (i:o) and crossed with students (s). A reliability standard of 0.80 was used to determine the minimum number of items needed across examinations (occasions) to make reliable and informed decisions about students' competence in anatomical knowledge. Decision study plots are presented to demonstrate how the number of items per examination influences the reliability of each administered assessment. Using the example of a curriculum that assesses gross anatomy knowledge over five summative written and practical examinations, the results of the decision study estimated that 30 and 25 items would be needed on each written and practical examination to reach a reliability of 0.80, respectively. This study is particularly relevant to educators who may question whether the amount of anatomy content assessed in multidisciplinary evaluations is sufficient for making judgments about the anatomical aptitude of students.
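The decision-study logic summarized above can be illustrated with a deliberately simplified single-facet (persons × items) design, where the generalizability coefficient is Eρ² = σ²s / (σ²s + σ²res/n). The study itself used a partially nested s×(i:o) design; the variance components below are hypothetical, chosen only to show how projected reliability grows with test length.

```python
# Hypothetical variance components for a persons x items (p x i) design;
# not estimates from the study.
var_persons = 0.041   # universe-score variance (between-student)
var_residual = 0.30   # relative error variance per item (interaction + error)

def g_coefficient(n_items: int) -> float:
    """Generalizability coefficient for a test of n_items items."""
    return var_persons / (var_persons + var_residual / n_items)

# Decision study: project reliability for candidate test lengths.
for n in (10, 20, 30, 40):
    print(f"{n:>2} items: E-rho^2 = {g_coefficient(n):.2f}")

# Smallest test length reaching the 0.80 standard used in the study.
n_needed = next(n for n in range(1, 500) if g_coefficient(n) >= 0.80)
print("items needed for 0.80:", n_needed)
```

In a real analysis the variance components would first be estimated from examination data (e.g., by ANOVA), and the nesting of items within occasions would add terms to the error variance.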
Affiliation(s)
- Jessica N Byram
- Department of Anatomy and Cell Biology, Indiana University School of Medicine, Indianapolis, Indiana
- Mark F Seifert
- Department of Anatomy and Cell Biology, Indiana University School of Medicine, Indianapolis, Indiana
- William S Brooks
- Department of Cell, Developmental, and Integrative Biology, University of Alabama at Birmingham School of Medicine, Birmingham, Alabama
- Laura Fraser-Cotlin
- Department of Cell, Developmental, and Integrative Biology, University of Alabama at Birmingham School of Medicine, Birmingham, Alabama
- Laura E Thorp
- Department of Physical Therapy, University of Illinois at Chicago, Chicago, Illinois
- James M Williams
- Department of Anatomy and Cell Biology, Rush University, Chicago, Illinois
- Adam B Wilson
- Department of Anatomy and Cell Biology, Rush University, Chicago, Illinois
7
Cui D, Wilson TD, Rockhold RW, Lehman MN, Lynch JC. Evaluation of the effectiveness of 3D vascular stereoscopic models in anatomy instruction for first year medical students. Anat Sci Educ 2017; 10:34-45. [PMID: 27273896] [DOI: 10.1002/ase.1626]
Abstract
The head and neck region is one of the most complex areas featured in the medical gross anatomy curriculum. The effectiveness of using three-dimensional (3D) models to teach anatomy is a topic of much discussion in medical education research. However, the use of 3D stereoscopic models of the head and neck circulation in anatomy education has not been previously studied in detail. This study investigated whether 3D stereoscopic models created from computed tomographic angiography (CTA) data were efficacious teaching tools for head and neck vascular anatomy. The test subjects were first year medical students at the University of Mississippi Medical Center. The assessment tools included anatomy knowledge tests (prelearning and postlearning session knowledge tests), mental rotation tests (spatial ability; presession and postsession MRT), and a satisfaction survey. Results were analyzed using a Wilcoxon rank-sum test and linear regression analysis. A total of 39 first year medical students participated in the study. The results indicated that all students who were exposed to the stereoscopic 3D vascular models in 3D learning sessions increased their ability to correctly identify the head and neck vascular anatomy. Most importantly, for students with low spatial ability, 3D learning sessions improved postsession knowledge scores to a level comparable to that demonstrated by students with high spatial ability, indicating that the use of 3D stereoscopic models may be particularly valuable to students with low spatial ability.
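The Wilcoxon rank-sum comparison mentioned in this abstract can be sketched via the equivalent Mann-Whitney U statistic, which counts pairwise "wins" between two samples. The scores below are hypothetical, not the study's data.

```python
def rank_sum_u(sample_a, sample_b):
    """Mann-Whitney U for sample_a over sample_b (equivalent to the
    Wilcoxon rank-sum statistic up to a constant shift): count the
    pairs where a beats b, with ties counting one half."""
    u = 0.0
    for a in sample_a:
        for b in sample_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

# Hypothetical pre/post knowledge-test scores; not data from the study.
pre_scores = [40, 35, 55, 30, 45]
post_scores = [60, 58, 70, 50, 65]

u = rank_sum_u(post_scores, pre_scores)
print(f"U = {u} of {len(pre_scores) * len(post_scores)} pairwise comparisons")
```

In practice one would obtain a p-value from U (for example via `scipy.stats.mannwhitneyu`); the loop above only makes explicit what the statistic measures.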
Affiliation(s)
- Dongmei Cui
- Department of Neurobiology and Anatomical Sciences, University of Mississippi Medical Center, Jackson, Mississippi
- Timothy D Wilson
- Schulich School of Medicine and Dentistry, Department of Anatomy and Cell Biology, Western University, London, Ontario, Canada
- Corps for Research of Instructional and Perceptual Technologies (CRIPT), Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Robin W Rockhold
- Department of Pharmacology and Toxicology, and Health Sciences, University of Mississippi Medical Center, Jackson, Mississippi
- Michael N Lehman
- Department of Neurobiology and Anatomical Sciences, University of Mississippi Medical Center, Jackson, Mississippi
- James C Lynch
- Department of Neurobiology and Anatomical Sciences, University of Mississippi Medical Center, Jackson, Mississippi
8
Notebaert AJ. The effect of images on item statistics in multiple choice anatomy examinations. Anat Sci Educ 2017; 10:68-78. [PMID: 27472765] [DOI: 10.1002/ase.1637]
Abstract
Although multiple choice examinations are often used to test anatomical knowledge, they often forgo the use of images in favor of text-based questions and answers. Because anatomy relies on visual resources, examinations using images should be used when appropriate. This study was a retrospective analysis of examination items that were text based compared with the same questions when a reference image was included with the question stem. Item difficulty and discrimination were analyzed for 15 multiple choice items given across two different examinations in two sections of an undergraduate anatomy course. Results showed some differences in item difficulty, but these were not consistently associated with either the text items or the items with reference images. Differences in difficulty were mainly attributable to one group of students performing better overall on the examinations. There were no significant differences in item discrimination for any of the analyzed items. This implies that reference images do not significantly alter the item statistics; however, it does not indicate whether these images were helpful to the students when answering the questions. Care should be taken by question writers to analyze item statistics when making changes to multiple choice questions, including changes made for the perceived benefit of the students.
Affiliation(s)
- Andrew J Notebaert
- Department of Neurobiology and Anatomical Sciences, Clinical Anatomy Division, University of Mississippi Medical Center, Jackson, Mississippi
9
Raubenheimer D, Raubenheimer JE, van Zyl S. A scoring framework for assessing anatomy competence of undergraduate preclinical students. Anat Sci Educ 2016; 9:319-329. [PMID: 26588194] [DOI: 10.1002/ase.1585]
Abstract
Recent higher education changes toward outcomes-based education emphasize competent learners, but a widely accepted definition of competence is still lacking. Although the importance of anatomy in health professions education is recognized, there is still uncertainty about what anatomical competence entails and how to assess it. This study aimed to provide a framework for assessing anatomical competence, using an anatomy competence score, for the anatomy course in the undergraduate medical learning program at the University of the Free State in South Africa. All assessments within the dissection program of two student groups (July 2012 to June 2014) were explored to determine the representation of the three competence domains: knowledge, skill and application in context. Student performance in the final objective structured practical examination (OSPE) was investigated for the three domains and the different body regions. Knowledge had ±50% representation in assessments and the different body regions (in final OSPE) for both groups, and skill and application represented ±25% each in both groups. The best average student performance was in the skill domain (64% and 67% for the respective groups). All domains showed good reliabilities (> 0.75) and student performance correlated well between the domains (P < 0.001). This study suggests a representation ratio of 2:1:1 between knowledge, skill and application (i.e., 50% knowledge and 25% skill and application respectively), for anatomical competence assessment. However, this ratio depends on the assessment type, the stage of the anatomy course and the institutional context. Nonetheless, it provides a guideline for ensuring that assessments address all competence domains.
Affiliation(s)
- Daleen Raubenheimer
- Department of Basic Medical Sciences, Faculty of Health Sciences, University of the Free State, Bloemfontein, South Africa
- Jacques Eugene Raubenheimer
- Department of Biostatistics, Faculty of Health Sciences, University of the Free State, Bloemfontein, South Africa
- Sanet van Zyl
- Department of Basic Medical Sciences, Faculty of Health Sciences, University of the Free State, Bloemfontein, South Africa
10
O'Mahony SM, Sbayeh A, Horgan M, O'Flynn S, O'Tuathaigh CMP. Association between learning style preferences and anatomy assessment outcomes in graduate-entry and undergraduate medical students. Anat Sci Educ 2016; 9:391-399. [PMID: 26845590] [DOI: 10.1002/ase.1600]
Abstract
An improved understanding of the relationship between anatomy learning performance and approaches to learning can lead to the development of a more tailored approach to delivering anatomy teaching to medical students. This study investigated the relationship between learning style preferences, as measured by the Visual, Aural, Read/write, and Kinesthetic (VARK) inventory-style questionnaire and Honey and Mumford's learning style questionnaire (LSQ), and anatomy and clinical skills assessment performance at an Irish medical school. Additionally, mode of entry to medical school [undergraduate/direct-entry (DEM) vs. graduate-entry (GEM)] was examined in relation to individual learning style and assessment results. The VARK and LSQ were distributed to first and second year DEM, and first year GEM students. DEM students achieved higher clinical skills marks than GEM students, but anatomy marks did not differ between the groups. Several LSQ style preferences were shown to be weakly correlated with anatomy assessment performance in a program- and year-specific manner. Specifically, the "Activist" style was negatively correlated with anatomy scores in DEM Year 2 students (rs = -0.45, P = 0.002). The "Theorist" style demonstrated a weak correlation with anatomy performance in DEM Year 2 (rs = 0.18, P = 0.003). Regression analysis revealed that, among the LSQ styles, the "Activist" was associated with poorer anatomy assessment performance (P < 0.05), while improved scores were associated with students who scored highly on the VARK "Aural" modality (P < 0.05). These data support the contention that individual student learning styles contribute little to variation in academic performance in medical students.
Affiliation(s)
- Siobhain M O'Mahony
- Department of Anatomy and Neuroscience, School of Medicine, University College Cork, Ireland
- Amgad Sbayeh
- Department of Anatomy and Neuroscience, School of Medicine, University College Cork, Ireland
- Medical Education Unit, School of Medicine, University College Cork, Ireland
- Graduate Entry Medical School, University of Limerick, Ireland
- Mary Horgan
- Medical Education Unit, School of Medicine, University College Cork, Ireland
- Siun O'Flynn
- Medical Education Unit, School of Medicine, University College Cork, Ireland
11
Green RA, Cates T, White L, Farchione D. Do collaborative practical tests encourage student-centered active learning of gross anatomy? Anat Sci Educ 2016; 9:231-237. [PMID: 26415089] [DOI: 10.1002/ase.1564]
Abstract
Benefits of collaborative testing have been identified in many disciplines. This study sought to determine whether collaborative practical tests encouraged active learning of anatomy. A gross anatomy course included a collaborative component in four practical tests. Two hundred and seven students initially completed the test as individuals and then worked as a team to complete the same test again immediately afterwards. The relationship between mean individual, team, and difference (between team and individual) test scores to overall performance on the final examination (representing overall learning in the course) was examined using regression analysis. The overall mark in the course increased by 9% with a decreased failure rate. There was a strong relationship between individual score and final examination mark (P < 0.001) but no relationship for team score (P = 0.095). A longitudinal analysis showed that the test difference scores increased after Test 1 which may be indicative of social loafing and this was confirmed by a significant negative relationship between difference score on Test 4 (indicating a weaker student) and final examination mark (P < 0.001). It appeared that for this cohort, there was little peer-to-peer learning occurring during the collaborative testing and that weaker students gained the benefit from team marks without significant active learning taking place. This negative outcome may be due to insufficient encouragement of the active learning strategies that were expected to occur during the collaborative testing process. An improved understanding of the efficacy of collaborative assessment could be achieved through the inclusion of questionnaire based data to allow a better interpretation of learning outcomes.
Affiliation(s)
- Rodney A Green
- Department of Pharmacy and Applied Sciences, College of Science, Health and Engineering, La Trobe University, Bendigo, Victoria, Australia
- Tanya Cates
- Department of Physiology, Anatomy and Microbiology, College of Science, Health and Engineering, La Trobe University, Bundoora, Victoria, Australia
- Lloyd White
- Department of Physiology, Anatomy and Microbiology, College of Science, Health and Engineering, La Trobe University, Bundoora, Victoria, Australia
- Davide Farchione
- Department of Mathematics and Statistics, College of Science, Health and Engineering, La Trobe University, Bundoora, Victoria, Australia
12
Zhang G, Fenderson BA, Schmidt RR, Veloski JJ. Equivalence of students' scores on timed and untimed anatomy practical examinations. Anat Sci Educ 2013; 6:281-285. [PMID: 23463722 DOI: 10.1002/ase.1357] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/01/2012] [Revised: 12/18/2012] [Accepted: 01/15/2013] [Indexed: 06/01/2023]
Abstract
Untimed examinations are popular with students because there is a perception that first impressions may be incorrect and that difficult questions require more time for reflection. In this report, we tested the hypothesis that timed anatomy practical examinations are inherently more difficult than untimed examinations. Students in the Doctor of Physical Therapy program at Thomas Jefferson University were assessed on their understanding of anatomic relationships using multiple-choice questions. For the class of 2012 (n = 46), students were allowed to circulate freely among 40 testing stations during the 40-minute testing session. For the class of 2013 (n = 46), students were required to move sequentially through the 40 testing stations (one minute per item). Students in both years were given three practical examinations covering the back/upper limb, the lower limb, and the trunk. An identical set of questions was used for both groups of students (untimed and timed examinations). Our results indicate that there is no significant difference between student performance on untimed and timed examinations (final percent scores of 87.3 and 88.9, respectively). This result also held true for students in the top and bottom 20th percentiles of the class. Moreover, time limits did not lead to errors even on the most difficult, higher-order questions (i.e., items with P-values < 0.70). Thus, limiting time at testing stations during an anatomy practical examination does not adversely affect student performance.
Affiliation(s)
- Guiyun Zhang
- Department of Pathology, Anatomy and Cell Biology, Jefferson Medical College, Thomas Jefferson University, Philadelphia, Pennsylvania