1. Penner JC, Schuwirth L, Durning SJ. From Noise to Music: Reframing the Role of Context in Clinical Reasoning. J Gen Intern Med 2024; 39:851-857. [PMID: 38243110] [PMCID: PMC11043232] [DOI: 10.1007/s11606-024-08612-1]
Affiliation(s)
- John C Penner
- Department of Medicine, University of California, San Francisco, CA, USA.
- Medical Service, San Francisco Veterans Affairs Medical Center, San Francisco, CA, USA.
- Lambert Schuwirth
- Prideaux Discipline of Clinical Education, Flinders University, Adelaide, SA, Australia
- Steven J Durning
- Center for Health Professions Education, Uniformed Services University of the Health Sciences, Bethesda, MD, USA

2. Schuwirth L, O'Connor N, Lyubomirsky A. Modelling the rate of trainees transitioning to Fellowship… response to Amos et al. Australas Psychiatry 2024; 32:102. [PMID: 38157228] [DOI: 10.1177/10398562231224170]

3. Fawns T, Schuwirth L. Rethinking the value proposition of assessment at a time of rapid development in generative artificial intelligence. Med Educ 2024; 58:14-16. [PMID: 37882469] [DOI: 10.1111/medu.15259]
Abstract
The authors argue for evolving assessment practices by focusing on their value proposition to educators (producing the medical graduates needed for society) rather than protecting existing processes.
Affiliation(s)
- Tim Fawns
- Monash Education Academy, Monash University, Melbourne, Australia
- Lambert Schuwirth
- College of Medicine and Public Health, Flinders University, Adelaide, Australia

4. O'Connor N, Schuwirth L, Bhatt A, Lyubomirsky A. Is workplace-based assessment achievable in practice? Response to Holmes. Australas Psychiatry 2023; 31:856. [PMID: 37904325] [DOI: 10.1177/10398562231211111]

5. Schuwirth L, Heyligers I. [Fair assessment judgements of specialty registrars in clinical practice]. Ned Tijdschr Geneeskd 2023; 167:D7337. [PMID: 38175609]
Abstract
Forming judgements about registrars in a workplace-based assessment context is not easy. The workplace is an important environment for learning, but it is also often busy and sometimes complex, so assessment judgements cannot always be standardised and structured. Nevertheless, judgements have to be defensible and fair. In this paper we discuss the main guidelines for ensuring that assessment judgements are defensible and fair. We approach these from the perspective of assessment for learning - using feedback to guide learning - and from the perspective of assessment of learning - using judgements to determine whether a registrar is ready to progress to the next phase. Core elements of good judgement are transparency, in the form of constructive and actionable feedback in dialogue with the registrar, and procedural fairness, in the form of multiple judgements over time with clear documentation. All of this needs to be supported by a training quality assurance system.

6. Valentine N, Durning SJ, Shanahan EM, Schuwirth L. What Stops Fairness from Emerging in Assessment? The Forces on a Complex Adaptive System. Perspect Med Educ 2023; 12:338-347. [PMID: 37636331] [PMCID: PMC10453954] [DOI: 10.5334/pme.994]
Abstract
Introduction Workplace-based assessment occurs in authentic, dynamic clinical environments where reproducible, measurement-based assessments often cannot be implemented. In these environments, research approaches that respect these multiple dynamic interactions, such as complexity perspectives, are encouraged. Previous research has shown that fairness in assessment is a nonlinear phenomenon that emerges from interactions between its components and behaves like a complex adaptive system. The aim of this study was to understand the external forces on the complex adaptive system which may prevent fairness from emerging. Methods We conducted online focus groups with a purposeful sample of nineteen academic leaders in the Netherlands. We used an iterative approach to collection, analysis and coding of the data and interpreted the results using a lens of complexity, focusing on how individual elements of fairness work in concert to create systems with complex behaviour. Results We identified three themes of forces which can disrupt fairness: forces impairing interactivity, forces impairing adaptation and forces impairing embeddedness. Within each of these themes, we identified subthemes: assessor and student forces, tool forces and system forces. Discussion Consistent with complexity theory, this study suggests there are multiple forces which can hamper the emergence of fairness. Whilst complexity thinking does not reduce the scale of the challenge, viewing forces through this lens provides insight into why and how these forces disrupt fairness. This allows for more purposeful, meaningful changes to support the use of fair judgement in assessment in dynamic, authentic clinical workplaces.
Affiliation(s)
- Nyoli Valentine
- Prideaux Discipline of Clinical Education, Flinders University, Bedford Park, South Australia, Australia
- Steven J. Durning
- Center for Health Professions Education, Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA
- Lambert Schuwirth
- Prideaux Discipline of Clinical Education, Flinders University, Bedford Park, South Australia, Australia

7. Valentine N, Durning SJ, Shanahan EM, Schuwirth L. Fairness in Assessment: Identifying a Complex Adaptive System. Perspect Med Educ 2023; 12:315-326. [PMID: 37520508] [PMCID: PMC10377744] [DOI: 10.5334/pme.993]
Abstract
Introduction Assessment design in health professions education is continuously evolving, and there is an increasing desire to better embrace human judgement in assessment. It is therefore essential to understand what makes this judgement fair. This study builds upon existing literature by studying how assessment leaders conceptualise the characteristics of fair judgement. Methods Sixteen assessment leaders from 15 medical schools in Australia and New Zealand participated in online focus groups. Data collection and analysis occurred concurrently and iteratively. We used the constant comparison method to identify themes and build on an existing conceptual model of fair judgement in assessment. Results Fairness is a multi-dimensional construct with components at environment, system and individual levels. Components influencing fairness include, at the environmental level, articulated and agreed learning outcomes relating to the needs of society and a culture which allows for learner support, stakeholder agency and learning; at the system level, the collection, interpretation and combination of evidence, and procedural strategies; and at the individual level, appropriate individual assessments and assessor expertise and agility. Discussion We observed that within the data a fractal, that is, an infinite pattern repeating at different scales, could be seen, suggesting fair judgement should be considered a complex adaptive system. Within complex adaptive systems, it is primarily the interaction between the entities, not simply the components themselves, which influences the outcome produced. Viewing fairness in assessment through a lens of complexity rather than as a linear, causal model has significant implications for how we design assessment programs and seek to utilise human judgement in assessment.
Affiliation(s)
- Nyoli Valentine
- Prideaux Discipline of Clinical Education, Flinders University, Bedford Park, South Australia, Australia
- Steven J. Durning
- Department of Medicine, Center for Health Professions Education, Uniformed Services University of the Health Sciences, Bethesda, MD, United States
- Lambert Schuwirth
- Prideaux Discipline of Clinical Education, Flinders University, Bedford Park, South Australia, Australia

8. Valentine N, Schuwirth L. Using fairness to reconcile tensions between coaching and assessment. Med Educ 2023; 57:213-216. [PMID: 36346304] [DOI: 10.1111/medu.14968]
Affiliation(s)
- Nyoli Valentine
- Prideaux Discipline of Clinical Education, Flinders University, Bedford Park, South Australia, Australia
- Lambert Schuwirth
- Prideaux Discipline of Clinical Education, Flinders University, Bedford Park, South Australia, Australia

9. Handoyo NE, Claramita M, Keraf MKPA, Ash J, Schuwirth L, Rahayu GR. The importance of developing meaningfulness and manageability for resilience in rural doctors. Med Teach 2023; 45:32-39. [PMID: 36202102] [DOI: 10.1080/0142159x.2022.2128734]
Abstract
INTRODUCTION Retention of rural doctors is a problem in many countries. A previous study identified resilience as a factor associated with longer retention, but deeper study is needed to understand which local and personal factors are at play. Studies suggest resilience can be developed during training. We propose that a better understanding of factors associated with resilience might assist in training students for rural practice and increase retention. AIM This study aimed to understand the differences in resilience development between more and less resilient rural doctors. A secondary purpose was to identify how to assist this developmental process through health professional education. METHODS This study employed a mixed-method design and was part of a more extensive study aiming to develop rural doctors' resilience in a low-resource setting. A prior survey assessed rural doctors' resilience levels, and this study sampled participants with high and low resilience levels for interview. A total of 22 rural doctors participated in individual semi-structured interviews. The interviews were analyzed qualitatively based on Richardson's Resilience Model and the six resilience dimensions, looking for factors that explained high or low resilience. RESULTS Two important themes emerged during the qualitative analysis: 'meaningfulness' and 'manageability.' The differing responses of high- and low-resilience participants can be explained through cases. CONCLUSIONS Participants' responses are determined by their perceived meaningfulness and manageability of the stressor. We suggest that teachers may better build students' resilience by focussing on assisting them in finding meaning and developing a sense of manageability.
Affiliation(s)
- Nicholas E Handoyo
- Faculty of Medicine and Veterinary Medicine, Universitas Nusa Cendana, Kupang, Nusa Tenggara Timur, Indonesia
- Mora Claramita
- Department of Medical, Health Professions Education, and Bioethics, Faculty of Medicine Nursing and Public Health, Universitas Gadjah Mada, Yogyakarta, Indonesia
- Julie Ash
- Prideaux Center for Research in Health Professions Education, Flinders University, Adelaide, South Australia
- Adelaide Rural Clinical School, University of Adelaide, Adelaide, South Australia
- Lambert Schuwirth
- Prideaux Center for Research in Health Professions Education, Flinders University, Adelaide, South Australia
- Gandes R Rahayu
- Department of Medical, Health Professions Education, and Bioethics, Faculty of Medicine Nursing and Public Health, Universitas Gadjah Mada, Yogyakarta, Indonesia

10. Torre D, Schuwirth L, Van der Vleuten C, Heeneman S. An international study on the implementation of programmatic assessment: Understanding challenges and exploring solutions. Med Teach 2022; 44:928-937. [PMID: 35701165] [DOI: 10.1080/0142159x.2022.2083487]
Abstract
INTRODUCTION Programmatic assessment is an approach to assessment aimed at optimizing the learning and decision functions of assessment. It involves a set of key principles and ground rules that are important for its design and implementation. However, despite its intuitive appeal, its implementation remains a challenge. The purpose of this paper is to gain a better understanding of the factors that affect the implementation process of programmatic assessment and how specific implementation challenges are managed across different programs. METHODS An explanatory multiple case (collective) approach was used for this study. We identified six medical programs that had implemented programmatic assessment, varying in health profession discipline, level of education and geographic location. We conducted interviews with a key faculty member from each of the programs and analyzed the data using inductive thematic analysis. RESULTS We identified two major factors in managing the challenges and complexity of the implementation process: knowledge brokers and a strategic opportunistic approach. Knowledge brokers were the people who drove and designed the implementation process, translating evidence into practice and allowing for real-time management of the complex processes of implementation. These knowledge brokers used a 'strategic opportunistic', or agile, approach to recognize new opportunities, secure leadership support, adapt to the context and take advantage of the unexpected. Engaging in an overall curriculum reform process was a critical factor for successful implementation of programmatic assessment. DISCUSSION The study contributes to the understanding of the intricacies of the implementation process of programmatic assessment across different institutions. Managing opportunities, adaptive planning and awareness of context were all critical aspects of thinking strategically and opportunistically in the implementation of programmatic assessment. Future research is needed to provide a more in-depth understanding of the values and beliefs that underpin the assessment culture of an organization, and how such values may affect implementation.
Affiliation(s)
- Dario Torre
- Director of Assessment and Professor of Medicine, University of Central Florida College of Medicine, Orlando, FL, USA
- Lambert Schuwirth
- College of Medicine and Public Health, Flinders University, Adelaide, Australia
- Cees Van der Vleuten
- Department of Educational Development and Research, School of Health Professions Education, Maastricht University, Maastricht, The Netherlands
- Sylvia Heeneman
- Department of Pathology, School of Health Professions Education, Cardiovascular Research Institute Maastricht, Maastricht University, Maastricht, The Netherlands

11.
Abstract
Health professions education has undergone significant changes over the last few decades, including the rise of competency-based medical education, a shift to authentic workplace-based assessments, and increased emphasis on programmes of assessment. Despite these changes, there is still a commonly held assumption that objectivity always leads to, and is the only way to achieve, fairness in assessment. However, there are well-documented limitations to using objectivity as the 'gold standard' against which assessments are judged. Fairness, on the other hand, is a fundamental quality of assessment and a principle that almost no one contests. Taking a step back and changing perspectives to focus on fairness in assessment may help re-set the traditional objective approach and identify an equal role for subjective human judgement alongside objective methods. This paper explores fairness as a fundamental quality of assessment, an approach that legitimises human judgement and shared subjectivity in assessment decisions alongside objective methods. Widening the answer to the question 'What is fair assessment?' to include not only objectivity but also expert human judgement and shared subjectivity can add significant value in ensuring learners are better equipped to be the health professionals required of the 21st century.
Affiliation(s)
- Nyoli Valentine
- Prideaux Discipline of Clinical Education, Flinders University, Bedford Park, Australia
- Steven J Durning
- Center for Health Professions Education, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
- Cees van der Vleuten
- Department of Educational Development and Research, Maastricht University, Maastricht, Netherlands
- Lambert Schuwirth
- Prideaux Discipline of Clinical Education, Flinders University, Bedford Park, Australia

12. Teunissen PW, Atherley A, Cleland JJ, Holmboe E, Hu WCY, Durning SJ, Nishigori H, Samarasekera DD, Schuwirth L, van Schalkwyk S, Maggio LA. Advancing the science of health professions education through a shared understanding of terminology: a content analysis of terms for "faculty". Perspect Med Educ 2022; 11:22-27. [PMID: 34506010] [PMCID: PMC8733114] [DOI: 10.1007/s40037-021-00683-8]
Abstract
INTRODUCTION Health professions educators risk misunderstandings where terms and concepts are not clearly defined, hampering the field's progress. This risk is especially pronounced with ambiguity in describing roles. This study explores the variety of terms used by researchers and educators to describe "faculty", with the aim of facilitating definitional clarity and creating a shared terminology and approach to describing this term. METHODS The authors analyzed journal article abstracts to identify the specific words and phrases used to describe individuals or groups of people referred to as faculty. To identify abstracts, PubMed articles indexed with the Medical Subject Heading "faculty" published between 2007 and 2017 were retrieved. Authors iteratively extracted data and used content analysis to identify patterns and themes. RESULTS A total of 5,436 citations were retrieved, of which 3,354 were deemed eligible. Based on a sample of 594 abstracts (17.7%), we found 279 unique terms. The most commonly used term was faculty or faculty member/s (n = 252; 26.4%); teacher/s (n = 59; 6.2%) and medical educator/s (n = 26; 2.7%) were also well represented. Together, these most common terms accounted for approximately one-third of the sample. Content analysis highlighted that the different descriptors authors used referred to four role types: healthcare (e.g., doctor, physician), education (e.g., educator, teacher), academia (e.g., professor), and/or relationship to the learner (e.g., mentor). DISCUSSION Faculty are described using a wide variety of terms, which can be linked to four role descriptions. The authors propose a template for researchers and educators who want to refer to faculty in their papers. This is important to advance the field and increase readers' assessment of transferability.
Affiliation(s)
- Pim W Teunissen
- Maastricht University Medical Center, Maastricht, The Netherlands.
- School of Health Professions Education (SHE), Faculty of Health Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands.
- Anique Atherley
- School of Health Professions Education (SHE), Faculty of Health Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Western Sydney University, Sydney, Australia
- Jennifer J Cleland
- LKC School of Medicine, Nanyang Technological University, Singapore, Singapore
- University of Aberdeen, Aberdeen, UK
- Eric Holmboe
- Accreditation Council for Graduate Medical Education, Chicago, IL, USA
- Steven J Durning
- Center for Health Professions Education and Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
- Dujeepa D Samarasekera
- Centre for Medical Education (CenMED), National University of Singapore, Singapore, Singapore
- Ministry of Health Singapore, Singapore, Singapore
- Lambert Schuwirth
- Flinders Health and Medical Research Institute, Prideaux at Flinders University, Adelaide, Australia
- Susan van Schalkwyk
- Centre for Health Professions Education and Faculty of Medicine and Health Sciences, Stellenbosch University, Stellenbosch, South Africa
- Lauren A Maggio
- Center for Health Professions Education and Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, MD, USA

13. Cleland J, Gates LJ, Waiter GD, Ho VB, Schuwirth L, Durning S. Even a little sleepiness influences neural activation and clinical reasoning in novices. Health Sci Rep 2021; 4:e406. [PMID: 34761123] [PMCID: PMC8566838] [DOI: 10.1002/hsr2.406]
Abstract
BACKGROUND AND AIMS Sleepiness influences alertness and cognitive functioning and impacts many aspects of medical care, including clinical reasoning. However, dual processing theory suggests that sleepiness will impact clinical reasoning differently in different individuals, depending on their level of experience with the given condition. Our aim, therefore, was to examine the association between clinical reasoning, neuroanatomical activation, and sleepiness in senior medical students. METHODS Our methodology replicated an earlier study but with novices rather than board-certified physicians. Eighteen final-year medical students answered validated multiple-choice questions (MCQs) during an fMRI scan. Each MCQ was projected in three phases: reading, answering, and reflection (modified think aloud). Echo-planar imaging (EPI) scans gave a time series that reflected the blood oxygenation level dependent (BOLD) signal in each location (voxel) within the brain. Sleep data were collected via self-report (Epworth Sleepiness Scale) and actigraphy, and were correlated with answer accuracy using Pearson correlation. RESULTS Analysis revealed an increased BOLD signal in the right dorsomedial prefrontal cortex (P < .05) during reflection (Phase 3) associated with increased self-reported sleepiness (ESS) immediately before scanning. Covariate analysis also revealed that increased BOLD signal in the right supramarginal gyrus (P < .05) during reflection (Phase 3) was associated with increased correct-answer response time. Both patterns indicate effortful analytic (System 2) reasoning. CONCLUSION Our findings that novices use System 2 thinking for clinical reasoning, and that even a little (perceived) sleepiness influences their clinical reasoning ability, suggest that the parameters for safe working may be different for novices (eg, junior doctors) and experienced physicians.
Affiliation(s)
- Jennifer Cleland
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore
- Laura J. Gates
- Institute of Education for Medical and Dental Sciences, University of Aberdeen, Aberdeen, UK
- Gordon D. Waiter
- Aberdeen Biomedical Imaging Centre, University of Aberdeen, Aberdeen, UK
- Vincent B. Ho
- Department of Radiology and Radiological Sciences, Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA
- Lambert Schuwirth
- College of Medicine and Public Health, Flinders University, Adelaide, Australia
- Steven Durning
- Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA

14. Roberts C, Khanna P, Lane AS, Reimann P, Schuwirth L. Exploring complexities in the reform of assessment practice: a critical realist perspective. Adv Health Sci Educ Theory Pract 2021; 26:1641-1657. [PMID: 34431028] [DOI: 10.1007/s10459-021-10065-8]
Abstract
Although the principles behind assessment for and as learning are well-established, there can be a struggle when reforming traditional assessment of learning to a program which encompasses assessment for and as learning. When introducing and reporting reforms, tensions among faculty may arise because of differing beliefs about the relationship between assessment and learning and the rules for the validity of assessments. Traditional systems of assessment of learning privilege objective, structured quantification of learners' performances, and are done to the students. Newer systems of assessment promote assessment for learning, emphasise subjectivity, collate data from multiple sources, emphasise narrative-rich feedback to promote learner agency, and are done with the students. This contrast has implications for implementation and evaluative research. Research on assessment which is done to students typically asks "what works", whereas assessment that is done with the students focuses on more complex questions such as "what works, for whom, in which context, and why?" We applied such a critical realist perspective, drawing on the interplay between structure and agency and a systems approach, to explore what theory says about introducing programmatic assessment in the context of pre-existing traditional approaches. Using a reflective technique, the internal conversation, we developed four factors that can assist educators considering major change to assessment practice in their own contexts. These include enabling positive learner agency and engagement; establishing argument-based validity frameworks; designing purposeful and eclectic evidence-based assessment tasks; and developing a shared narrative that promotes reflexivity in appreciating the complex relationships between assessment and learning.
Affiliation(s)
- Chris Roberts
- Faculty of Medicine and Health, Education Office, Sydney Medical School, The University of Sydney, Sydney, NSW, Australia.
- Priya Khanna
- Faculty of Medicine and Health, Education Office, Sydney Medical School, The University of Sydney, Sydney, NSW, Australia
- Andrew Stuart Lane
- Faculty of Medicine and Health, Education Office, Sydney Medical School, The University of Sydney, Sydney, NSW, Australia
- Peter Reimann
- Centre for Research on Learning and Innovation (CRLI), The University of Sydney, Sydney, NSW, Australia
- Lambert Schuwirth
- Prideaux Discipline of Clinical Education, College of Medicine and Public Health, Flinders University, Adelaide, South Australia, Australia

15. Handoyo NE, Rahayu GR, Claramita M, Keraf MKPA, Octrisdey K, Yuniarti KW, Ash J, Schuwirth L. Developing Personal Resilience Questionnaire for rural doctors: an indigenous approach study in Indonesia. BMC Psychol 2021; 9:158. [PMID: 34654485] [PMCID: PMC8518302] [DOI: 10.1186/s40359-021-00666-8]
Abstract
Background Resilience is recognized as a critical component of well-being and is an essential factor in coping with stress. Using a standardized resilience scale developed for one cultural population in a different cultural population raises validity issues. This study aimed to create a specific scale for measuring doctors' resilience levels in the rural Indonesian context. Method A total of 527 rural doctors and health professional educators participated in this study (37 in the pilot studies and 490 in the survey). An indigenous psychological approach was implemented in the linguistic and cultural adaptation and validation of an existing instrument for the local Indonesian rural health context, combining back-translation, a committee approach, communication with the original author, and an exploratory qualitative study in the local context. The indigenous psychological approach informed the exploration of the local context and the writing of additional local items. Result The final questionnaire consisted of six dimensions and 30 items with good internal consistency (Cronbach’s α ranged 0.809–0.960 for each dimension). Ten locally developed items were added to the final questionnaire as a result of the indigenous psychological approach. Conclusion An indigenous psychological approach may enrich the linguistic and cultural adaptation and validation process of an existing scale.
Affiliation(s)
- Nicholas Edwin Handoyo: Faculty of Medicine, University of Nusa Cendana, Jl. Adi Sucipto, Penfui, Kupang, Nusa Tenggara Timur, Indonesia
- Gandes Retno Rahayu: Department of Medical, Health Professions Education, and Bioethics, Faculty of Medicine Nursing and Public Health, University of Gadjah Mada, Yogyakarta, Indonesia
- Mora Claramita: Department of Medical, Health Professions Education, and Bioethics, Faculty of Medicine Nursing and Public Health, University of Gadjah Mada, Yogyakarta, Indonesia
- Karol Octrisdey: Polytechnic of Health, Kupang, Nusa Tenggara Timur, Indonesia
- Julie Ash: Prideaux Discipline of Clinical Education, Flinders University, Adelaide, SA, Australia
- Lambert Schuwirth: Prideaux Discipline of Clinical Education, Flinders University, Adelaide, SA, Australia
16
Valentine N, Shanahan EM, Durning SJ, Schuwirth L. Making it fair: Learners' and assessors' perspectives of the attributes of fair judgement. Med Educ 2021; 55:1056-1066. [PMID: 34060124] [DOI: 10.1111/medu.14574]
Abstract
INTRODUCTION Optimising the use of subjective human judgement in assessment requires understanding what makes judgement fair. Whilst fairness cannot be simplistically defined, the underpinnings of fair judgement in the literature have previously been combined into a theoretically constructed conceptual model. However, understanding assessors' and learners' perceptions of fair human judgement is also necessary. The aim of this study was to explore assessors' and learners' perceptions of fair human judgement and to compare these with the conceptual model. METHODS A thematic analysis approach was used. A purposive sample of twelve assessors and eight postgraduate trainees undertook semi-structured interviews using vignettes. Themes were identified through constant comparison. Collection, analysis and coding of the data occurred simultaneously in an iterative manner until saturation was reached. RESULTS This study supported the literature-derived conceptual model, suggesting fairness is a multi-dimensional construct with components at individual, system and environmental levels. At an individual level, contextual, longitudinally collected evidence, supported by narrative and falling within ill-defined boundaries, is essential for fair judgement. Assessor agility and expertise are needed to interpret and interrogate evidence, identify boundaries and provide narrative feedback to allow for improvement. At a system level, factors such as multiple opportunities to demonstrate competence and improvement, multiple assessors to allow different perspectives to be triangulated, and documentation are needed for fair judgement. These system features can be optimised through procedural fairness. Finally, appropriate learning and working environments that consider patient needs and learners' personal circumstances are needed for fair judgements.
DISCUSSION This study builds on the theory-derived conceptual model, demonstrating that the components of fair judgement can be explicitly articulated whilst embracing the complexity and contextual nature of health professions assessment. It thus provides a narrative to support dialogue between learners, assessors and institutions about ensuring fair judgements in assessment.
Affiliation(s)
- Nyoli Valentine: Prideaux Discipline of Clinical Education, Flinders University, SA, Australia
- Steven J Durning: Center for Health Professions Education, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
- Lambert Schuwirth: Prideaux Discipline of Clinical Education, Flinders University, SA, Australia
17
Prentice S, Kirkpatrick E, Schuwirth L, Benson J. Identifying the at-risk General Practice trainee: a retrospective cohort meta-analysis of General Practice registrar flagging. Adv Health Sci Educ Theory Pract 2021; 26:1001-1025. [PMID: 33587217] [DOI: 10.1007/s10459-021-10031-4]
Abstract
A central principle of programmatic assessment is that the final decision is not a surprise to the learner. To achieve this, assessments must demonstrate predictive and consequential validity; to date, however, research has focussed only on the former. The present study addresses this gap by examining the predictive and consequential validity of the flagging systems used by Australian General Practice regional training organisations (RTOs) in relation to Fellowship examinations. Informed by unstructured interviews with senior medical educators to understand each RTO's flagging system, meta-analyses of routinely collected flagging data were used to examine the predictive validity of flagging at various points in training for exam performance. Flagging system features identified from the interviews also informed exploratory subgroup analyses and meta-regressions to further assess the predictive and consequential validity of these systems. Registrars flagged near the end of their training were two to four times more likely to fail Fellowship exams than their non-flagged counterparts. Regarding flagging system features, graded (i.e. ordinal) flagging systems were associated with higher accuracy, whilst involving the assigned medical educator in remediation and initiating a formal diagnostic procedure following a flag improved registrars' chances of passing exams. These results demonstrate both the predictive and the consequential validity of flagging systems. We argue that flagging is most effective when initiated early in training in conjunction with mechanisms to maximise diagnostic accuracy and the quality of remediation programs.
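The pooled effect described above ("two to four times more likely to fail") is the kind of summary an inverse-variance meta-analysis of cohort data produces. The sketch below pools log risk ratios across hypothetical RTO cohorts; the counts and the fixed-effect pooling choice are illustrative assumptions, not the study's actual data or model:

```python
# Sketch: fixed-effect, inverse-variance pooling of log risk ratios,
# as a cohort meta-analysis might do. All counts are invented.
import math

def log_rr_and_var(events_a, n_a, events_b, n_b):
    """Log risk ratio (group A vs B) and its approximate variance."""
    rr = (events_a / n_a) / (events_b / n_b)
    var = 1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b
    return math.log(rr), var

# (failures among flagged, flagged n, failures among non-flagged, non-flagged n)
studies = [(12, 40, 15, 160), (8, 25, 20, 200), (5, 30, 18, 150)]

num = den = 0.0
for ea, na, eb, nb in studies:
    lrr, v = log_rr_and_var(ea, na, eb, nb)
    w = 1 / v              # inverse-variance weight
    num += w * lrr
    den += w

pooled_rr = math.exp(num / den)  # pooled risk ratio across cohorts
print(round(pooled_rr, 2))
```

With these invented counts the pooled risk ratio lands between 2 and 3, i.e. within the two-to-fourfold range the abstract reports.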
Affiliation(s)
- Shaun Prentice: GPEx Ltd., 132 Greenhill Road, Unley, South Australia, 5061, Australia
- Emily Kirkpatrick: GPEx Ltd., 132 Greenhill Road, Unley, South Australia, 5061, Australia
- Lambert Schuwirth: Discipline of Clinical Education, Flinders University, Bedford Park, Australia
- Jill Benson: GPEx Ltd., 132 Greenhill Road, Unley, South Australia, 5061, Australia; Discipline of Clinical Education, Flinders University, Bedford Park, Australia
18
King S, Damarell R, Schuwirth L, Vakulin A, Chai-Coetzer CL, McEvoy RD. Knowledge to action: a scoping review of approaches to educate primary care providers in the identification and management of routine sleep disorders. J Clin Sleep Med 2021; 17:2307-2324. [PMID: 33983109] [DOI: 10.5664/jcsm.9374]
Abstract
STUDY OBJECTIVES The referral burden on healthcare systems for routine sleep disorders could be alleviated by educating primary care providers (PCPs) to diagnose and manage patients with sleep health issues. This requires effective professional education strategies and resources. This scoping review examined the literature on existing approaches to educate PCPs in sleep health management. METHODS A comprehensive literature search was conducted across eight databases to identify citations describing the education of PCPs in diagnosing and managing sleep disorders, specifically insomnia and sleep apnea. A conceptual framework, developed from the knowledge-to-action cycle, was used to analyze citations from a knowledge translation perspective. RESULTS Searches identified 616 unique citations and after selection criteria were applied, 22 reports were included. Reports spanning 38 years were analyzed using components of the knowledge-to-action cycle to understand how educational interventions were designed, developed, implemented, and evaluated. Interventions involved didactic (32%), active (18%) and blended (41%) approaches, using face-to-face (27%), technology-mediated (45%) and multimodal (5%) delivery. Educational effectiveness was assessed in 73% of reports, most commonly using a pre/post questionnaire (41%). CONCLUSIONS While this scoping review has utility in describing existing educational interventions to upskill PCPs to diagnose and manage sleep disorders, the findings suggest that interventions are often developed without explicitly considering the evidence of best educational practice. Future interventional designs may achieve greater sustained effectiveness by considering characteristics of the target audience, the pedagogical approaches best suited to its needs, and any environmental drivers and barriers that might impede the translation of evidence into practice.
Affiliation(s)
- Svetlana King: Prideaux Centre for Research in Health Professions Education, College of Medicine and Public Health, Flinders University
- Raechel Damarell: College of Nursing and Health Sciences, Flinders University, Adelaide, Australia
- Lambert Schuwirth: Prideaux Centre for Research in Health Professions Education, College of Medicine and Public Health, Flinders University
- Andrew Vakulin: Flinders Health and Medical Research Institute Sleep Health / Adelaide Institute of Sleep Health, College of Medicine and Public Health, Flinders University, Adelaide, Australia; National Centre for Sleep Health Services Research: A NHMRC Centre of Research Excellence
- Ching Li Chai-Coetzer: Flinders Health and Medical Research Institute Sleep Health / Adelaide Institute of Sleep Health, College of Medicine and Public Health, Flinders University, Adelaide, Australia; National Centre for Sleep Health Services Research: A NHMRC Centre of Research Excellence
- R Doug McEvoy: Flinders Health and Medical Research Institute Sleep Health / Adelaide Institute of Sleep Health, College of Medicine and Public Health, Flinders University, Adelaide, Australia; National Centre for Sleep Health Services Research: A NHMRC Centre of Research Excellence
19
Valentine N, Durning S, Shanahan EM, Schuwirth L. Fairness in human judgement in assessment: a hermeneutic literature review and conceptual framework. Adv Health Sci Educ Theory Pract 2021; 26:713-738. [PMID: 33123837] [DOI: 10.1007/s10459-020-10002-1]
Abstract
Human judgement is widely used in workplace-based assessment despite criticism that it does not meet standards of objectivity. There is an ongoing push within the literature to embrace subjective human judgement in assessment not as a 'problem' to be corrected psychometrically but as a legitimate perception of performance. Taking a step back and focusing on the fundamental underlying value of fairness in assessment may help reset the traditional objective approach and provide a more relevant way to determine the appropriateness of subjective human judgements. Asking what makes human judgement in assessment 'fair', rather than what makes it 'objective', allows many different perspectives to be embraced and legitimises human judgement in assessment. However, this requires addressing the question: what makes human judgements fair in health professions assessment? This is not a straightforward question with a single, unambiguously 'correct' answer. In this hermeneutic literature review we aimed to produce a scholarly knowledge synthesis and understanding of the factors, definitions and key questions associated with fairness in human judgement in assessment, and a resulting conceptual framework, with a view to informing further research. The complex construct of fair human judgement can be conceptualised through values (credibility, fitness for purpose, transparency and defensibility) which are upheld at an individual level by characteristics of fair human judgement (narrative, boundaries, expertise, agility and evidence) and at a systems level by procedures (procedural fairness, documentation, multiple opportunities, multiple assessors, validity evidence) which translate fairness in human judgement from concept into practice.
Affiliation(s)
- Nyoli Valentine: Prideaux Health Professions Education, Flinders University, Bedford Park 5042, SA, Australia
- Steven Durning: Center for Health Professions Education, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
- Ernst Michael Shanahan: Prideaux Health Professions Education, Flinders University, Bedford Park 5042, SA, Australia
- Lambert Schuwirth: Prideaux Health Professions Education, Flinders University, Bedford Park 5042, SA, Australia
20
Prentice S, Benson J, Kirkpatrick E, Schuwirth L. Workplace-based assessments in postgraduate medical education: A hermeneutic review. Med Educ 2020; 54:981-992. [PMID: 32403200] [DOI: 10.1111/medu.14221]
Abstract
OBJECTIVES Since their introduction, workplace-based assessments (WBAs) have proliferated throughout postgraduate medical education. Previous reviews have identified mixed findings regarding WBAs' effectiveness, but have not considered the importance of user-tool-context interactions. The present review was conducted to address this gap by generating a thematic overview of factors important to the acceptability, effectiveness and utility of WBAs in postgraduate medical education. METHOD This review utilised a hermeneutic cycle for analysis of the literature. Four databases were searched to identify articles pertaining to WBAs in postgraduate medical education from the United Kingdom, Canada, Australia, New Zealand, the Netherlands and Scandinavian countries. Over the course of three rounds, 30 published articles were thematically analysed in an iterative fashion to deeply engage with the literature in order to answer three scoping questions concerning acceptability, effectiveness and assessment training. As each round was coded, themes were refined and questions added until saturation was reached. RESULTS Stakeholders value WBAs for permitting assessment of trainees' performance in an authentic context. Negative perceptions of WBAs stem from misuse due to low assessment literacy, disagreement with definitions and frameworks, and inadequate summative use of WBAs. Effectiveness is influenced by user (eg, engagement and assessment literacy) and tool attributes (eg, definitions and scales), but most fundamentally by user-tool-context interactions, particularly trainee-assessor relationships. Assessors' assessment literacy must be combined with cultural and administrative factors in organisations and the broader medical discipline. CONCLUSIONS The pivotal determinants of WBAs' effectiveness and utility are the user-tool-context interactions. 
From the identified themes, we present 12 lessons learned regarding users, tools and contexts to maximise WBA utility, including the separation of formative and summative WBA assessors, use of maximally useful scales, and instituting measures to reduce competitive demands.
Affiliation(s)
- Shaun Prentice: GPEx Ltd., Adelaide, South Australia, Australia; School of Psychology, University of Adelaide, Adelaide, South Australia, Australia
- Jill Benson: GPEx Ltd., Adelaide, South Australia, Australia; Health in Human Diversity Unit, School of Medicine, University of Adelaide, Adelaide, South Australia, Australia; Prideaux Centre, Flinders University, Adelaide, South Australia, Australia
- Emily Kirkpatrick: GPEx Ltd., Adelaide, South Australia, Australia; School of Medicine, University of Adelaide, Adelaide, South Australia, Australia
- Lambert Schuwirth: Prideaux Centre, Flinders University, Adelaide, South Australia, Australia; Maastricht University, Maastricht, the Netherlands; Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA
21
Abstract
Numerous and substantial challenges exist in the provision of safe, cost-effective, and efficient health care. The prevalence and consequences of diagnostic error, one of these challenges, have been established by the literature; however, these errors persist, and the pace of improvement has been slow. One potential reason for the lack of needed progress is that addressing delayed and wrong diagnoses will require contributions from 2 currently distinct worlds: clinical reasoning and diagnostic error. In this Invited Commentary, the authors argue for merging the diagnostic error and clinical reasoning fields as the perspectives, frameworks, and methodologies of these 2 fields could be leveraged to yield a more aligned approach to understanding and subsequently to mitigating diagnostic error. The authors focus on the problem of diagnostic labeling (a categorization task where one has to choose the correct label or diagnosis). The authors elaborate on why this alignment could help guide health care improvement efforts, using the vexing problem of context specificity that leads to unwanted variance in health care as an example.
Affiliation(s)
- Steven J Durning: professor and director, Graduate Programs in Health Professions Education, Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland; ORCID: https://orcid.org/0000-0001-5223-1597
- Robert L Trowbridge: associate professor of medicine, Tufts University School of Medicine, Portland, Maine; ORCID: https://orcid.org/0000-0002-0460-2733
- Lambert Schuwirth: strategic professor in medical education, Flinders University, Adelaide, South Australia, Australia; ORCID: https://orcid.org/0000-0002-6279-5158
22
Michael Shanahan E, van der Vleuten C, Schuwirth L. Conflict between clinician teachers and their students: the clinician perspective. Adv Health Sci Educ Theory Pract 2020; 25:401-414. [PMID: 31641944] [DOI: 10.1007/s10459-019-09933-1]
Abstract
The relationship between clinician teachers and their students is of major importance in medical education. However, little is known about the effects on clinicians when conflict occurs with their students. What do clinicians perceive to be the major causes of these conflicts? How do they react when conflict occurs, and afterwards? A phenomenological inquiry exploring the lived experience of 12 clinician teachers in medical schools was performed. The clinicians were selected using purposeful sampling and snowballing techniques. The interviews revolved around discussions of episodes of conflict with medical students that the clinicians considered significant. The analysis and emergent themes were partially constructed around, and informed by, theories of conflict and conflict management. A number of themes emerged from this study. Clinicians experienced that significant psychological and behavioural problems of students had a dominant impact on the likelihood and severity of conflict; these conflicts had a significant emotional impact on clinicians; and, though responses to conflict varied, "avoidance" was a mechanism commonly used by clinicians, making the assessment of attitudinal and behavioural professional issues in the workplace problematic. This study shows how clinicians' perspectives on challenging student-clinician encounters affect the quality of education they are able to provide. We recommend that medical schools consider these issues when designing their programs in order to develop and maintain clinician-teacher engagement and participation.
Affiliation(s)
- Ernst Michael Shanahan: Flinders University and the South Australian Health Service, Bedford Park, SA, 5042, Australia
- Cees van der Vleuten: Department of Educational Development, Maastricht University, PO Box 616, 6200 MD, Maastricht, The Netherlands
- Lambert Schuwirth: Flinders University and the South Australian Health Service, Bedford Park, SA, 5042, Australia
23
Young ME, Thomas A, Lubarsky S, Gordon D, Gruppen LD, Rencic J, Ballard T, Holmboe E, Da Silva A, Ratcliffe T, Schuwirth L, Dory V, Durning SJ. Mapping clinical reasoning literature across the health professions: a scoping review. BMC Med Educ 2020; 20:107. [PMID: 32264895] [PMCID: PMC7140328] [DOI: 10.1186/s12909-020-02012-9]
Abstract
BACKGROUND Clinical reasoning is at the core of health professionals' practice. A mapping of what constitutes clinical reasoning could support the teaching, development, and assessment of clinical reasoning across the health professions. METHODS We conducted a scoping study to map the literature on clinical reasoning across the health professions in the context of a larger Best Evidence Medical Education (BEME) review on clinical reasoning assessment. Seven databases were searched using subheadings and terms relating to clinical reasoning, assessment, and the health professions. Data analysis focused on a comprehensive analysis of bibliometric characteristics and the varied terminology used to refer to clinical reasoning. RESULTS The search identified 625 papers spanning 47 years (1968-2014), in 155 journals, from 544 first authors, across 18 health professions. Thirty-seven percent of papers used the term clinical reasoning, and 110 other terms referring to the concept of clinical reasoning were identified. Consensus on the categorization of terms was reached for 65 terms across six categories: reasoning skills, reasoning performance, reasoning process, outcome of reasoning, context of reasoning, and purpose/goal of reasoning. The categories of terminology used differed across health professions and publication types. DISCUSSION Many diverse terms were present and were used differently across literature contexts. These terms likely reflect different operationalizations, or conceptualizations, of clinical reasoning, as well as the complex, multi-dimensional nature of this concept. We advise authors to make the intended meaning of 'clinical reasoning' and associated terms in their work explicit in order to facilitate teaching, assessment, and research communication.
Affiliation(s)
- Meredith E. Young: Institute of Health Sciences Education in the Faculty of Medicine, McGill University, Room 200 Lady Meredith House, 1110 Pine Avenue West, Montreal, QC H3A 1A3, Canada
- Aliki Thomas: School of Physical and Occupational Therapy, Institute of Health Sciences Education in the Faculty of Medicine at McGill University, Centre for Interdisciplinary Research in Rehabilitation of Greater Montreal, Montréal, Canada
- Stuart Lubarsky: Department of Neurology and Institute of Health Sciences Education, McGill University, Montreal, Canada
- David Gordon: Division of Emergency Medicine and the Department of Surgery, Duke University School of Medicine, Durham, North Carolina, USA
- Larry D. Gruppen: Department of Learning Health Sciences, University of Michigan Medical School, Ann Arbor, USA
- Joseph Rencic: Division of General Internal Medicine at Tufts Medical Center, Tufts University School of Medicine, Boston, MA, USA
- Eric Holmboe: Chief Research, Milestone Development, and Evaluation Officer, ACGME, Chicago, IL, USA; Feinberg School of Medicine of Northwestern University, Chicago, IL, USA
- Temple Ratcliffe: Department of Medicine, University of Texas Health Science Center, San Antonio, TX, USA
- Lambert Schuwirth: Flinders University, Adelaide, Australia; Maastricht University, Maastricht, Netherlands; Chang Gung University, Taoyuan City, Taiwan; Uniformed Services University of the Health Sciences, Bethesda, USA
- Valérie Dory: Department of Medicine, an Assessment Specialist for undergraduate medical education in the Faculty of Medicine, McGill University, Montreal, Canada
- Steven J. Durning: Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
24
Young M, Thomas A, Gordon D, Gruppen L, Lubarsky S, Rencic J, Ballard T, Holmboe E, Da Silva A, Ratcliffe T, Schuwirth L, Durning SJ. The terminology of clinical reasoning in health professions education: Implications and considerations. Med Teach 2019; 41:1277-1284. [PMID: 31314612] [DOI: 10.1080/0142159x.2019.1635686]
Abstract
Introduction: Clinical reasoning is considered to be at the core of health practice. Here, we report on the diversity and inferred meanings of the terms used to refer to clinical reasoning and consider implications for teaching and assessment. Methods: In the context of a Best Evidence Medical Education (BEME) review of 625 papers drawn from 18 health professions, we identified 110 terms for clinical reasoning. We focus on iterative categorization of these terms across three phases of coding and considerations for how terminology influences educational practices. Results: Following iterative coding with 5 team members, consensus was possible for 74, majority coding was possible for 16, and full team disagreement existed for 20 terms. Categories of terms included: purpose/goal of reasoning, outcome of reasoning, reasoning performance, reasoning processes, reasoning skills, and context of reasoning. Discussion: Findings suggest that terms used in reference to clinical reasoning are non-synonymous, not uniformly understood, and the level of agreement differed across terms. If the language we use to describe, to teach, or to assess clinical reasoning is not similarly understood across clinical teachers, program directors, and learners, this could lead to confusion regarding what the educational or assessment targets are for "clinical reasoning."
Affiliation(s)
- Meredith Young: Department of Medicine, McGill University, Montreal, Canada; Institute for Health Sciences Education, McGill University, Montreal, Canada
- Aliki Thomas: Department of Medicine, McGill University, Montreal, Canada; School of Physical and Occupational Therapy, McGill University, Montreal, Canada; Centre for Interdisciplinary Research in Rehabilitation of Greater Montreal, Montreal, Canada
- David Gordon: Division of Emergency Medicine, Department of Surgery, Duke University School of Medicine, Durham, NC, USA
- Larry Gruppen: Department of Medical Education, University of Michigan, Ann Arbor, MI, USA
- Stuart Lubarsky: Institute for Health Sciences Education, McGill University, Montreal, Canada; Department of Neurology, McGill University, Montreal, Canada
- Joseph Rencic: School of Medicine, Tufts University, Boston, MA, USA; Division of General Internal Medicine, Tufts Medical Center, Boston, MA, USA
- Tiffany Ballard: Department of Medical Education, University of Michigan, Ann Arbor, MI, USA
- Eric Holmboe: Accreditation Council for Graduate Medical Education, Chicago, IL, USA; Department of Medicine, Yale School of Medicine, Yale University, New Haven, CT, USA; Feinberg School of Medicine, Northwestern University, Chicago, IL, USA
- Ana Da Silva: Swansea University Medical School, Swansea University, Swansea, UK
- Temple Ratcliffe: Department of Medicine, University of Texas Health Science Center, San Antonio, TX, USA
- Lambert Schuwirth: Flinders University Prideaux Centre for Research in Health Professions Education, Adelaide, Australia; Department of Educational Development and Research, Maastricht University, Maastricht, the Netherlands; Medical Education Research Centre, Chang Gung University, Taoyuan City, Taiwan; Uniformed Services University of the Health Sciences, Bethesda, MD, USA
- Steven J Durning: Uniformed Services University of the Health Sciences, Bethesda, MD, USA
25
Solhjoo S, Haigney MC, McBee E, van Merrienboer JJG, Schuwirth L, Artino AR, Battista A, Ratcliffe TA, Lee HD, Durning SJ. Heart Rate and Heart Rate Variability Correlate with Clinical Reasoning Performance and Self-Reported Measures of Cognitive Load. Sci Rep 2019; 9:14668. [PMID: 31604964] [PMCID: PMC6789096] [DOI: 10.1038/s41598-019-50280-3]
Abstract
Cognitive load is a key mediator of cognitive processing that may impact clinical reasoning performance. The purpose of this study was to gather biologic validity evidence for correlates of different types of self-reported cognitive load, and to explore the association of self-reported cognitive load and physiologic measures with clinical reasoning performance. We hypothesized that increased cognitive load would manifest evidence of elevated sympathetic tone and would be associated with lower clinical reasoning performance scores. Fifteen medical students wore Holter monitors and watched three videos depicting medical encounters before completing a post-encounter form and standard measures of cognitive load. Correlation analysis was used to investigate the relationship between cardiac measures (mean heart rate, heart rate variability and QT interval variability) and self-reported measures of cognitive load, and their association with clinical reasoning performance scores. Despite the low number of participants, strong positive correlations were found between measures of intrinsic cognitive load and heart rate variability. Performance was negatively correlated with mean heart rate, as well as single-item cognitive load measures. Our data signify a possible role for using physiologic monitoring for identifying individuals experiencing high cognitive load and those at risk for performing poorly during clinical reasoning tasks.
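The correlation analysis this abstract describes can be illustrated in a few lines. The sketch below computes Pearson's r between a hypothetical per-participant HRV measure and self-reported load ratings; all values are invented for illustration and are not the study's data:

```python
# Sketch of the kind of correlation analysis described: Pearson's r between
# a cardiac measure and a self-reported cognitive load rating. Data invented.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-participant values: an HRV measure (ms) and a 1-9 load rating.
hrv = [22, 35, 28, 41, 30, 25, 38]
load = [4, 7, 5, 8, 6, 4, 7]
r = pearson_r(hrv, load)
print(round(r, 2))  # → 0.98
```

A strong positive r, as in this toy example, is the pattern the study reports between heart rate variability and intrinsic cognitive load; with n = 15 participants, such coefficients carry wide confidence intervals and warrant cautious interpretation.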
Affiliation(s)
- Soroosh Solhjoo: Division of Cardiovascular Pathology, Johns Hopkins University School of Medicine, Baltimore, USA
- Mark C Haigney: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University of the Health Sciences, Bethesda, USA
- Elexis McBee: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University of the Health Sciences, Naval Medical Center, San Diego, USA
- Lambert Schuwirth: Prideaux Centre for Research in Health Professions Education, Flinders University, Bedford Park, Australia
- Anthony R Artino: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University of the Health Sciences, Bethesda, USA
- Alexis Battista: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University of the Health Sciences, Bethesda, USA
- Temple A Ratcliffe: Department of Medicine, University of Texas Health Science Center, San Antonio, USA
- Howard D Lee: San Antonio Uniformed Services Health Education Consortium, San Antonio, USA
- Steven J Durning: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University of the Health Sciences, Bethesda, USA
26
Abstract
Medical education research and development's reason for existence is its contribution to producing better doctors. Arguably, this is a notion that nobody would disagree with. But answering whether it actually delivers on that contribution is not as straightforward as it may look. In this paper we describe six complexities that impact such research and, unfortunately, contribute to the difficulties surrounding the translation of medical education knowledge into practice.
Affiliation(s)
- Lambert Schuwirth: Prideaux Centre for Health Professions Education Research, Flinders University, Adelaide, Australia
- Cees van der Vleuten: Education Development and Research, Maastricht University, Maastricht, Netherlands
27
McBee E, Blum C, Ratcliffe T, Schuwirth L, Polston E, Artino AR, Durning SJ. Use of clinical reasoning tasks by medical students. Diagnosis (Berl) 2019; 6:127-135. [PMID: 30851156] [DOI: 10.1515/dx-2018-0077]
Abstract
Background A framework of clinical reasoning tasks used by physicians during clinical encounters was previously developed, proposing that clinical reasoning is a complex process composed of 26 possible tasks. The aim of this paper was to analyze the verbalized clinical reasoning processes of medical students utilizing commonly encountered internal medicine cases. Methods In this mixed-methods study, participants viewed three video-recorded clinical encounters. After each encounter, participants completed a think-aloud protocol. The qualitative data from the transcribed think-aloud transcripts were analyzed by two investigators using a constant comparative approach. The type, frequency, and pattern of codes used were analyzed. Results Seventeen third- and fourth-year medical students participated. They used 15 reasoning tasks across all cases. The average number of tasks used in cases 1, 2, and 3 was 5.6 (range 3-8), 5.9 (range 4-8), and 5.3 (range 3-10), respectively. The order in which medical students verbalized reasoning tasks varied and appeared purposeful but non-sequential. Conclusions Consistent with prior research in residents, participants progressed through the encounter in a purposeful but non-sequential fashion. Reasoning tasks related to framing the encounter and diagnosis were not used in succession but interchangeably. This suggests that teaching successful clinical reasoning may involve encouraging or demonstrating multiple pathways through a problem. Further research exploring the association between use of clinical reasoning tasks and clinical reasoning accuracy could enhance the medical community's understanding of variance in clinical reasoning.
Affiliation(s)
- Elexis McBee: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, Naval Medical Center San Diego, 34800 Bob Wilson Drive, San Diego, CA 92134, USA
- Christina Blum: Department of Medicine, Naval Hospital Camp Pendleton, Oceanside, CA, USA
- Temple Ratcliffe: Department of Medicine, University of Texas Health Science Center at San Antonio, San Antonio, TX, USA
- Lambert Schuwirth: School of Medicine, Flinders University, Adelaide, South Australia, Australia
- Elizabeth Polston: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, Bethesda, MD, USA
- Anthony R Artino: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, Bethesda, MD, USA
- Steven J Durning: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, Bethesda, MD, USA
28
Prakash S, Sladek RM, Schuwirth L. Interventions to improve diagnostic decision making: A systematic review and meta-analysis on reflective strategies. Med Teach 2019; 41:517-524. [PMID: 30244625] [DOI: 10.1080/0142159x.2018.1497786]
Abstract
Aims: To identify, appraise and describe studies of cognitive interventions to improve diagnostic decision making (DDM) amongst medical professionals, assess their effectiveness and identify methodological limitations in existing studies. Methods: We systematically searched for studies (publication date 2000-2016) in multiple databases including Cochrane Controlled Trials, EMBASE, ERIC, Medline, PubMed and PsycINFO, and used additional strategies such as hand searching and snowballing. Included studies evaluated cognitive interventions to enhance DDM amongst medical professionals, using defined outcomes such as diagnostic accuracy. A meta-analysis assessed the impact of "reflection". Results: Forty-four studies out of 10,114 screened citations, involving 4380 medical professionals, were included. Studies evaluated reasoning workshops/curricula, de-biasing workshops, checklists, reflection, feedback, and instructions to induce analytical thinking. Guided reflection was demonstrated to improve DDM [effect size 0.38 (95% CI 0.23-0.52), p < 0.001]. Immediate feedback and modeling reflection using contrasting examples also appeared to improve diagnostic accuracy; however, underlying methodological issues prevented a quantitative assessment of any strategies other than reflection. Conclusions: Educational interventions incorporating practising deliberate reflection on a formulated diagnosis, modeled reflection on contrasting examples and immediate feedback are promising strategies for improving DDM. The effectiveness of other strategies is unknown, with more methodological refinements required in future research.
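As one illustration of how a pooled effect size with a confidence interval of this form is obtained, here is a generic fixed-effect inverse-variance pooling sketch. The per-study effect sizes and standard errors below are invented, and the review's actual pooling model is not specified in this abstract:

```python
import numpy as np

# Invented per-study standardized mean differences and standard errors;
# illustrative only, not the review's data.
d  = np.array([0.45, 0.30, 0.52, 0.28])
se = np.array([0.15, 0.12, 0.20, 0.10])

# Fixed-effect inverse-variance pooling: weight each study by 1/SE^2.
w = 1.0 / se**2
pooled = np.sum(w * d) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled d = {pooled:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

A random-effects model would additionally estimate between-study variance and widen the interval; which model fits depends on the heterogeneity the reviewers observed.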
Affiliation(s)
- Shivesh Prakash: College of Medicine and Public Health, Flinders University, Bedford Park, Australia; Prideaux Centre for Research in Health Professions Education, Flinders University, Bedford Park, Australia; Intensive Care Specialist, Southern Adelaide Health Network, Bedford Park, Australia
- Ruth M Sladek: College of Medicine and Public Health, Flinders University, Bedford Park, Australia; Prideaux Centre for Research in Health Professions Education, Flinders University, Bedford Park, Australia
- Lambert Schuwirth: Prideaux Centre for Research in Health Professions Education, Flinders University, Bedford Park, Australia
29
Abstract
INTRODUCTION Modern assessment in medical education is increasingly reliant on human judgement, as it is clear that quantitative scales have limitations in fully assessing registrars' development of competence and providing them with meaningful feedback to assist learning. For this, possession of an expert vocabulary is essential. AIM This study aims to explore how medical education experts voice their subjective judgements about learners and to what extent they use clear, information-rich terminology (high-level semantic qualifiers); and to gain a better understanding of the experts' language used in these subjective judgements. METHODS Six experienced medical educators from urban and rural environments were purposefully selected. Each educator reviewed a registrar clinical case analysis in a think-aloud manner. The transcribed data were analyzed; codes were identified and ordered into themes. Analysis continued until saturation was reached. RESULTS Five themes with subthemes emerged. The main themes were: (1) Demonstration of expertise; (2) Personal credibility; (3) Professional credibility; (4) Using a predefined structure; and (5) Relevance. DISCUSSION Analogous to what experienced clinicians do in clinical reasoning, experienced medical educators verbalize their judgements using high-level semantic qualifiers. In this study, we were able to unpack these. Although there may be individual variability in the exact words used, clear themes emerged. These findings can be used to develop a helpful shared narrative for educators in observation-based assessment. The provision of a rich, detailed narrative will also assist in providing clarity to registrar feedback, with areas of weakness clearly articulated to improve learning and remediation.
30
McBee E, Ratcliffe T, Schuwirth L, O'Neill D, Meyer H, Madden SJ, Durning SJ. Context and clinical reasoning: Understanding the medical student perspective. Perspect Med Educ 2018; 7:256-263. [PMID: 29704167] [PMCID: PMC6086813] [DOI: 10.1007/s40037-018-0417-x]
Abstract
INTRODUCTION Studies have shown that a physician's clinical reasoning performance can be influenced by contextual factors. We explored how the clinical reasoning performance of medical students was impacted by contextual factors in order to expand upon previous findings in resident and board-certified physicians. Using situated cognition as the theoretical framework, our aim was to evaluate the verbalized clinical reasoning processes of medical students in order to describe what impact the presence of contextual factors has on their reasoning performance. METHODS Seventeen medical student participants viewed three video recordings of clinical encounters portraying straightforward diagnostic cases in internal medicine with explicit contextual factors inserted. Participants completed a computerized post-encounter form as well as a think-aloud protocol. Three authors analyzed verbatim transcripts from the think-aloud protocols using a constant comparative approach. After iterative coding, utterances were analyzed and grouped into categories and themes. RESULTS Six categories and ten associated themes emerged, which demonstrated overlap with findings from previous studies in resident and attending physicians. Four overlapping categories included emotional disturbances, behavioural inferences about the patient, doctor-patient relationship, and difficulty with closure. Two new categories emerged: anchoring and misinterpretation of data. DISCUSSION The presence of contextual factors appeared to impact clinical reasoning performance in medical students. The data suggest that a contextual factor can be innate to the clinical scenario, consistent with situated cognition theory. These findings build upon our understanding of clinical reasoning performance from both a theoretical and practical perspective.
Affiliation(s)
- Elexis McBee: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, Naval Medical Center San Diego, San Diego, CA, USA
- Temple Ratcliffe: Department of Medicine, University of Texas Health Science Center at San Antonio, San Antonio, TX, USA
- Lambert Schuwirth: School of Medicine, Flinders University, Adelaide, South Australia, Australia
- Daniel O'Neill: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, Bethesda, MD, USA
- Holly Meyer: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, Bethesda, MD, USA
- Shelby J Madden: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, Bethesda, MD, USA
- Steven J Durning: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, Bethesda, MD, USA
31
Young M, Thomas A, Lubarsky S, Ballard T, Gordon D, Gruppen LD, Holmboe E, Ratcliffe T, Rencic J, Schuwirth L, Durning SJ. Drawing Boundaries: The Difficulty in Defining Clinical Reasoning. Acad Med 2018; 93:990-995. [PMID: 29369086] [DOI: 10.1097/acm.0000000000002142]
Abstract
Clinical reasoning is an essential component of a health professional's practice. Yet clinical reasoning research has produced a notably fragmented body of literature. In this article, the authors describe the pause-and-reflect exercise they undertook during the execution of a synthesis of the literature on clinical reasoning in the health professions. Confronted with the challenge of establishing a shared understanding of the nature and relevant components of clinical reasoning, members of the review team paused to independently generate their own personal definitions and conceptualizations of the construct. Here, the authors describe the variability of definitions and conceptualizations of clinical reasoning present within their own team. Drawing on an analogy from mathematics, they hypothesize that the presence of differing "boundary conditions" could help explain individuals' differing conceptualizations of clinical reasoning and the fragmentation at play in the wider sphere of research on clinical reasoning. Specifically, boundary conditions refer to the practice of describing the conditions under which a given theory is expected to hold, or expected to have explanatory power. Given multiple theoretical frameworks, research methodologies, and assessment approaches contained within the clinical reasoning literature, different boundary conditions are likely at play. Open acknowledgment of different boundary conditions and explicit description of the conceptualization of clinical reasoning being adopted within a given study would improve research communication, support comprehensive approaches to teaching and assessing clinical reasoning, and perhaps encourage new collaborative partnerships among researchers who adopt different boundary conditions.
Affiliation(s)
- Meredith Young
- M. Young is assistant professor, Department of Medicine, and research scientist, Centre for Medical Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada. A. Thomas is assistant professor, School of Physical and Occupational Therapy, and research scientist, Centre for Medical Education, Faculty of Medicine, McGill University; and researcher, Centre for Interdisciplinary Research in Rehabilitation of Greater Montreal, Montreal, Quebec, Canada. S. Lubarsky is assistant professor, Department of Neurology, and core member, Centre for Medical Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada. T. Ballard is a plastic surgery resident, University of Michigan, Ann Arbor, Michigan. D. Gordon is associate professor, Division of Emergency Medicine, Department of Surgery, Duke University School of Medicine, Durham, North Carolina. L.D. Gruppen is professor, Department of Learning Health Sciences, University of Michigan Medical School, Ann Arbor, Michigan, United States. E. Holmboe is senior vice president for milestone evaluation and development, Accreditation Council for Graduate Medical Education, Chicago, Illinois, and adjunct professor of medicine, Yale University, New Haven, Connecticut, and Feinberg School of Medicine, Northwestern University, Chicago, Illinois. T. Ratcliffe is associate professor, Department of Medicine, University of Texas Health Science Center, San Antonio, Texas. J. Rencic is associate professor of medicine, Tufts University School of Medicine, and member, Division of General Internal Medicine, Tufts Medical Center, Boston, Massachusetts. L. Schuwirth is professor of medical education, Flinders University, and director, Flinders University Prideaux Centre for Research in Health Professions Education, Adelaide, South Australia, Australia; and professor of medical education, Maastricht University, Maastricht, the Netherlands; Chang Gung University, Taoyuan City, Taiwan; and Uniformed Services University of the Health Sciences, Bethesda, Maryland. S.J. Durning is professor of medicine and director of graduate programs in health professions education, Uniformed Services University of the Health Sciences, Bethesda, Maryland.
32
McBee E, Ratcliffe T, Picho K, Schuwirth L, Artino AR, Yepes-Rios AM, Masel J, van der Vleuten C, Durning SJ. Contextual factors and clinical reasoning: differences in diagnostic and therapeutic reasoning in board certified versus resident physicians. BMC Med Educ 2017; 17:211. [PMID: 29141616] [PMCID: PMC5688653] [DOI: 10.1186/s12909-017-1041-x]
Abstract
BACKGROUND The impact of context on the complex process of clinical reasoning is not well understood. Using situated cognition as the theoretical framework and videos to provide the same contextual "stimulus" to all participants, we examined the relationship between specific contextual factors and diagnostic and therapeutic reasoning accuracy in board-certified internists versus resident physicians. METHODS Each participant viewed three videotaped clinical encounters portraying common diagnoses in internal medicine. We explicitly modified the context to assess its impact on performance (patient and physician contextual factors). Patient contextual factors, including English as a second language and emotional volatility, were portrayed in the videos. Physician participant contextual factors were self-rated sleepiness and burnout. The accuracy of diagnostic and therapeutic reasoning was compared with covariates using Fisher's exact tests, Mann-Whitney U tests, and Spearman's rho correlations as appropriate. RESULTS Fifteen board-certified internists and 10 resident physicians participated from 2013 to 2014. Accuracy of diagnostic and therapeutic reasoning did not differ between groups despite residents reporting significantly higher rates of sleepiness (mean rank 20.45 vs 8.03, U = 0.5, p < .001) and burnout (mean rank 20.50 vs 8.00, U = 0.0, p < .001). Accuracy of diagnosis and treatment were uncorrelated (r = 0.17, p = .65). In both groups, the proportion scoring correct responses for treatment was higher than the proportion scoring correct responses for diagnosis. CONCLUSIONS This study underscores that specific contextual factors appear to impact clinical reasoning performance. Further, the processes of diagnostic and therapeutic reasoning, although related, may not be interchangeable. This raises important questions about the impact that contextual factors have on clinical reasoning and provides insight into how clinical reasoning processes in more authentic settings may be explained by situated cognition theory.
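The nonparametric comparisons named in this abstract can be sketched with SciPy. The numbers below are invented for illustration only (they are not the study's data, and the group sizes merely mirror the 10-versus-15 design):

```python
import numpy as np
from scipy import stats

# Invented self-reported sleepiness ratings: residents vs board-certified internists.
residents  = np.array([7, 8, 6, 9, 7, 8, 6, 7, 9, 8])
attendings = np.array([3, 2, 4, 3, 2, 3, 4, 2, 3, 3, 4, 2, 3, 3, 2])

# Mann-Whitney U test for a group difference in ordinal ratings
u, p = stats.mannwhitneyu(residents, attendings, alternative="two-sided")

# Spearman correlation between binary diagnostic and therapeutic accuracy
# (again, toy data: 1 = correct, 0 = incorrect)
dx = np.array([1, 0, 1, 1, 0, 1, 0, 1])
tx = np.array([1, 1, 1, 0, 1, 1, 1, 0])
rho, p_rho = stats.spearmanr(dx, tx)
print(f"U = {u:.1f}, p = {p:.4f}; rho = {rho:.2f}")
```

In this toy case every resident rating exceeds every attending rating, so U hits its maximum (n1 x n2 = 150), analogous to the near-complete separation the study's mean ranks suggest.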
Affiliation(s)
- Elexis McBee: Department of Medicine, Naval Medical Center San Diego, 34800 Bob Wilson Drive, San Diego, CA 92134, USA
- Temple Ratcliffe: Department of Medicine, University of Texas Health Science Center at San Antonio, 7703 Floyd Curl Drive, San Antonio, TX 78229, USA
- Katherine Picho: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, 4301 Jones Bridge Rd, Bethesda, MD 20814, USA
- Lambert Schuwirth: School of Medicine, Flinders University, GPO Box 2100, Adelaide, SA 5001, Australia
- Anthony R Artino: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, 4301 Jones Bridge Rd, Bethesda, MD 20814, USA
- Ana Monica Yepes-Rios: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, 4301 Jones Bridge Rd, Bethesda, MD 20814, USA
- Jennifer Masel: Department of Medicine, Walter Reed National Military Medical Center, 8901 Wisconsin Ave, Bethesda, MD 20889, USA
- Cees van der Vleuten: Department of Educational Development and Research, Maastricht University, 6200 MD Maastricht, The Netherlands
- Steven J Durning: Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, 4301 Jones Bridge Rd, Bethesda, MD 20814, USA
33
Schuwirth L, Valentine N, Dilena P. An application of programmatic assessment for learning (PAL) system for general practice training. GMS J Med Educ 2017; 34:Doc56. [PMID: 29226224] [PMCID: PMC5704621] [DOI: 10.3205/zma001133]
Abstract
Aim: Programmatic assessment for learning (PAL) is becoming more and more popular as a concept, but its implementation is not without problems. In this paper we describe the design principles behind a PAL program in a general practice training context. Design principles: The PAL program was designed to optimise the meaningfulness of assessment information for the registrar and to make them use that information to self-regulate their learning. The main principles in the program were cognitivist and transformative. The main cognitive principles we used were fostering the understanding of deep structures and stimulating transfer by making registrars constantly connect practice experiences with background knowledge. Ericsson's deliberate practice approach was built in with regard to the provision of feedback, combined with Pintrich's model of self-regulation. Mezirow's transformative learning and insights from social network theory on collaborative learning were used to support the registrars in their development to become GP professionals. Finally, the principle of test-enhanced learning was optimised. Epilogue: We have provided this example to explain the design decisions behind our program, but do not want to present our program as the solution to any given situation.
Affiliation(s)
- Lambert Schuwirth: Flinders University, Adelaide, Australia; Maastricht University, Maastricht, The Netherlands; Chang Gung University, Taiwan; Uniformed Services University, USA
34
Cecilio-Fernandes D, Medema H, Collares CF, Schuwirth L, Cohen-Schotanus J, Tio RA. Comparison of formula and number-right scoring in undergraduate medical training: a Rasch model analysis. BMC Med Educ 2017; 17:192. [PMID: 29121888] [PMCID: PMC5679154] [DOI: 10.1186/s12909-017-1051-8]
Abstract
BACKGROUND Progress testing is an assessment tool used to periodically assess all students at the end-of-curriculum level. Because students cannot know everything, it is important that they recognize their lack of knowledge. For that reason, the formula-scoring method has usually been used. However, where partial knowledge needs to be taken into account, the number-right scoring method is used. Research comparing both methods has yielded conflicting results. As far as we know, in all these studies, Classical Test Theory or Generalizability Theory was used to analyze the data. In contrast to these studies, we will explore the use of the Rasch model to compare both methods. METHODS A 2 × 2 crossover design was used in a study where 298 students from four medical schools participated. A sample of 200 previously used questions from the progress tests was selected. The data were analyzed using the Rasch model, which provides fit parameters, reliability coefficients, and response option analysis. RESULTS The fit parameters were in the optimal interval ranging from 0.50 to 1.50, and the means were around 1.00. The person and item reliability coefficients were higher in the number-right condition than in the formula-scoring condition. The response option analysis showed that the majority of dysfunctional items emerged in the formula-scoring condition. CONCLUSIONS The findings of this study support the use of number-right scoring over formula scoring. Rasch model analyses showed that tests with number-right scoring have better psychometric properties than formula scoring. However, choosing the appropriate scoring method should depend not only on psychometric properties but also on self-directed test-taking strategies and metacognitive skills.
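A Rasch analysis of this kind starts from a persons x items 0/1 response matrix and estimates person abilities and item difficulties jointly. The sketch below is a deliberately crude illustration on simulated data: the response matrix, the damped-Newton joint maximum-likelihood routine, and all parameter values are assumptions of this example, not the study's procedure (which used dedicated Rasch software and also examined fit statistics, reliability coefficients, and response options):

```python
import numpy as np

def rasch_prob(theta, b):
    """P(correct) for each person-item pair under the dichotomous Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

# Simulate a 200-person x 20-item 0/1 response matrix (invented data).
rng = np.random.default_rng(1)
true_theta = rng.normal(0.0, 1.0, 200)   # person abilities
true_b = np.linspace(-2.0, 2.0, 20)      # item difficulties
X = (rng.random((200, 20)) < rasch_prob(true_theta, true_b)).astype(float)

# Crude joint maximum-likelihood fit: damped alternating Newton steps,
# clipped to guard against divergence for extreme score patterns.
theta = np.zeros(200)
b = np.zeros(20)
for _ in range(100):
    p = rasch_prob(theta, b)
    theta = np.clip(theta + 0.5 * (X - p).sum(1) / (p * (1 - p)).sum(1), -4, 4)
    p = rasch_prob(theta, b)
    b = np.clip(b - 0.5 * (X - p).sum(0) / (p * (1 - p)).sum(0), -4, 4)
    b -= b.mean()  # identification constraint: centre item difficulties at 0

print("recovered difficulty range:", b.min().round(2), "to", b.max().round(2))
```

The point of the sketch is the model structure, not the estimator: production Rasch software uses more robust estimation and reports the infit/outfit statistics the abstract refers to.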
Affiliation(s)
- Dario Cecilio-Fernandes: Center for Education Development and Research in Health Professions (CEDAR), University of Groningen and University Medical Center Groningen, Antonius Deusinglaan 1, FC40, 9713 AV Groningen, The Netherlands
- Harro Medema: Department Business IT & Management, NHL University of Applied Sciences, Leeuwarden, Netherlands
- Carlos Fernando Collares: Faculty of Health, Medicine and Life Sciences, Educational Development and Research, Maastricht University, Maastricht, Netherlands
- Lambert Schuwirth: Faculty of Health, Medicine and Life Sciences, Educational Development and Research, Maastricht University, Maastricht, Netherlands; Prideaux Centre for Research into Health Professions Education, Flinders University, Adelaide, Australia
- Janke Cohen-Schotanus: Center for Education Development and Research in Health Professions (CEDAR), University of Groningen and University Medical Center Groningen, Antonius Deusinglaan 1, FC40, 9713 AV Groningen, The Netherlands
- René A Tio: Center for Education Development and Research in Health Professions (CEDAR), University of Groningen and University Medical Center Groningen, Antonius Deusinglaan 1, FC40, 9713 AV Groningen, The Netherlands
35

36
Chehade MJ, Gill TK, Kopansky-Giles D, Schuwirth L, Karnon J, McLiesh P, Alleyne J, Woolf AD. Building multidisciplinary health workforce capacity to support the implementation of integrated, people-centred Models of Care for musculoskeletal health. Best Pract Res Clin Rheumatol 2017; 30:559-584. [PMID: 27886946] [DOI: 10.1016/j.berh.2016.09.005]
Abstract
To address the burden of musculoskeletal (MSK) conditions, a competent health workforce is required to support the implementation of MSK models of care. Funding is required to create employment positions with resources for service delivery and training a fit-for-purpose workforce. Training should be aligned to defined "entrustable professional activities" and include collaborative skills appropriate to integrated and people-centred care, supported by shared education resources. Greater emphasis on educating MSK healthcare workers as effective trainers of peers, students and patients is required. For quality, efficiency and sustainability of service delivery, education and research capabilities must be integrated across disciplines and within the workforce, with funding models developed based on measured performance indicators from all three domains. Greater awareness of the societal and economic burden of MSK conditions is required to ensure that solutions are prioritised and integrated within healthcare policies from local to regional to international levels. These healthcare policies require consumer engagement and alignment to social, economic, educational and infrastructure policies to optimise effectiveness and efficiency of implementation.
Affiliation(s)
- M J Chehade: Chair, International Musculoskeletal Education Task Force, Global Alliance for Musculoskeletal Health of the Bone and Joint Decade (GMUSC); Discipline of Orthopaedics and Trauma, Level 4 Bice Building, Royal Adelaide Hospital, North Terrace, Adelaide, SA 5000, Australia
- T K Gill: School of Medicine, Faculty of Health Sciences, The University of Adelaide, Level 7, South Australian Health and Medical Research Institute, North Terrace, Adelaide, SA 5000, Australia
- D Kopansky-Giles: Graduate Education and Research, Canadian Memorial Chiropractic College, Department of Family and Community Medicine, University of Toronto, 6100 Leslie Street, Toronto, ON M2H 3J1, Canada
- L Schuwirth: Prideaux Centre for Research in Health Professions Education, Flinders University, GPO Box 2100, Adelaide, SA 5001, Australia
- J Karnon: School of Public Health, The University of Adelaide, 178 North Terrace, Adelaide, SA 5000, Australia
- P McLiesh: Australian and New Zealand Orthopaedic Nurses Association, School of Nursing, Faculty of Health Sciences, The University of Adelaide, Royal Adelaide Hospital, Eleanor Harrald Building, North Terrace, Adelaide, SA 5000, Australia
- J Alleyne: University of Toronto, Department of Family and Community Medicine, Toronto Rehabilitation Institute, Musculoskeletal Program, Toronto, Canada
- A D Woolf: Bone and Joint Research Group, University of Exeter Medical School, Knowledge Spa, Royal Cornwall Hospital, Truro TR1 3HD, England, United Kingdom
37
Schuwirth L, van der Vleuten C, Durning SJ. What programmatic assessment in medical education can learn from healthcare. Perspect Med Educ 2017; 6:211-215. [PMID: 28397009] [PMCID: PMC5542889] [DOI: 10.1007/s40037-017-0345-1]
Affiliation(s)
- L Schuwirth: Prideaux Centre for Research in Health Professions Education, School of Medicine, Flinders University, Adelaide, South Australia, Australia; Department of Educational Development and Research, Maastricht University, Maastricht, The Netherlands
- C van der Vleuten: Department of Educational Development and Research, Maastricht University, Maastricht, The Netherlands
- S J Durning: Prideaux Centre for Research in Health Professions Education, School of Medicine, Flinders University, Adelaide, South Australia, Australia; Department of Medicine and Pathology, F. Edward Hébert School of Medicine, Uniformed Services University, Bethesda, USA
38
Kikukawa M, Stalmeijer RE, Okubo T, Taketomi K, Emura S, Miyata Y, Yoshida M, Schuwirth L, Scherpbier AJJA. Development of culture-sensitive clinical teacher evaluation sheet in the Japanese context. Med Teach 2017; 39:844-850. [PMID: 28509610] [DOI: 10.1080/0142159x.2017.1324138]
Abstract
AIM Many instruments for evaluating clinical teaching have been developed, most of them in Western countries. This study aimed to develop a validated, culturally and locally context-sensitive instrument for clinical teachers in an East Asian setting (Japan): the Japanese Clinical Teacher Evaluation Sheet (JaCTES). METHODS A multicenter, cross-sectional evaluation study was conducted. We collected a total of 1368 questionnaires on 304 clinical teachers, completed by residents in 16 teaching hospitals. Construct validity was examined by conducting a factor analysis and using structural equation modeling (SEM). We also assessed reliability using generalizability analysis and a decision study. RESULTS Exploratory factor analysis yielded a three-factor model (role model, teaching activities, and accessibility) comprising 18 items. Confirmatory factor analysis was performed using SEM. The comparative fit index was 0.931 and the root mean square error of approximation was 0.087, indicating an acceptable goodness of fit for this model. To obtain a dependability coefficient of at least 0.70, 5-8 resident responses are necessary. DISCUSSION AND CONCLUSION The JaCTES is the first reported instrument with validity evidence for content and internal structure and high feasibility in Japan, an East Asian setting. Medical educators should be aware of local context and cultural aspects when evaluating clinical teachers.
Affiliation(s)
- Makoto Kikukawa
- Department of Medical Education, Kyushu University, Fukuoka, Japan
| | - Renee E Stalmeijer
- Faculty of Health, Medicine, and Life Sciences, Department of Educational Development and Research, Maastricht University, Maastricht, The Netherlands
| | - Tomoya Okubo
- The National Center for University Entrance Examinations, Research Division, Tokyo, Japan
| | - Kikuko Taketomi
- Center for Medical Education, Hokkaido University, Sapporo, Japan
| | - Sei Emura
- The Center for Graduate Medical Education Development and Research, Saga University Hospital, Saga, Japan
| | - Yasushi Miyata
- Department of Primary Care and Community Health, Aichi Medical University School of Medicine, Nagakute, Japan
| | - Motofumi Yoshida
- Office of Medical Education, International University of Health and Welfare, Narita, Japan
| | - Lambert Schuwirth
- Prideaux Centre for Research in Health Professions Education, Flinders University, Adelaide, Australia
| | - Albert J J A Scherpbier
- Faculty of Health, Medicine, and Life Sciences, Department of Educational Development and Research, Maastricht University, Maastricht, The Netherlands
| |
|
39
|
McBee E, Schuwirth L, Durning SJ. In Reply to Ma et al. Acad Med 2017; 92:426-427. [PMID: 28350593] [DOI: 10.1097/acm.0000000000001615]
Affiliation(s)
- Elexis McBee
- Assistant professor, Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland; professor of medical education, School of Medicine, Flinders University, Adelaide, Australia; professor of medicine, Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland
| |
|
40
|
Prakash S, Bihari S, Need P, Sprick C, Schuwirth L. Immersive high fidelity simulation of critically ill patients to study cognitive errors: a pilot study. BMC Med Educ 2017; 17:36. [PMID: 28178963] [PMCID: PMC5299766] [DOI: 10.1186/s12909-017-0871-x] [Received: 09/30/2016] [Accepted: 01/26/2017]
Abstract
BACKGROUND The majority of human errors in healthcare originate from cognitive errors or biases. There is a dearth of evidence on the relative prevalence and significance of various cognitive errors amongst doctors in their first post-graduate year. This study was conducted with the objective of using high fidelity clinical simulation as a tool to study the relative occurrence of selected cognitive errors amongst doctors in their first post-graduate year. METHODS Intern simulation sessions on acute clinical problems, conducted in 2014, were reviewed by two independent assessors with expertise in critical care. The occurrence of cognitive errors was identified using a Likert-scale-based questionnaire and the think-aloud technique. Teamwork and leadership skills were assessed using the Ottawa Global Rating Scale. RESULTS The most prevalent cognitive errors were search satisfying (90%), followed by premature closure (PC) (78.6%) and anchoring (75.7%). The odds of occurrence of the various cognitive errors did not change with time during internship, in contrast to teamwork and leadership skills (χ2 = 11.9, P = 0.01). Anchoring appeared to be significantly associated with delay in diagnosis (P = 0.007) and the occurrence of PC (P = 0.005). There was a negative association between the occurrence of confirmation bias and the ability to make a correct diagnosis (P = 0.05). CONCLUSIONS Using high fidelity simulation as a tool, our study demonstrated a high prevalence of anchoring, premature closure, and search satisfying amongst doctors in their first post-graduate year. The occurrence of selected cognitive errors impaired clinical performance, and their prevalence did not change with time.
Affiliation(s)
- Shivesh Prakash
- Prideaux Centre for Research in Health Professions Education, Flinders University, Bedford Park, South Australia, 5042, Australia.
- Department of Intensive Care, Flinders Medical Centre, 1 Flinders Drive, Bedford Park, South Australia, 5042, Australia.
| | - Shailesh Bihari
- Department of Intensive Care, Flinders Medical Centre, 1 Flinders Drive, Bedford Park, South Australia, 5042, Australia
| | - Penelope Need
- Director of General Practice Training, Flinders Medical Centre, Flinders Drive, Bedford Park, SA, 5042, Australia
| | - Cyle Sprick
- Simulation Unit, School of Medicine - Flinders University, Bedford Park, South Australia, 5042, Australia
| | - Lambert Schuwirth
- Prideaux Centre for Research in Health Professions Education, Flinders University, Bedford Park, South Australia, 5042, Australia
- Health Professions Education, School of Medicine, Flinders University, Bedford Park, South Australia, 5042, Australia
| |
|
41
|
Ratcliffe TA, McBee E, Schuwirth L, Picho K, van der Vleuten CPM, Artino AR, van Merrienboer JJG, Leppink J, Durning SJ. Exploring Implications of Context Specificity and Cognitive Load in Residents. MedEdPublish 2017. [DOI: 10.15694/mep.2017.000048]
|
42
|
Durning SJ, Artino AR, Costanzo M, Beckman TJ, Van der Vleuten C, Roy MJ, Holmboe ES, Schuwirth L. Response to: Functional neuroimaging and diagnostic reasoning. Med Teach 2016; 38:753-754. [PMID: 27049880] [DOI: 10.3109/0142159x.2016.1150991]
Affiliation(s)
- Steven J Durning
- Uniformed Services University of the Health Sciences - Medicine (ICR), Bethesda, MD 20814, USA
| | - Anthony R Artino
- Uniformed Services University of the Health Sciences - Medicine (ICR), Bethesda, MD 20814, USA
| | - Michelle Costanzo
- Uniformed Services University of the Health Sciences - Medicine (ICR), Bethesda, MD 20814, USA
| | - Thomas J Beckman
- Department of Internal Medicine, Mayo Clinic, 200 First Street SW, Rochester, MN 55905, USA
| | - Cees Van der Vleuten
- Department of Educational Development and Research, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands
| | | | - Eric S Holmboe
- Accreditation Council for Graduate Medical Education, Chicago, IL, USA
| | - Lambert Schuwirth
- Flinders Innovations in Clinical Education, Flinders University, GPO Box 2100, Adelaide, South Australia, Australia
| |
|
43
|
McBee E, Ratcliffe T, Goldszmidt M, Schuwirth L, Picho K, Artino AR, Masel J, Durning SJ. Clinical Reasoning Tasks and Resident Physicians: What Do They Reason About? Acad Med 2016; 91:1022-1028. [PMID: 26650677] [DOI: 10.1097/acm.0000000000001024]
Abstract
PURPOSE A framework of clinical reasoning tasks thought to occur in a clinical encounter was recently developed. It proposes that diagnostic and therapeutic reasoning comprise 24 tasks. The authors of this current study used this framework to investigate what internal medicine residents reason about when they approach straightforward clinical cases. METHOD Participants viewed three video-recorded clinical encounters portraying common diagnoses. After each video, participants completed a post-encounter form and think-aloud protocol. Two authors analyzed transcripts from the think-aloud protocols using a constant comparative approach. They conducted iterative coding of the utterances, classifying each according to the framework of clinical reasoning tasks. They evaluated the type, number, and sequence of tasks the residents used. RESULTS Ten residents participated in the study in 2013-2014. Across all three cases, the residents employed 14 clinical reasoning tasks. Nearly all coded tasks were associated with framing the encounter or diagnosis. The order in which residents used specific tasks varied. The average number of tasks used per case was as follows: Case 1, 4.4 (range 1-10); Case 2, 4.6 (range 1-6); and Case 3, 4.7 (range 1-7). The residents used some tasks repeatedly; the average number of task utterances was 11.6, 13.2, and 14.7 for Cases 1, 2, and 3, respectively. CONCLUSIONS Results suggest that the use of clinical reasoning tasks occurs in a varied, not sequential, process. The authors provide suggestions for strengthening the framework to more fully encompass the spectrum of reasoning tasks that occur in residents' clinical encounters.
Affiliation(s)
- Elexis McBee
- E. McBee is assistant professor of medicine, Uniformed Services University of the Health Sciences, based at Naval Medical Center San Diego, San Diego, California. T. Ratcliffe is assistant professor of medicine, University of Texas Health Science Center, San Antonio, Texas. M. Goldszmidt is associate professor of medicine, Schulich School of Medicine & Dentistry, Western University, London, Ontario, Canada. L. Schuwirth is professor of medicine, Flinders University, Adelaide, Australia. K. Picho is assistant professor of medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland. A.R. Artino Jr is associate professor of preventive medicine and biometrics, Uniformed Services University of the Health Sciences, Bethesda, Maryland. J. Masel is third-year resident, Walter Reed National Military Medical Center, Bethesda, Maryland. S.J. Durning is professor of medicine and pathology, Uniformed Services University of the Health Sciences, Bethesda, Maryland
| |
|
44
|
Durning SJ, Costanzo ME, Beckman TJ, Artino AR, Roy MJ, van der Vleuten C, Holmboe ES, Lipner RS, Schuwirth L. Functional neuroimaging correlates of thinking flexibility and knowledge structure in memory: Exploring the relationships between clinical reasoning and diagnostic thinking. Med Teach 2016; 38:570-577. [PMID: 26079668] [DOI: 10.3109/0142159x.2015.1047755]
Abstract
BACKGROUND Diagnostic reasoning involves the thinking steps up to and including arrival at a diagnosis. Dual process theory posits that a physician's thinking is based on both non-analytic (fast, subconscious) thinking and analytic thinking that is slower, more conscious, effortful, and characterized by comparing and contrasting alternatives. Expertise in clinical reasoning may relate to the two dimensions measured by the diagnostic thinking inventory (DTI): memory structure and flexibility in thinking. AIM To explore the functional magnetic resonance imaging (fMRI) correlates of these two aspects of the DTI: memory structure and flexibility of thinking. METHODS Participants answered and reflected upon multiple-choice questions (MCQs) during fMRI. The DTI was completed shortly after the scan. The brain processes associated with the two dimensions of the DTI were correlated with fMRI phases - assessing flexibility in thinking during analytic clinical reasoning, memory structure during non-analytic clinical reasoning, and the total DTI during both non-analytic and analytic reasoning in experienced physicians. RESULTS Each DTI component was associated with distinct functional neuroanatomic activation patterns, particularly in the prefrontal cortex. CONCLUSION Our findings support conceptual models of diagnostic thinking and indicate mechanisms through which cognitive demands may induce functional adaptation within the prefrontal cortex. This provides additional objective validity evidence for the use of the DTI in medical education and practice settings.
Affiliation(s)
- Michael J Roy
- Uniformed Services University of the Health Sciences, USA
| |
|
45
|
Durning SJ, Dong T, Ratcliffe T, Schuwirth L, Artino AR, Boulet JR, Eva K. Comparing Open-Book and Closed-Book Examinations: A Systematic Review. Acad Med 2016; 91:583-599. [PMID: 26535862] [DOI: 10.1097/acm.0000000000000977]
Abstract
PURPOSE To compare the relative utility of open-book examinations (OBEs) and closed-book examinations (CBEs) given the rapid expansion and accessibility of knowledge. METHOD A systematic review of peer-reviewed articles retrieved from MEDLINE, ERIC, Embase, and PsycINFO (through June 2013). In 2013-2014, articles that met inclusion criteria were reviewed by at least two investigators and coded for six outcome categories: (1) examination preparation, (2) test anxiety, (3) exam performance, (4) psychometrics and logistics, (5) testing effects, and (6) public perception. RESULTS From 4,192 identified studies, 37 were included. The level of learner and subject studied varied. The frequency of each outcome category was as follows: (1) exam preparation (n = 20; 54%); (2) test anxiety (n = 14; 38%); (3) exam performance (n = 30; 81%); (4) psychometrics and logistics (n = 5; 14%); (5) testing effects (n = 24; 65%); and (6) public perception (n = 5; 14%). Preexamination outcome findings were equivocal, but students may prepare more extensively for CBEs. For during-examination outcomes, examinees appear to take longer to complete OBEs. Studies addressing examination performance favored CBE, particularly when preparation for CBE was greater than for OBE. Postexamination outcomes suggest little difference in testing effects or public perception. CONCLUSIONS Given the data available, there does not appear to be sufficient evidence for exclusively using CBE or OBE. As such, a combined approach could become a more significant part of testing protocols as licensing bodies seek ways to assess competencies other than the maintenance of medical knowledge.
Affiliation(s)
- Steven J Durning
- S.J. Durning is professor of medicine and pathology, Uniformed Services University of the Health Sciences, Bethesda, Maryland. T. Dong is assistant professor of medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland. T. Ratcliffe is assistant professor of medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland. L. Schuwirth is professor of medical education, Flinders University, Bedford Park, South Australia, Australia, and professor for innovative assessment, Maastricht University, Maastricht, the Netherlands. A.R. Artino Jr is professor of medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland. J.R. Boulet is vice president of research and evaluation, Foundation for Advancement of International Medical Education and Research, Philadelphia, Pennsylvania. K. Eva is professor of medicine, University of British Columbia, Vancouver, British Columbia, Canada
| |
|
46
|
Scott K, Caldwell P, Schuwirth L. Response to Ten steps to health professional education research. Clin Teach 2016; 13:167. [PMID: 27004782] [DOI: 10.1111/tct.12464]
Affiliation(s)
- Karen Scott
- Discipline of Paediatrics and Child Health, University of Sydney, New South Wales, Australia
| | - Patrina Caldwell
- Discipline of Paediatrics and Child Health, University of Sydney, New South Wales, Australia.
- Centre for Kidney Research, Children's Hospital at Westmead, New South Wales, Australia.
| | - Lambert Schuwirth
- Health Professional Education, Flinders University, Adelaide, South Australia, Australia
| |
|
48
|
McBee E, Ratcliffe T, Picho K, Artino AR, Schuwirth L, Kelly W, Masel J, van der Vleuten C, Durning SJ. Consequences of contextual factors on clinical reasoning in resident physicians. Adv Health Sci Educ Theory Pract 2015; 20:1225-1236. [PMID: 25753295] [DOI: 10.1007/s10459-015-9597-x] [Received: 09/28/2014] [Accepted: 02/19/2015]
Abstract
Context specificity, and the impact that contextual factors have on the complex process of clinical reasoning, is poorly understood. Using situated cognition as the theoretical framework, our aim was to evaluate the verbalized clinical reasoning processes of resident physicians in order to describe the impact that the presence of contextual factors has on their clinical reasoning. Participants viewed three video-recorded clinical encounters portraying straightforward diagnoses in internal medicine with select patient contextual factors modified. After watching each video recording, participants completed a think-aloud protocol. Transcripts from the think-aloud protocols were analyzed using a constant comparative approach. After iterative coding, utterances were analyzed for emergent themes, with utterances grouped into categories, themes, and subthemes. Ten residents participated in the study, with saturation reached during analysis. Participants universally acknowledged the presence of contextual factors in the video recordings. Four categories emerged as a consequence of the contextual factors: (1) emotional reactions, (2) behavioral inferences, (3) optimizing the doctor-patient relationship, and (4) difficulty with closure of the clinical encounter. The presence of contextual factors may impact clinical reasoning performance in resident physicians. When confronted with contextual factors in a clinical scenario, residents experienced difficulty with closure of the encounter, exhibited as diagnostic uncertainty. This finding raises important questions about the relationship between contextual factors and clinical reasoning activities and how this relationship might influence the cost-effectiveness of care. This study also provides insight into how the phenomenon of context specificity may be explained using situated cognition theory.
Affiliation(s)
- Elexis McBee
- Department of Medicine, Naval Medical Center San Diego, 34800 Bob Wilson Drive, San Diego, CA, 92134, USA.
| | - Temple Ratcliffe
- Department of Medicine, University of Texas Health Science Center at San Antonio, 7703 Floyd Curl Drive, San Antonio, TX, 78229, USA
| | - Katherine Picho
- Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, 4301 Jones Bridge Rd., Bethesda, MD, 20814, USA
| | - Anthony R Artino
- Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, 4301 Jones Bridge Rd., Bethesda, MD, 20814, USA
| | - Lambert Schuwirth
- Flinders University, School of Medicine, GPO Box 2100, Adelaide, 5001, South Australia
| | - William Kelly
- Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, 4301 Jones Bridge Rd., Bethesda, MD, 20814, USA
| | - Jennifer Masel
- Department of Medicine, Walter Reed National Military Medical Center, 8901 Wisconsin Ave., Bethesda, MD, 20889, USA
| | - Cees van der Vleuten
- Department of Educational Development and Research, Maastricht University, 6200 MD, Maastricht, The Netherlands
| | - Steven J Durning
- Department of Medicine, F. Edward Hébert School of Medicine, Uniformed Services University, 4301 Jones Bridge Rd., Bethesda, MD, 20814, USA
| |
|
49
|
Durning SJ, Dong T, Artino AR, van der Vleuten C, Holmboe E, Schuwirth L. Dual processing theory and experts' reasoning: exploring thinking on national multiple-choice questions. Perspect Med Educ 2015; 4:168-175. [PMID: 26243535] [PMCID: PMC4530528] [DOI: 10.1007/s40037-015-0196-6]
Abstract
BACKGROUND An ongoing debate exists in the medical education literature regarding the potential benefits of pattern recognition (non-analytic reasoning), actively comparing and contrasting diagnostic options (analytic reasoning), or a combined approach. Studies have not, however, explicitly explored faculty's thought processes while tackling clinical problems through the lens of dual process theory to inform this debate. Further, these thought processes have not been studied in relation to the difficulty of the task or other potential mediating influences such as personal factors and fatigue, which could be influenced by, for example, sleep deprivation. We therefore sought to determine which reasoning process(es) were used when answering clinically oriented multiple-choice questions (MCQs) and whether these processes differed based on the dual process theory characteristics of accuracy, reading time, and answering time, as well as psychometrically determined item difficulty and sleep deprivation. METHODS We performed a think-aloud procedure to explore faculty's thought processes while taking these MCQs, coding the think-aloud data by reasoning process (analytic, non-analytic, guessing, or a combination of processes) as well as word count, number of stated concepts, reading time, answering time, and accuracy. We also included questions about the amount of work in the recent past. We then conducted statistical analyses to examine the associations between these measures, such as correlations between the frequencies of reasoning processes and item accuracy and difficulty. We also observed the total frequencies of the different reasoning processes when answers were correct and incorrect. RESULTS Regardless of whether the questions were classified as 'hard' or 'easy', non-analytic reasoning led to the correct answer more often than to an incorrect answer. Significant correlations were found between the self-reported recent number of hours worked and both think-aloud word count and the number of concepts used in the reasoning, but not item accuracy. When all MCQs were included, 19% of the variance in correctness could be explained by the frequency of expression of the three think-aloud processes (analytic, non-analytic, or combined). DISCUSSION We found evidence to support the notion that the difficulty of an item in a test is not a systematic feature of the item itself but is always a result of the interaction between the item and the candidate. Use of analytic reasoning did not appear to improve accuracy. Our data suggest that individuals do not apply either System 1 or System 2 exclusively but instead fall along a continuum, with some individuals at one end of the spectrum.
Affiliation(s)
- Steven J Durning
- Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, 20814, Bethesda, MD, USA.
| | - Ting Dong
- Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, 20814, Bethesda, MD, USA
| | - Anthony R Artino
- Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, 20814, Bethesda, MD, USA
| | | | - Eric Holmboe
- Accreditation Council for Graduate Medical Education, 515 North State Street, 60654, Chicago, IL, USA
| | | |
|
50
|
Scott K, Caldwell P, Schuwirth L. Ten steps to health professional education research. Clin Teach 2015.
Abstract
BACKGROUND The approaches used to educate future clinicians must be continually improved through evidence-based methods. Clinicians interested in conducting education research need to understand the terminology and conventions of health professional education, in the same way that health professional educators from education backgrounds need to be aware of clinical practices and scientific mores and jargon. This article provides clinicians with 10 steps to conducting health professional education research, and encourages collaboration between clinicians interested in education and health professional educators. SUMMARY The basic steps in conducting education research are introduced, beginning with literature searches, using appropriate terminology and writing conventions, and finding research collaborators. We encourage researchers to ask themselves, 'So what?' about their research idea to ensure it is interesting and relevant to a journal's readers. The nuts and bolts of educational research are then presented, including research questions and methodologies, outcome measures, theoretical frameworks and epistemologies. The final two steps aim to foster internationally relevant and well-designed research studies. CONCLUSION Conducting and publishing education research is often difficult for clinicians, who struggle with what is required. Yet clinicians who teach are ideally placed to identify the knowledge gaps in how we can more effectively educate future clinicians. These 10 steps provide clinicians with guidance on how to conduct education research so that relevant research findings can inform the education of future clinicians.
Affiliation(s)
- Karen Scott
- Discipline of Paediatrics and Child Health, The University of Sydney, Australia
| | - Patrina Caldwell
- Discipline of Paediatrics and Child Health, The University of Sydney, and Centre for Kidney Research, The Children's Hospital at Westmead, Australia
| | - Lambert Schuwirth
- Health Professional Education, Flinders University, Adelaide, Australia
| |
|