1
Hayes J, Gabbard JL, Mehta RK. Learning selection-based augmented reality interactions across different training modalities: uncovering sex-specific neural strategies. Front Neuroergonomics 2025; 6:1539552. [PMID: 40357524] [PMCID: PMC12066766] [DOI: 10.3389/fnrgo.2025.1539552]
Abstract
Introduction: Recent advancements in augmented reality (AR) technology have opened up potential applications across various industries. In this study, we assess the effectiveness of psychomotor learning in AR compared to video-based training methods. Methods: Thirty-three participants (17 males) trained on four selection-based AR interactions either by watching a video or through hands-on practice. Both groups were then evaluated by executing the learned interactions in AR. Results: The AR group reported a higher subjective workload during training but showed significantly faster completion times during evaluation. We analyzed brain activation and functional connectivity during the evaluation phase using functional near-infrared spectroscopy. Our findings indicate that participants who trained in AR displayed more efficient brain networks, suggesting improved neural efficiency. Discussion: Sex-related differences in activation and connectivity hint at varying neural strategies used during motor learning in AR. Future studies should investigate how demographic factors might influence performance and user experience in AR-based training programs.
Affiliation(s)
- John Hayes
- Department of Industrial and Systems Engineering, Texas A&M University, College Station, TX, United States
- Joseph L. Gabbard
- Grado Department of Industrial and Systems Engineering, Virginia Tech, Blacksburg, VA, United States
- Ranjana K. Mehta
- Department of Industrial and Systems Engineering, University of Wisconsin–Madison, Madison, WI, United States
2
Hanley A, Locke A, Singh J, Tung P, Hucker WJ, D'Angelo R, Avari Silva JN, Silva JR, d'Avila A, Michaud GF. The PARADIGM Study: Procedural Augmented Reality Assessment in a 3-Dimensional Image-Guided Modality. Circ Arrhythm Electrophysiol 2025; 18:e013222. [PMID: 40177752] [DOI: 10.1161/circep.124.013222]
Abstract
BACKGROUND: The CommandEP system v2 (Sentiar, St. Louis, MO) uses an augmented reality headset (Magic Leap, Plantation, FL) to display a real-time 3-dimensional electroanatomic map, catheter locations, and ablation catheter contact force data to the electrophysiologist through a hands-free interface. The PARADIGM study (Procedural Augmented Reality Assessment in a 3-Dimensional Image-Guided Modality) examined the impact of the CommandEP system on the electrophysiologist's navigation accuracy, intraprocedural communications, and system usability. METHODS: CommandEP was used prospectively in patients undergoing electrophysiology studies at 2 sites with 8 users. Navigation accuracy was calculated as catheter tip displacement from the target using CommandEP versus the electroanatomic mapping system. Physician-mapper interactions were quantified and classified as high- versus low-quality communications (high quality directly impacted navigation, medical decision-making, or patient care). Usability was assessed via survey. RESULTS: A total of 102 patients completed the study with the following diagnoses: atrial fibrillation (n=78/102, 76%), atrial flutter (n=8/102, 8%), atrial tachycardia/supraventricular tachycardia (n=9/102, 9%), premature ventricular contraction (n=6/102, 6%), and cardiac neuroablation (n=1/102, 1%). Navigation was more accurate with the CommandEP system, with an average distance from the target of 2.98±2 mm versus 3.27±2 mm with the electroanatomic mapping system (P=0.02); 21% of points navigated using CommandEP versus 28% using the electroanatomic mapping system were >4 mm from the target (P=0.03). In all, 393 communications were counted during study tasks: 30 events when using CommandEP versus 363 events when using the electroanatomic mapping system. Subanalysis showed no difference in accuracy pre- versus post-contact force (P=NS) and a slight reduction in both low- and high-quality communications (P=NS). Notably, 94% of users agreed/strongly agreed that they felt comfortable using the system, and 72% agreed/strongly agreed they would be comfortable using the CommandEP system in most/all electrophysiology studies. CONCLUSIONS: The CommandEP system improved physicians' navigation accuracy, reduced the number of communications, increased the quality of communications, and had high usability.
Affiliation(s)
- Alan Hanley
- Division of Cardiology, Massachusetts General Hospital, Boston (A.H., J.S., W.J.H., G.F.M.)
- Andrew Locke
- Beth Israel Hospital, Boston, MA (A.L., P.T., R.D., A.A.)
- Jagmeet Singh
- Division of Cardiology, Massachusetts General Hospital, Boston (A.H., J.S., W.J.H., G.F.M.)
- Patricia Tung
- Beth Israel Hospital, Boston, MA (A.L., P.T., R.D., A.A.)
- William J Hucker
- Division of Cardiology, Massachusetts General Hospital, Boston (A.H., J.S., W.J.H., G.F.M.)
- Jennifer N Avari Silva
- Division of Pediatric Cardiology, Washington University in St. Louis School of Medicine, MO (J.N.A.S.)
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, MO (J.N.A.S., J.R.S.)
- Sentiar, Inc, St. Louis, MO (J.N.A.S., J.R.S.)
- Jonathan R Silva
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, MO (J.N.A.S., J.R.S.)
- Sentiar, Inc, St. Louis, MO (J.N.A.S., J.R.S.)
- Andre d'Avila
- Beth Israel Hospital, Boston, MA (A.L., P.T., R.D., A.A.)
- Gregory F Michaud
- Division of Cardiology, Massachusetts General Hospital, Boston (A.H., J.S., W.J.H., G.F.M.)
3
Ripsam M, Nerdel C. Augmented reality for chemistry education to promote the use of chemical terminology in teacher training. Front Psychol 2024; 15:1392529. [PMID: 39105150] [PMCID: PMC11299336] [DOI: 10.3389/fpsyg.2024.1392529]
Abstract
Chemistry as a whole is divided into three levels. The macroscopic level describes real, observable phenomena of the material world. The submicroscopic level focuses on particles. The representational level includes pictorial and symbolic representations that visualize the nature of substances. Students often have problems separating these levels and transferring conceptually between them. Therefore, teachers need to use chemical terminology correctly when teaching the substance-particle concept. Augmented Reality (AR) connects the real and virtual worlds: the observer physically moves in a real environment into which virtual elements are integrated. AR technology has great potential for learning in chemistry, especially when it comes to making the "invisible" visible and illustrating scientific phenomena at the particle level. The simultaneous presentation should avoid split-attention effects and offers new possibilities for interacting with (multiple) external representations ((M)ER). The question arises whether AR has a positive effect on the use of technical language and on the associated understanding of dealing with (M)ER at the substance and particle levels. With an AR app on a tablet and on AR glasses, the chemical processes of a real experiment are represented by AR visualizations. For this purpose, the AR app was piloted. This study captured how chemistry teachers (N = 30) handle (M)ER, using a pre-post survey. The participating preservice teachers are described below. Each test includes five tasks worked through while thinking aloud. The think-aloud protocols, which capture the use of chemical terminology, are evaluated in MAXQDA.
Affiliation(s)
- Melanie Ripsam
- Associate Professorship of Life Sciences Education, TUM School of Social Sciences and Technology, Technical University of Munich, Munich, Germany
5
The Hitchhiker’s Guide to Fused Twins: A Review of Access to Digital Twins In Situ in Smart Cities. Remote Sensing 2022. [DOI: 10.3390/rs14133095]
Abstract
Smart Cities already surround us, and yet they are still incomprehensibly far from directly impacting everyday life. While current Smart Cities are often inaccessible, the experience of everyday citizens may be enhanced with a combination of the emerging technologies of Digital Twins (DTs) and Situated Analytics. DTs represent their Physical Twin (PT) in the real world via models, simulations, (remotely) sensed data, context awareness, and interactions. However, interaction requires appropriate interfaces to address the complexity of the city. Ultimately, leveraging the potential of Smart Cities requires going beyond assembling the DT to be comprehensive and accessible. Situated Analytics allows city information to be anchored in its spatial context. We advance the concept of embedding the DT into the PT through Situated Analytics to form Fused Twins (FTs). This fusion allows data to be accessed in the location where it is generated, in an embodied context that can make the data more understandable. Prototypes of FTs are rapidly emerging from different domains, but Smart Cities represent the context with the most potential for FTs in the future. This paper reviews DTs, Situated Analytics, and Smart Cities as the foundations of FTs. Regarding DTs, we define five components (physical, data, analytical, virtual, and Connection Environments) that we relate to several cognates (i.e., similar but different terms) from the existing literature. Regarding Situated Analytics, we review the effects of user embodiment on cognition and cognitive load. Finally, we classify existing partial examples of FTs from the literature, address their construction from Augmented Reality, Geographic Information Systems, Building/City Information Models, and DTs, and provide an overview of future directions.
6
Application and Investigation of Multimedia Design Principles in Augmented Reality Learning Environments. Information 2022. [DOI: 10.3390/info13020074]
Abstract
Digital media have changed the way educational instruction is designed. Learning environments addressing different presentation modes, sensory modalities, and realities have evolved, with augmented reality (AR) as one of the latest developments in which multiple aspects of all three dimensions can be united. Multimedia learning principles can generally be applied to AR scenarios that combine physical environments and virtual elements, but their AR-specific effectiveness is so far unclear. In the current paper, we describe two studies examining AR-specific occurrences of two basic multimedia learning principles: (1) the spatial contiguity principle with visual learning material, leveraging AR-specific spatiality potentials, and (2) the coherence principle with audiovisual learning material, leveraging AR-specific contextuality potentials. Both studies use video-based implementations of AR experiences combining textual and pictorial representation modes as well as virtual and physical visuals. We examine the effects of integrated and separated visual presentations of virtual and physical elements (study one, N = 80), as well as the effects of omitting or adding matching or non-matching sounds (study two, N = 130), on cognitive load, task load, and knowledge. We find only a few significant effects but interesting descriptive results. We discuss the results and the implementations based on theory and make suggestions for future research.
7
Smartphones and Learning: An Extension of M-Learning or a Distinct Area of Inquiry. Education Sciences 2022. [DOI: 10.3390/educsci12010050]
Abstract
The smartphone has become an integral part of the education landscape. While there has been significant smartphone research in education under the guise of m-learning, the unique role of the device suggests that m-learning may not be an appropriate characterization. The purpose of this paper is to review the use of m-learning as a primary descriptor for smartphone- and learning-related research. In support of this goal, the paper reviews the definitions associated with m-learning, smartphones, and related technologies from the perspective of educational research. In addition, a review of author keywords of research on smartphones in education is used to provide context to the classification of the research. Finally, three theoretically guided smartphone programs are presented as evidence of the unique nature of smartphone and learning research. This review concludes with recommendations for the characterization of future research.
8
Investigation of the Effectiveness of an Augmented Reality and a Dynamic Simulation System Collaboration in Oil Pump Maintenance. Applied Sciences (Basel) 2021. [DOI: 10.3390/app12010350]
Abstract
The maintenance of oil pumps is a complex task for any operating organization, and for industrial enterprises in the oil and gas sector it is a particularly urgent issue. One reason for this is the widespread use of pumping equipment in all areas of oil and gas enterprises. At the same time, aggressive environments, uneven loads, remote facilities, and harsh climatic zones (especially in the Arctic region or on production platforms) are factors that make it relevant to develop special systems that support or simplify the maintenance of pumping equipment. Dynamic modeling is a modern technology for assessing the technical condition of equipment. It is the basis of systems that carry out diagnostic and prognostic calculations and, among other functions, assess the dynamic state of objects under various operating conditions. Augmented reality technology reduces equipment maintenance time by reducing the time spent searching for and processing the information required during maintenance. This paper investigates the effectiveness of a collaboration between an augmented reality system and a dynamic simulation system in oil pump maintenance. Since there is insufficient research on the joint application of these two technologies, demonstrating the effectiveness of such a collaboration is an open issue. To this end, the paper describes the system structure and the development process of the augmented reality application, and tests the application using Microsoft HoloLens 2.