1
Castelo S, Rulff J, Solunke P, McGowan E, Wu G, Roman I, Lopez R, Steers B, Sun Q, Bello J, Feest B, Middleton M, Mckendrick R, Silva C. HuBar: A Visual Analytics Tool to Explore Human Behavior Based on fNIRS in AR Guidance Systems. IEEE Transactions on Visualization and Computer Graphics 2025; 31:119-129. [PMID: 39250412] [DOI: 10.1109/tvcg.2024.3456388]
Abstract
The concept of an intelligent augmented reality (AR) assistant has significant, wide-ranging applications, with potential uses in the medical, military, and mechanical domains. Such an assistant must be able to perceive the environment and actions, reason about the environment state in relation to a given task, and seamlessly interact with the task performer. These interactions typically involve an AR headset equipped with sensors that capture video, audio, and haptic feedback. Previous work has sought to facilitate the development of intelligent AR assistants by visualizing these sensor data streams in conjunction with the assistant's perception and reasoning model outputs. However, existing visual analytics systems do not focus on user modeling or include biometric data, and can visualize only a single task session for a single performer at a time. Moreover, they typically assume that a task involves linear progression from one step to the next. We propose a visual analytics system that allows users to compare performance across multiple task sessions, focusing on non-linear tasks where different step sequences can lead to success. In particular, we design visualizations for understanding user behavior through functional near-infrared spectroscopy (fNIRS) data as a proxy for perception, attention, and memory, as well as corresponding motion data (acceleration, angular velocity, and gaze). We distill these insights into embedding representations that allow users to easily select groups of sessions with similar behaviors. We provide two case studies that demonstrate how to use these visualizations to gain insights about task performance using data collected during helicopter copilot training tasks. Finally, we evaluate our approach through an in-depth examination of a think-aloud experiment with five domain experts.
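The abstract does not detail how the session embeddings are built, but the underlying idea of condensing multimodal session streams into comparable points is easy to illustrate. The following is a minimal, hypothetical Python sketch assuming windowed summary statistics over each stream (fNIRS channels, acceleration, angular velocity, gaze) and a PCA projection; the authors' actual features and projection technique may well differ.

```python
# Hypothetical sketch: per-session embeddings from multimodal streams.
# Windowed summary statistics + PCA are assumed here as one plausible
# approach; the paper's actual method is not specified in the abstract.
import numpy as np
from sklearn.decomposition import PCA

def session_features(streams, window=50):
    """Summarize each 1-D sensor stream into fixed-length features
    using per-window means and standard deviations."""
    feats = []
    for x in streams:
        n = (len(x) // window) * window
        w = x[:n].reshape(-1, window)        # (n_windows, window)
        feats.append(w.mean(axis=1).mean())  # grand mean of window means
        feats.append(w.std(axis=1).mean())   # mean within-window variability
    return np.array(feats)

def embed_sessions(sessions):
    """Project one feature vector per task session to 2-D so that
    sessions with similar behavior land near each other."""
    X = np.stack([session_features(s) for s in sessions])
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)  # z-score features
    return PCA(n_components=2).fit_transform(X)

# Usage: 10 synthetic sessions, each with 4 streams of 1000 samples.
rng = np.random.default_rng(0)
sessions = [[rng.standard_normal(1000) for _ in range(4)] for _ in range(10)]
print(embed_sessions(sessions).shape)  # (10, 2)
```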
2
Cortes CAT, Thurow S, Ong A, Sharples JJ, Bednarz T, Stevens G, Favero DD. Analysis of Wildfire Visualization Systems for Research and Training: Are They Up for the Challenge of the Current State of Wildfires? IEEE Transactions on Visualization and Computer Graphics 2024; 30:4285-4303. [PMID: 37030767] [DOI: 10.1109/tvcg.2023.3258440]
Abstract
Wildfires affect many regions across the world. The accelerated progression of global warming has amplified their frequency and scale, deepening their impact on human life, the economy, and the environment. Rising temperatures have driven wildfires to behave unpredictably compared with those previously observed, challenging researchers and fire management agencies to understand the factors behind this behavioral change. Furthermore, this change has rendered fire personnel training outdated, undermining its ability to adequately prepare personnel to respond to these new fires. Immersive visualization can play a key role in tackling the growing issue of wildfires. This survey therefore reviews studies that use immersive and non-immersive data visualization techniques to depict wildfire behavior and to train first responders and planners, and it identifies the most useful characteristics of these systems. While these studies support knowledge creation in certain situations, there is still considerable scope to improve immersive systems so that they can address the unforeseen dynamics of wildfires.
3
Acuña K, Sapahia R, Jiménez IN, Antonietti M, Anzola I, Cruz M, García MT, Krishnan V, Leveille LA, Resch MD, Galor A, Habash R, DeBuc DC. Functional Near-Infrared Spectrometry as a Useful Diagnostic Tool for Understanding the Visual System: A Review. J Clin Med 2024; 13:282. [PMID: 38202288] [PMCID: PMC10779649] [DOI: 10.3390/jcm13010282]
Abstract
This comprehensive review explores the role of Functional Near-Infrared Spectroscopy (fNIRS) in advancing our understanding of the visual system. Beginning with an introduction to fNIRS, we delve into its historical development, highlighting how this technology has evolved over time. The core of the review critically examines the advantages and disadvantages of fNIRS, offering a balanced view of its capabilities and limitations in research and clinical settings. We extend our discussion to the diverse applications of fNIRS beyond its traditional use, emphasizing its versatility across various fields. In the context of the visual system, this review provides an in-depth analysis of how fNIRS contributes to our understanding of eye function, including eye diseases. We discuss the intricacies of the visual cortex, how it responds to visual stimuli and the implications of these findings in both health and disease. A unique aspect of this review is the exploration of the intersection between fNIRS, virtual reality (VR), augmented reality (AR) and artificial intelligence (AI). We discuss how these cutting-edge technologies are synergizing with fNIRS to open new frontiers in visual system research. The review concludes with a forward-looking perspective, envisioning the future of fNIRS in a rapidly evolving technological landscape and its potential to revolutionize our approach to studying and understanding the visual system.
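As background for the hemodynamic measures this review discusses: raw fNIRS recordings are conventionally converted to changes in oxy- and deoxyhemoglobin concentration via the modified Beer-Lambert law. The Python sketch below illustrates that standard two-wavelength computation; the extinction coefficients and differential pathlength factor are illustrative placeholder values, not calibrated constants, and the review itself does not prescribe this exact pipeline.

```python
# Minimal sketch of the modified Beer-Lambert law used in fNIRS:
# optical-density changes at two wavelengths are converted into
# oxy- (HbO) and deoxyhemoglobin (HbR) concentration changes.
import numpy as np

# Rows: wavelengths (e.g., ~760 nm, ~850 nm); columns: [HbO, HbR].
# Illustrative molar extinction coefficients in 1/(mM*cm), not
# calibrated values from the literature.
E = np.array([[0.55, 1.55],    # ~760 nm: HbR absorbs more
              [1.15, 0.78]])   # ~850 nm: HbO absorbs more

def mbll(delta_od, source_detector_cm=3.0, dpf=6.0):
    """Solve E @ [dHbO, dHbR] * distance * DPF = delta_OD for the
    concentration changes (mM) at a single channel."""
    path = source_detector_cm * dpf          # effective optical path length
    return np.linalg.solve(E, np.asarray(delta_od)) / path

# Usage: optical-density changes measured at the two wavelengths.
d_hbo, d_hbr = mbll([0.012, 0.018])
print(f"dHbO = {d_hbo:.4e} mM, dHbR = {d_hbr:.4e} mM")
```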
Affiliation(s)
- Kelly Acuña, School of Medicine, Georgetown University, Washington, DC 20007, USA
- Rishav Sapahia, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
- Irene Newman Jiménez, Department of Cognitive Science, Faculty of Arts & Science, McGill University, Montreal, QC H4A 3J1, Canada
- Michael Antonietti, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
- Ignacio Anzola, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
- Marvin Cruz, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
- Michael T. García, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
- Varun Krishnan, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
- Lynn A. Leveille, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
- Miklós D. Resch, Department of Ophthalmology, Semmelweis University, 1085 Budapest, Hungary
- Anat Galor, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
- Ranya Habash, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
- Delia Cabrera DeBuc, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miami, FL 33136, USA
4
Li X. Visualization Display System of Gannan Hakka Paper-Cut Works Based on Computer Graphics Algorithm. Computational Intelligence and Neuroscience 2022; 2022:2419689. [PMID: 35371254] [PMCID: PMC8970904] [DOI: 10.1155/2022/2419689]
Abstract
Today, computer graphics and image processing techniques are widely used in daily life and industrial production, and advances in computing have made computer graphics an increasingly convenient part of everyday life. To make full use of this capability, this paper takes the locally distinctive Hakka paper-cut art as its starting point: it first reviews the art form's development history, artistic characteristics, compositional forms, expression techniques, cultural connotations, paper-cut patterns, and the symbolic meaning of its folk customs, and then designs a visualization display system for Gannan Hakka paper-cut works based on computer graphics. In addition, the system offers a solution for integrating Gannan Hakka paper-cut art into Jiangxi native product packaging design and provides a reference for the theory and practice of modern native product packaging design.
Affiliation(s)
- Xingping Li, Ganzhou Teachers College, Ganzhou, Jiangxi 341000, China