1. Zhao L, Isenberg T, Xie F, Liang HN, Yu L. SpatialTouch: Exploring Spatial Data Visualizations in Cross-Reality. IEEE Trans Vis Comput Graph 2025; 31:897-907. [PMID: 39255119] [DOI: 10.1109/tvcg.2024.3456368]
Abstract
We propose and study a novel cross-reality environment that seamlessly integrates a monoscopic 2D surface (an interactive screen with touch and pen input) with a stereoscopic 3D space (an augmented reality HMD) to jointly host spatial data visualizations. This approach combines the best of two conventional methods of displaying and manipulating spatial 3D data, enabling users to fluidly explore diverse visual forms using tailored interaction techniques. Providing such effective 3D data exploration techniques is pivotal for conveying the data's intricate spatial structures, which often span multiple spatial or semantic scales, arise across various application domains, and require diverse visual representations for effective visualization. To understand user reactions to our new environment, we began with an elicitation user study, in which we captured their responses and interactions. We observed that users adapted their interaction approaches based on perceived visual representations, with natural transitions in spatial awareness and actions while navigating across the physical surface. Our findings then informed the development of a design space for spatial data exploration in cross-reality. We thus developed cross-reality environments tailored to three distinct domains: 3D molecular structure data, 3D point cloud data, and 3D anatomical data. In particular, we designed interaction techniques that account for the inherent features of interactions in both spaces, facilitating various forms of interaction, including mid-air gestures, touch interactions, pen interactions, and combinations thereof, to enhance the users' sense of presence and engagement. We assessed the usability of our environment with biologists, focusing on its use for domain research. In addition, we evaluated our interaction transition designs with virtual- and mixed-reality experts to gather further insights. As a result, we provide design suggestions for the cross-reality environment, emphasizing interaction with diverse visual representations and seamless interaction transitions between 2D and 3D spaces.
2. Ayyanchira A, Mahfoud E, Wang W, Lu A. Toward cross-platform immersive visualization for indoor navigation and collaboration with augmented reality. J Vis (Tokyo) 2022. [DOI: 10.1007/s12650-022-00852-9]
3. VeLight: A 3D virtual reality tool for CT-based anatomy teaching and training. J Vis (Tokyo) 2021. [DOI: 10.1007/s12650-021-00790-y]
4. Fonnet A, Prié Y. Survey of Immersive Analytics. IEEE Trans Vis Comput Graph 2021; 27:2101-2122. [PMID: 31352344] [DOI: 10.1109/tvcg.2019.2929033]
Abstract
Immersive analytics (IA) is a new term referring to the use of immersive technologies for data analysis. Yet such applications are not new, and numerous contributions have been made in the last three decades. However, no survey reviewing all these contributions is available. Here we propose a survey of IA from the early nineties until the present day, describing how rendering technologies, data, sensory mapping, and interaction means have been used to build IA systems, as well as how these systems have been evaluated. The conclusions that emerge from our analysis are that the multi-sensory aspects of IA are under-exploited, that the 3DUI and VR community's knowledge regarding immersive interaction is not sufficiently utilised, and that the IA community should converge towards best practices and aim for real-life IA systems.
5. Kloesel B, Juhnke B, Irvine L, Donadio JV, Erdman A, Belani K. Computer-Generated Three-Dimensional Airway Models as a Decision-Support Tool for Preoperative Evaluation and Procedure-Planning in Pediatric Anesthesiology. J Med Syst 2021; 45:21. [PMID: 33426609] [PMCID: PMC7797200] [DOI: 10.1007/s10916-020-01698-0]
Abstract
Technology improvements have rapidly advanced medicine over the last few decades. New approaches are constantly being developed and utilized. Anesthesiology strongly relies on technology for resuscitation, life-support, monitoring, safety, clinical care, and education. This manuscript describes a reverse engineering process to confirm the fit of a medical device in a pediatric patient. The method uses virtual reality and three-dimensional printing technologies to evaluate the feasibility of a complex procedure requiring one-lung isolation and one-lung ventilation. Based on the results of the device fit analysis, the anesthesiology team confidently proceeded with the operation. The approach used and described serves as an example of the advantages available when coupling new technologies to visualize patient anatomy during the procedural planning process.
Affiliation(s)
- Benjamin Kloesel: Department of Anesthesiology, Division of Pediatric Anesthesiology, University of Minnesota, B515 Mayo Building, 420 Delaware Street SE, Minneapolis, MN 55455, USA
- Bethany Juhnke: Earl E. Bakken Medical Devices Center, University of Minnesota, Minneapolis, MN, USA; Department of Mechanical Engineering, University of Minnesota, Minneapolis, MN, USA
- Laura Irvine: Department of Mechanical Engineering, University of Minnesota, Minneapolis, MN, USA
- James V Donadio: Department of Mechanical Engineering, University of Minnesota, Minneapolis, MN, USA
- Arthur Erdman: Earl E. Bakken Medical Devices Center, University of Minnesota, Minneapolis, MN, USA; Department of Mechanical Engineering, University of Minnesota, Minneapolis, MN, USA
- Kumar Belani: Department of Anesthesiology, Division of Pediatric Anesthesiology, University of Minnesota, B515 Mayo Building, 420 Delaware Street SE, Minneapolis, MN 55455, USA
6. Kunert A, Weissker T, Froehlich B, Kulik A. Multi-Window 3D Interaction for Collaborative Virtual Reality. IEEE Trans Vis Comput Graph 2020; 26:3271-3284. [PMID: 31059449] [DOI: 10.1109/tvcg.2019.2914677]
Abstract
We present a novel collaborative virtual reality system that offers multiple immersive 3D views of large 3D scenes. The physical setup consists of two synchronized multi-user 3D displays: a tabletop and a large vertical projection screen. These displays afford different presentations of the shared 3D scene: the wall display lends itself to egocentric exploration at 1:1 scale, while the tabletop affords an allocentric overview. Additionally, handheld 3D portals facilitate the personal exploration of the scene, the comparison of views, and the exchange with others. Our developments enable seamless 3D interaction across these independent 3D views, which requires the simultaneous representation of user input in the different viewing contexts. The resulting interactions cannot be executed independently, however; the application must coordinate the interactions and resolve potential ambiguities to provide plausible effects. We analyze and document the challenges of seamless 3D interaction across multiple independent viewing windows, propose a high-level software design to realize the necessary functionality, and apply the design to a set of interaction tools. Our setup was tested in a formal user study, which revealed general advantages of collaborative 3D data exploration with multiple views in terms of user preference, comfort, and task performance.
7. Juhnke B, Mattson AR, Saltzman D, Azakie A, Hoggard E, Ambrose M, Iaizzo PA, Erdman A, Fischer G. Use of virtual reality for pre-surgical planning in separation of conjoined twins: A case report. Proc Inst Mech Eng H 2019; 233:1327-1332. [PMID: 31554483] [DOI: 10.1177/0954411919878067]
Abstract
We describe the use of virtual reality technology for surgical planning in the successful separation of thoracopagus conjoined twins. Three-dimensional models were created from computed tomography angiograms to simulate the patient's anatomy on a virtual stereoscopic display. Members of the surgical teams reviewed the anatomical models to localize an interatrial communication that allowed blood to flow between the two hearts. The surgical plan to close the 1-mm interatrial communication was significantly modified based on the pre-procedural spatial awareness of the anatomy presented in the virtual visualization. The virtual stereoscopic display was critical for the surgical team to successfully separate the twins and provides a useful case study for the use of virtual reality technology in surgical planning. Both twins survived the operation and were subsequently discharged from the hospital.
Affiliation(s)
- Bethany Juhnke: Earl E. Bakken Medical Devices Center, University of Minnesota, Minneapolis, MN, USA
- Alex R Mattson: Visible Heart® Laboratories, University of Minnesota, Minneapolis, MN, USA; Department of Surgery, University of Minnesota, Minneapolis, MN, USA
- Daniel Saltzman: Department of Surgery, University of Minnesota, Minneapolis, MN, USA
- Anthony Azakie: Pediatric Cardiovascular Surgery and Congenital Heart Program, Peyton Manning Children's Hospital at St. Vincent, Indianapolis, IN, USA
- Eric Hoggard: Department of Radiology, University of Minnesota, Minneapolis, MN, USA
- Matthew Ambrose: Department of Pediatrics, University of Minnesota, Minneapolis, MN, USA
- Paul A Iaizzo: Visible Heart® Laboratories, University of Minnesota, Minneapolis, MN, USA; Department of Surgery, University of Minnesota, Minneapolis, MN, USA
- Arthur Erdman: Earl E. Bakken Medical Devices Center, University of Minnesota, Minneapolis, MN, USA
- Gwenyth Fischer: Department of Pediatrics, University of Minnesota, Minneapolis, MN, USA
8. Bruckner S, Isenberg T, Ropinski T, Wiebel A. A Model of Spatial Directness in Interactive Visualization. IEEE Trans Vis Comput Graph 2019; 25:2514-2528. [PMID: 29994478] [DOI: 10.1109/tvcg.2018.2848906]
Abstract
We discuss the concept of directness in the context of spatial interaction with visualization. In particular, we propose a model that allows practitioners to analyze and describe the spatial directness of interaction techniques, ultimately to be able to better understand interaction issues that may affect usability. To reach these goals, we distinguish between different types of directness. Each type of directness depends on a particular mapping between different spaces, for which we consider the data space, the visualization space, the output space, the user space, the manipulation space, and the interaction space. In addition to the introduction of the model itself, we also show how to apply it to several real-world interaction scenarios in visualization, and thus discuss the resulting types of spatial directness, without recommending either more direct or more indirect interaction techniques. In particular, we will demonstrate descriptive and evaluative usage of the proposed model, and also briefly discuss its generative usage.
9. Johnson S, Orban D, Runesha HB, Meng L, Juhnke B, Erdman A, Samsel F, Keefe DF. Bento Box: An Interactive and Zoomable Small Multiples Technique for Visualizing 4D Simulation Ensembles in Virtual Reality. Front Robot AI 2019; 6:61. [PMID: 33501076] [PMCID: PMC7805880] [DOI: 10.3389/frobt.2019.00061]
Abstract
We present Bento Box, a virtual reality data visualization technique and bimanual 3D user interface for exploratory analysis of 4D data ensembles. Bento Box helps scientists and engineers make detailed comparative judgments about multiple time-varying data instances that make up a data ensemble (e.g., a group of 10 parameterized simulation runs). The approach is to present an organized set of complementary volume visualizations juxtaposed in a grid arrangement, where each column visualizes a single data instance and each row provides a new view of the volume from a different perspective and/or scale. A novel bimanual interface enables users to select a sub-volume of interest to create a new row on-the-fly, scrub through time, and quickly navigate through the resulting virtual "bento box." The technique is evaluated through a real-world case study, supporting a team of medical device engineers and computational scientists using in-silico testing (supercomputer simulations) to redesign cardiac leads. The engineers confirmed hypotheses and developed new insights using a Bento Box visualization. An evaluation of the technical performance demonstrates that the proposed combination of data sampling strategies and clipped volume rendering is successful in displaying a juxtaposed visualization of fluid-structure-interaction simulation data (39 GB of raw data) at interactive VR frame rates.
Affiliation(s)
- Seth Johnson: Interactive Visualization Lab, Department of Computer Science, University of Minnesota, Minneapolis, MN, United States
- Daniel Orban: Interactive Visualization Lab, Department of Computer Science, University of Minnesota, Minneapolis, MN, United States
- Lingyu Meng: Research Computing Center, University of Chicago, Chicago, IL, United States
- Bethany Juhnke: Department of Mechanical Engineering, Earl E. Bakken Medical Devices Center, University of Minnesota, Minneapolis, MN, United States
- Arthur Erdman: Department of Mechanical Engineering, Earl E. Bakken Medical Devices Center, University of Minnesota, Minneapolis, MN, United States
- Francesca Samsel: Texas Advanced Computing Center, University of Texas, Austin, TX, United States
- Daniel F Keefe: Interactive Visualization Lab, Department of Computer Science, University of Minnesota, Minneapolis, MN, United States
10. Mirhosseini S, Gutenko I, Ojal S, Marino J, Kaufman A. Immersive Virtual Colonoscopy. IEEE Trans Vis Comput Graph 2019; 25:2011-2021. [PMID: 30762554] [DOI: 10.1109/tvcg.2019.2898763]
Abstract
Virtual colonoscopy (VC) is a non-invasive screening tool for colorectal polyps which employs volume visualization of a colon model reconstructed from a CT scan of the patient's abdomen. We present an immersive analytics system for VC which enhances and improves the traditional desktop VC through the use of VR technologies. Our system, using a head-mounted display (HMD), includes all of the standard VC features, such as the volume rendered endoluminal fly-through, measurement tool, bookmark modes, electronic biopsy, and slice views. The use of VR immersion, stereo, and wider field of view and field of regard has a positive effect on polyp search and analysis tasks in our immersive VC system, a volumetric-based immersive analytics application. Navigation includes enhanced automatic speed and direction controls, based on the user's head orientation, in conjunction with physical navigation for exploration of local proximity. In order to accommodate the resolution and frame rate requirements for HMDs, new rendering techniques have been developed, including mesh-assisted volume raycasting and a novel lighting paradigm. Feedback and further suggestions from expert radiologists show the promise of our system for immersive analysis for VC and encourage new avenues for exploring the use of VR in visualization systems for medical diagnosis.
11. Erdman AG. Lessons Learned From Kinematics Research Applied to Medical Device Design. J Biomech Eng 2018; 140:2666966. [PMID: 29247252] [DOI: 10.1115/1.4038764]
Affiliation(s)
- Arthur Guy Erdman: Professor, Department of Mechanical Engineering, University of Minnesota, Minneapolis, MN 55455-0150, USA
12. Lopes DS, Parreira PDDF, Paulo SF, Nunes V, Rego PA, Neves MC, Rodrigues PS, Jorge JA. On the utility of 3D hand cursors to explore medical volume datasets with a touchless interface. J Biomed Inform 2017; 72:140-149. [PMID: 28720438] [DOI: 10.1016/j.jbi.2017.07.009]
Abstract
Analyzing medical volume datasets requires interactive visualization so that users can extract anatomo-physiological information in real time. Conventional volume rendering systems rely on 2D input devices, such as mice and keyboards, which are known to hamper 3D analysis: users often struggle to obtain the desired orientation, achieving it only after several attempts. In this paper, we address which 3D analysis tasks are better performed with 3D hand cursors operating on a touchless interface compared to 2D input devices running on a conventional WIMP interface. The main goals of this paper are to explore the capability of (simple) hand gestures to facilitate sterile manipulation of 3D medical data on a touchless interface, without resorting to wearables, and to evaluate the surgical feasibility of the proposed interface with senior surgeons (N=5) and interns (N=2). To this end, we developed a touchless interface controlled via hand gestures and body postures to rapidly rotate and position medical volume images in three dimensions, where each hand acts as an interactive 3D cursor. User studies were conducted with laypeople, while informal evaluation sessions were carried out with senior surgeons, radiologists, and professional biomedical engineers. Results demonstrate the interface's usability: it improves spatial awareness and supports more fluent interaction with the 3D volume than traditional 2D input devices, requiring fewer attempts to achieve the desired orientation because it avoids the composition of several cumulative rotations typically necessary in WIMP interfaces. However, tasks requiring precision, such as clipping plane visualization and tagging, are still best performed with mouse-based systems due to noise, incorrect gesture detection, and problems in skeleton tracking that need to be addressed before tests in real medical environments can be performed.
Affiliation(s)
- Daniel Simões Lopes: INESC-ID Lisboa, IST Taguspark, Avenida Professor Cavaco Silva, 2744-016 Porto Salvo, Portugal
- Soraia Figueiredo Paulo: INESC-ID Lisboa, IST Taguspark, Avenida Professor Cavaco Silva, 2744-016 Porto Salvo, Portugal
- Vitor Nunes: Surgery Department, Hospital Prof. Doutor Fernando Fonseca, E.P.E., IC19, 2720-276 Amadora, Portugal
- Paulo Amaral Rego: Hip Surgery Unit, Orthopedic Surgery Department, Hospital Beatriz Ângelo, Av. Carlos Teixeira, 3, 2674-514 Loures, Portugal; Department of Orthopaedic Surgery, Hospital da Luz, Avenida Lusíada, 100, 1500-650 Lisboa, Portugal
- Manuel Cassiano Neves: Department of Pediatric & Adolescent Orthopaedic Surgery, Hospital CUF Descobertas, Rua Mário Botas, Parque das Nações, 1998-018 Lisboa, Portugal
- Pedro Silva Rodrigues: Oral Implantology Group, Clínica Universitária Egas Moniz, Rua D. João IV Nº 23ª, 2800-114 Almada, Portugal
- Joaquim Armando Jorge: INESC-ID Lisboa, IST Taguspark, Avenida Professor Cavaco Silva, 2744-016 Porto Salvo, Portugal; Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal
13. Besançon L, Issartel P, Ammi M, Isenberg T. Hybrid Tactile/Tangible Interaction for 3D Data Exploration. IEEE Trans Vis Comput Graph 2017; 23:881-890. [PMID: 27875202] [DOI: 10.1109/tvcg.2016.2599217]
Abstract
We present the design and evaluation of an interface that combines tactile and tangible paradigms for 3D visualization. While studies have demonstrated that both tactile and tangible input can be efficient for a subset of 3D manipulation tasks, we reflect here on the possibility of combining the two complementary input types. Based on a field study and follow-up interviews, we present a conceptual framework for the use of these different interaction modalities for visualization, both separately and combined, focusing on free exploration as well as precise control. We present a prototypical application of a subset of these combined mappings for fluid dynamics data visualization using a portable, position-aware device which offers both tactile input and tangible sensing. We evaluate our approach with domain experts and report on their qualitative feedback.
14. Laha B, Bowman DA, Socha JJ. Bare-Hand Volume Cracker for Raw Volume Data Analysis. Front Robot AI 2016. [DOI: 10.3389/frobt.2016.00056]
15. López D, Oehlberg L, Doger C, Isenberg T. Towards An Understanding of Mobile Touch Navigation in a Stereoscopic Viewing Environment for 3D Data Exploration. IEEE Trans Vis Comput Graph 2016; 22:1616-1629. [PMID: 27045916] [DOI: 10.1109/tvcg.2015.2440233]
Abstract
We discuss touch-based navigation of 3D visualizations in a combined monoscopic and stereoscopic viewing environment. We identify a set of interaction modes, and a workflow that helps users transition between these modes to improve their interaction experience. In our discussion we analyze, in particular, the control-display space mapping between the different reference frames of the stereoscopic and monoscopic displays. We show how this mapping supports interactive data exploration, but may also lead to conflicts between the stereoscopic and monoscopic views due to users' movement in space; we resolve these problems through synchronization. To support our discussion, we present results from an exploratory observational evaluation with domain experts in fluid mechanics and structural biology. These experts explored domain-specific datasets using variations of a system that embodies the interaction modes and workflows; we report on their interactions and qualitative feedback on the system and its workflow.
16. Jackson B, Lau TY, Schroeder D, Toussaint KC, Keefe DF. A lightweight tangible 3D interface for interactive visualization of thin fiber structures. IEEE Trans Vis Comput Graph 2013; 19:2802-2809. [PMID: 24051847] [DOI: 10.1109/tvcg.2013.121]
Abstract
We present a prop-based, tangible interface for 3D interactive visualization of thin fiber structures. These data are commonly found in current bioimaging datasets, for example, second-harmonic generation microscopy of collagen fibers in tissue. Our approach uses commodity visualization technologies such as a depth sensing camera and low-cost 3D display. Unlike most current uses of these emerging technologies in the games and graphics communities, we employ the depth sensing camera to create a fish-tank stereoscopic virtual reality system at the scientist's desk that supports tracking of small-scale gestures with objects already found in the work space. We apply the new interface to the problem of interactive exploratory visualization of three-dimensional thin fiber data. A critical task for the visual analysis of these data is understanding patterns in fiber orientation throughout a volume. The interface enables a new, fluid style of data exploration and fiber orientation analysis by using props to provide needed passive-haptic feedback, making 3D interactions with these fiber structures more controlled. We also contribute a low-level algorithm for extracting fiber centerlines from volumetric imaging. The system was designed and evaluated with two biophotonic experts who currently use it in their lab. As compared to typical practice within their field, the new visualization system provides a more effective way to examine and understand the 3D bioimaging datasets they collect.
17. Erdman AG, Keefe DF, Schiestl R. Grand challenge: applying regulatory science and big data to improve medical device innovation. IEEE Trans Biomed Eng 2013; 60:700-706. [PMID: 23380845] [DOI: 10.1109/tbme.2013.2244600]
Abstract
Understanding how proposed medical devices will interface with humans is a major challenge that impacts both the design of innovative new devices and approval and regulation of existing devices. Today, designing and manufacturing medical devices requires extensive and expensive product cycles. Bench tests and other preliminary analyses are used to understand the range of anatomical conditions, and animal and clinical trials are used to understand the impact of design decisions upon actual device success. Unfortunately, some scenarios are impossible to replicate on the bench, and competitive pressures often accelerate initiation of animal trials without sufficient understanding of parameter selections. We believe that these limitations can be overcome through advancements in data-driven and simulation-based medical device design and manufacturing, a research topic that draws upon and combines emerging work in the areas of Regulatory Science and Big Data. We propose a cross-disciplinary grand challenge to develop and holistically apply new thinking and techniques in these areas to medical devices in order to improve and accelerate medical device innovation.
Affiliation(s)
- Arthur G Erdman: Department of Mechanical Engineering and Medical Devices Center at the University of Minnesota, Minneapolis, MN 55455, USA
18. Jackson B, Coffey D, Thorson L, Schroeder D, Ellingson AM, Nuckley DJ, Keefe DF. Toward Mixed Method Evaluations of Scientific Visualizations and Design Process as an Evaluation Tool. In: Proceedings of the 2012 BELIV Workshop: Beyond Time and Errors - Novel Evaluation Methods for Visualization, Seattle, WA, USA, October 14-15, 2012. [PMID: 28944349] [DOI: 10.1145/2442576.2442580]
Abstract
In this position paper we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed-methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position, we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community, then these may become one of the most effective future strategies for both formative and summative evaluations.
Affiliation(s)
- Bret Jackson: University of Minnesota Department of Computer Science and Engineering
- Dane Coffey: University of Minnesota Department of Computer Science and Engineering
- Lauren Thorson: University of Minnesota Department of Computer Science and Engineering; University of Minnesota Department of Design, Housing and Apparel
- David Schroeder: University of Minnesota Department of Computer Science and Engineering
- David J Nuckley: University of Minnesota Department of Biomedical Engineering; University of Minnesota Program in Physical Therapy
- Daniel F Keefe: University of Minnesota Department of Computer Science and Engineering