1
Völk C, Bernhard L, Völk D, Weiten M, Wilhelm D, Biberthaler P. Mobile C-arm: radiation exposure and workflow killer? Potential of an innovative assistance system for intraoperative positioning [Article in German]. Unfallchirurgie (Heidelberg, Germany) 2023; 126:928-934. PMID: 37878125. DOI: 10.1007/s00113-023-01380-3.
Abstract
Despite its versatile applicability, the intraoperative use of a mobile C-arm is often problematic and potentially associated with increased radiation exposure for both the patient and the personnel. In particular, correct positioning for adequate imaging can become a problem, as the nonsterile circulating nurse has to coordinate the various maneuvers together with the surgeon without having a good view of the surgical field. The sluggishness of the equipment and the intraoperative setting (sterile borders, additional hardware, etc.) pose further challenges. A light detection and ranging (LIDAR)-based assistance system showed promise in providing accurate and intuitive repositioning support in an initial series of experimental trials. For this purpose, the sensors are attached to the C-arm base unit and enable navigation of the device in the operating room to a stored target position using a simultaneous localization and mapping (SLAM) algorithm. Improved workflow and reduced radiation exposure are the potential benefits of this system. The advantages over other experimental approaches are the lack of external hardware and the ease of use without isolating the operator from the rest of the operating room environment; however, suitability for daily use in the presence of additional interfering factors should be verified in further studies.
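For readers unfamiliar with how such guidance can be derived, the core computation reduces to comparing the C-arm's current SLAM pose estimate with a stored target pose. The following minimal Python sketch illustrates this in the plane; the SE(2) pose representation, axis conventions, and example numbers are assumptions for illustration and are not taken from the article.

```python
import numpy as np

def se2_matrix(x, y, theta):
    """Homogeneous 2D pose: position in metres, heading in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def repositioning_correction(current, target):
    """Translation (m) and rotation (rad) that move the current pose onto the
    stored target pose, expressed in the C-arm's own frame so the values can
    be shown as driving directions."""
    delta = np.linalg.inv(current) @ target
    dx, dy = delta[0, 2], delta[1, 2]
    dtheta = np.arctan2(delta[1, 0], delta[0, 0])
    return dx, dy, dtheta

# Example: stored target pose vs. the current estimate from the SLAM module.
target = se2_matrix(2.10, 0.85, np.deg2rad(90.0))
current = se2_matrix(1.95, 0.70, np.deg2rad(80.0))
dx, dy, dtheta = repositioning_correction(current, target)
print(f"move {dx:.2f} m forward, {dy:.2f} m sideways, rotate {np.degrees(dtheta):.1f} deg")
```

In a real assistance system these offsets would drive on-screen guidance for the circulating nurse rather than a print statement.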
Affiliation(s)
- Christopher Völk: Klinik und Poliklinik für Unfallchirurgie, Klinikum rechts der Isar der TU München, Ismaningerstr. 22, 81675 München, Germany
- Lukas Bernhard: Forschungsgruppe MITI, Klinikum rechts der Isar der TU München, München, Germany
- Dominik Völk: Klinik und Poliklinik für Unfallchirurgie, Klinikum rechts der Isar der TU München, Ismaningerstr. 22, 81675 München, Germany
- Dirk Wilhelm: Forschungsgruppe MITI, Klinikum rechts der Isar der TU München, München, Germany; Klinik und Poliklinik für Chirurgie, Klinikum rechts der Isar der TU München, München, Germany
- Peter Biberthaler: Klinik und Poliklinik für Unfallchirurgie, Klinikum rechts der Isar der TU München, Ismaningerstr. 22, 81675 München, Germany
2
Application of an Optical Tracking System for Motor Skill Assessment in Laparoscopic Surgery. Computational and Mathematical Methods in Medicine 2022; 2022:2332628. PMID: 35912156. PMCID: PMC9337947. DOI: 10.1155/2022/2332628.
Abstract
Objective Motion analysis of surgical instruments can be used to evaluate laparoscopic surgical skills, and this study assessed the validity of an optical tracking system for the assessment of laparoscopic surgical motor skills. Methods Ten experienced surgeons and ten novices were recruited to complete transferring tasks on a laparoscopic simulator. An optical tracking system, MicronTracker, was used to capture the marker points on each instrument and to obtain the coordinates of the marker points and the corresponding instrument tip coordinates. The data were processed to create a coordinate system based on the laparoscopic simulator and to calculate the movement parameters of the instruments, such as operating time, path length, speed, acceleration, and smoothness. The range of motion of the instrument (insertion depth and pivoting angle) was also calculated. Results The positions that the tip of the instrument can reach form a small, irregularly shaped spatial area. Significant differences (p < 0.05) were found between the surgeon and novice groups in parameters such as operating time, path length, mean speed, mean acceleration, and mean smoothness. The range of insertion depth of the instruments was approximately 150 mm to 240 mm, and the pivoting angles of the left and right instruments were 30.9° and 46.6° up and down and 28.0° and 35.0° left and right, respectively. Conclusions The optical tracking system was effective in evaluating laparoscopic surgical skills, with significant differences between the surgeon and novice groups in terms of movement parameters, but not in terms of range of motion.
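The kinematic parameters listed above can be computed directly from sampled tip coordinates. The sketch below shows one plausible implementation in Python/NumPy; the sampling rate, units, and the use of mean jerk magnitude as a smoothness surrogate are assumptions, since the abstract does not specify the exact definitions.

```python
import numpy as np

def motion_metrics(tip_xyz, dt):
    """Basic kinematic metrics from uniformly sampled 3D tip positions.

    tip_xyz : (N, 3) array of instrument-tip coordinates in mm
    dt      : sampling interval in seconds
    """
    vel = np.gradient(tip_xyz, dt, axis=0)    # mm/s
    acc = np.gradient(vel, dt, axis=0)        # mm/s^2
    jerk = np.gradient(acc, dt, axis=0)       # mm/s^3

    step_lengths = np.linalg.norm(np.diff(tip_xyz, axis=0), axis=1)
    return {
        "operating_time_s": (len(tip_xyz) - 1) * dt,
        "path_length_mm": step_lengths.sum(),
        "mean_speed_mm_s": np.linalg.norm(vel, axis=1).mean(),
        "mean_acceleration_mm_s2": np.linalg.norm(acc, axis=1).mean(),
        # One common smoothness surrogate: mean jerk magnitude (lower = smoother).
        "mean_jerk_mm_s3": np.linalg.norm(jerk, axis=1).mean(),
    }

# Example with a synthetic trajectory sampled at 30 Hz.
t = np.linspace(0, 10, 301)
trajectory = np.c_[100 * np.sin(0.5 * t), 80 * np.cos(0.5 * t), 200 + 5 * t]
print(motion_metrics(trajectory, dt=t[1] - t[0]))
```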
3
Sheth N, Vagdargi P, Sisniega A, Uneri A, Osgood G, Siewerdsen JH. Preclinical evaluation of a prototype freehand drill video guidance system for orthopedic surgery. J Med Imaging (Bellingham) 2022; 9:045004. PMID: 36046335. PMCID: PMC9411797. DOI: 10.1117/1.jmi.9.4.045004.
Abstract
Purpose: Internal fixation of pelvic fractures is a challenging task requiring the placement of instrumentation within complex three-dimensional bone corridors, typically guided by fluoroscopy. We report a system for two- and three-dimensional guidance using a drill-mounted video camera and fiducial markers with evaluation in first preclinical studies. Approach: The system uses a camera affixed to a surgical drill and multimodality (optical and radio-opaque) markers for real-time trajectory visualization in fluoroscopy and/or CT. Improvements to a previously reported prototype include hardware components (mount, camera, and fiducials) and software (including a system for detecting marker perturbation) to address practical requirements necessary for translation to clinical studies. Phantom and cadaver experiments were performed to quantify the accuracy of video-fluoroscopy and video-CT registration, the ability to detect marker perturbation, and the conformance in placing guidewires along realistic pelvic trajectories. The performance was evaluated in terms of geometric accuracy and conformance within bone corridors. Results: The studies demonstrated successful guidewire delivery in a cadaver, with a median entry point error of 1.00 mm (1.56 mm IQR) and median angular error of 1.94 deg (1.23 deg IQR). Such accuracy was sufficient to guide K-wire placement through five of the six trajectories investigated with a strong level of conformance within bone corridors. The sixth case demonstrated a cortical breach due to extrema in the registration error. The system was able to detect marker perturbations and alert the user to potential registration issues. Feasible workflows were identified for orthopedic-trauma scenarios involving emergent cases (with no preoperative imaging) or cases with preoperative CT. Conclusions: A prototype system for guidewire placement was developed providing guidance that is potentially compatible with orthopedic-trauma workflow. First preclinical (cadaver) studies demonstrated accurate guidance of K-wire placement in pelvic bone corridors and the ability to automatically detect perturbations that degrade registration accuracy. The preclinical prototype demonstrated performance and utility supporting translation to clinical studies.
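The reported entry point and angular errors compare a planned trajectory with the delivered one. A minimal sketch of such a comparison is given below, assuming each trajectory is described by an entry point and a direction vector; measuring the entry error as the perpendicular distance from the planned axis is one common convention and not necessarily the exact definition used in the study.

```python
import numpy as np

def trajectory_errors(planned_entry, planned_dir, actual_entry, actual_dir):
    """Entry-point error (mm) and angular error (deg) between a planned
    and a delivered guidewire trajectory (3D points in mm, any-length direction vectors)."""
    planned_dir = planned_dir / np.linalg.norm(planned_dir)
    actual_dir = actual_dir / np.linalg.norm(actual_dir)

    # Perpendicular distance of the actual entry point from the planned trajectory line.
    offset = actual_entry - planned_entry
    entry_error = np.linalg.norm(offset - np.dot(offset, planned_dir) * planned_dir)

    # Angle between the two trajectory directions (sign of the direction ignored).
    cos_angle = np.clip(abs(np.dot(planned_dir, actual_dir)), -1.0, 1.0)
    angular_error = np.degrees(np.arccos(cos_angle))
    return entry_error, angular_error

planned = np.array([10.0, 25.0, 40.0]), np.array([0.0, 0.3, 1.0])
actual = np.array([11.2, 24.1, 40.5]), np.array([0.05, 0.32, 1.0])
e_entry, e_ang = trajectory_errors(planned[0], planned[1], actual[0], actual[1])
print(f"entry-point error {e_entry:.2f} mm, angular error {e_ang:.2f} deg")
```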
Affiliation(s)
- Niral Sheth: Johns Hopkins University, Department of Biomedical Engineering, Baltimore, Maryland, United States
- Prasad Vagdargi: Johns Hopkins University, Department of Computer Science, Baltimore, Maryland, United States
- Alejandro Sisniega: Johns Hopkins University, Department of Biomedical Engineering, Baltimore, Maryland, United States
- Ali Uneri: Johns Hopkins University, Department of Biomedical Engineering, Baltimore, Maryland, United States
- Gregory Osgood: Johns Hopkins Medicine, Department of Orthopedic Surgery, Baltimore, Maryland, United States
- Jeffrey H. Siewerdsen: Johns Hopkins University, Department of Biomedical Engineering, Baltimore, Maryland, United States; Johns Hopkins University, Department of Computer Science, Baltimore, Maryland, United States
4
Bernhard L, Völk C, Völk D, Rothmeyer F, Xu Z, Ostler D, Biberthaler P, Wilhelm D. RAY-POS: a LIDAR-based assistance system for intraoperative repositioning of mobile C-arms without external aids. Int J Comput Assist Radiol Surg 2022; 17:719-729. PMID: 35195830. PMCID: PMC8948129. DOI: 10.1007/s11548-022-02571-w.
Abstract
PURPOSE In current clinical practice, intraoperative repositioning of mobile C-arms is challenging due to a lack of visual cues and efficient guiding tools. This can be detrimental to the surgical workflow and lead to additional radiation burdens for both patient and personnel. To overcome this problem, we present our novel approach Lidar-based X-ray Positioning for Mobile C-arms (RAY-POS) for assisting circulating nurses during intraoperative C-arm repositioning without requiring external aids. METHODS RAY-POS consists of a localization module and a graphical user interface for guiding the user back to a previously recorded C-arm position. We conducted a systematic comparison of simultaneous localization and mapping (SLAM) algorithms using different attachment positions of light detection and ranging (LIDAR) sensors to benchmark localization performance within the operating room (OR). For two promising combinations, we conducted further end-to-end repositioning tests within a realistic OR setup. RESULTS The SLAM algorithm gmapping, combined with a LIDAR sensor mounted 40 cm above the C-arm's horizontal unit, performed best regarding localization accuracy and long-term stability. The distribution of the repositioning error yielded an effective standard deviation of 7.61 mm. CONCLUSION We conclude that a proof of concept for LIDAR-based C-arm repositioning without external aids has been achieved. In future work, we mainly aim at extending the capabilities of our system and evaluating its usability together with clinicians.
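A sketch of how repeated repositioning trials might be summarized is shown below; interpreting the "effective standard deviation" as the RMS of the per-trial error magnitudes is an assumption, as the abstract does not give the exact definition used by the authors.

```python
import numpy as np

def repositioning_statistics(target_xy, reached_xy):
    """Summarize repeated repositioning trials.

    target_xy  : (2,) stored target position in mm
    reached_xy : (N, 2) positions actually reached in N trials, in mm
    """
    errors = reached_xy - target_xy                  # per-trial 2D error vectors
    magnitudes = np.linalg.norm(errors, axis=1)      # scalar error per trial
    return {
        "mean_error_mm": magnitudes.mean(),
        "median_error_mm": np.median(magnitudes),
        # RMS of the error magnitudes: one way to report an 'effective'
        # spread of a 2D error distribution about the target.
        "rms_error_mm": np.sqrt(np.mean(magnitudes ** 2)),
        "max_error_mm": magnitudes.max(),
    }

rng = np.random.default_rng(0)
trials = rng.normal(loc=0.0, scale=5.4, size=(30, 2))   # synthetic repositioning results
print(repositioning_statistics(np.array([0.0, 0.0]), trials))
```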
Affiliation(s)
- Lukas Bernhard: Klinikum rechts der Isar der Technischen Universität München, Research Group MITI, Munich, Germany
- Christopher Völk: Department of Trauma Surgery, Klinikum rechts der Isar der Technischen Universität München, Munich, Germany
- Dominik Völk: Department of Trauma Surgery, Klinikum rechts der Isar der Technischen Universität München, Munich, Germany
- Florian Rothmeyer: Technische Universität München, Chair of Materials Handling, Material Flow, Logistics, Munich, Germany
- Zhencan Xu: Klinikum rechts der Isar der Technischen Universität München, Research Group MITI, Munich, Germany
- Daniel Ostler: Klinikum rechts der Isar der Technischen Universität München, Research Group MITI, Munich, Germany
- Peter Biberthaler: Department of Trauma Surgery, Klinikum rechts der Isar der Technischen Universität München, Munich, Germany
- Dirk Wilhelm: Klinikum rechts der Isar der Technischen Universität München, Research Group MITI, Munich, Germany; Department of Surgery, Klinikum rechts der Isar der Technischen Universität München, Munich, Germany
5
Augmented Reality (AR) in Orthopedics: Current Applications and Future Directions. Curr Rev Musculoskelet Med 2021; 14:397-405. PMID: 34751894. DOI: 10.1007/s12178-021-09728-1.
Abstract
PURPOSE OF REVIEW Imaging technologies (X-ray, CT, MRI, and ultrasound) have revolutionized orthopedic surgery, allowing for the more efficient diagnosis, monitoring, and treatment of musculoskeletal ailments. The current review investigates recent literature surrounding the impact of augmented reality (AR) imaging technologies on orthopedic surgery. In particular, it investigates the impact that AR technologies may have on provider cognitive burden, operative times, occupational radiation exposure, and surgical precision and outcomes. RECENT FINDINGS Many AR technologies have been shown to lower provider cognitive burden and reduce operative time and radiation exposure while improving surgical precision in pre-clinical cadaveric and sawbones models. So far, only a few platforms focusing on pedicle screw placement have been approved by the FDA. These technologies have been implemented clinically with mixed results when compared to traditional free-hand approaches. It remains to be seen if current AR technologies can deliver on their multitude of promises, and the ability to do so seems contingent upon continued technological progress. Additionally, the impact of these platforms will likely be highly conditional on clinical indication and provider type. It remains unclear if AR will be broadly accepted and utilized or if it will be reserved for niche indications where it adds significant value. One thing is clear: orthopedics' high utilization of pre- and intra-operative imaging, combined with the relative ease of tracking rigid structures like bone as compared to soft tissues, has made it the clear beachhead market for AR technologies in medicine.
6
Zhang X, Uneri A, Wu P, Ketcha MD, Jones CK, Huang Y, Lo SFL, Helm PA, Siewerdsen JH. Long-length tomosynthesis and 3D-2D registration for intraoperative assessment of spine instrumentation. Phys Med Biol 2021; 66:055008. PMID: 33477120. DOI: 10.1088/1361-6560/abde96.
Abstract
PURPOSE A system for long-length intraoperative imaging is reported based on longitudinal motion of an O-arm gantry featuring a multi-slot collimator. We assess the utility of long-length tomosynthesis and the geometric accuracy of 3D image registration for surgical guidance and evaluation of long spinal constructs. METHODS A multi-slot collimator with tilted apertures was integrated into an O-arm system for long-length imaging. The multi-slot projective geometry leads to slight view disparity in both long-length projection images (referred to as 'line scans') and tomosynthesis 'slot reconstructions' produced using a weighted-backprojection method. The radiation dose for long-length imaging was measured, and the utility of long-length, intraoperative tomosynthesis was evaluated in phantom and cadaver studies. Leveraging the depth resolution provided by parallax views, an algorithm for 3D-2D registration of the patient and surgical devices was adapted for registration with line scans and slot reconstructions. Registration performance using single-plane or dual-plane long-length images was evaluated and compared to registration accuracy achieved using standard dual-plane radiographs. RESULTS Longitudinal coverage of ∼50-64 cm was achieved with a single long-length slot scan, providing a field-of-view (FOV) up to (40 × 64) cm2, depending on patient positioning. The dose-area product (reference point air kerma × x-ray field area) for a slot scan ranged from ∼702-1757 mGy·cm2, equivalent to ∼2.5 s of fluoroscopy and comparable to other long-length imaging systems. Long-length scanning produced high-resolution tomosynthesis reconstructions, covering ∼12-16 vertebral levels. 3D image registration using dual-plane slot reconstructions achieved median target registration error (TRE) of 1.2 mm and 0.6° in cadaver studies, outperforming registration to dual-plane line scans (TRE = 2.8 mm and 2.2°) and radiographs (TRE = 2.5 mm and 1.1°). 3D registration using single-plane slot reconstructions leveraged the ∼7-14° angular separation between slots to achieve median TRE ∼2 mm and <2° from a single scan. CONCLUSION The multi-slot configuration provided intraoperative visualization of long spine segments, facilitating target localization, assessment of global spinal alignment, and evaluation of long surgical constructs. 3D-2D registration to long-length tomosynthesis reconstructions yielded a promising means of guidance and verification with accuracy exceeding that of 3D-2D registration to conventional radiographs.
Affiliation(s)
- Xiaoxuan Zhang: Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD, United States of America
7
Vagdargi P, Sheth N, Sisniega A, Uneri A, De Silva T, Osgood GM, Siewerdsen JH. Drill-mounted video guidance for orthopaedic trauma surgery. J Med Imaging (Bellingham) 2021; 8:015002. PMID: 33604409. DOI: 10.1117/1.jmi.8.1.015002.
Abstract
Purpose: Percutaneous fracture fixation is a challenging procedure that requires accurate interpretation of fluoroscopic images to insert guidewires through narrow bone corridors. We present a guidance system with a video camera mounted onboard the surgical drill to achieve real-time augmentation of the drill trajectory in fluoroscopy and/or CT. Approach: The camera was mounted on the drill and calibrated with respect to the drill axis. Markers identifiable in both video and fluoroscopy are placed about the surgical field and co-registered by feature correspondences. If available, a preoperative CT can also be co-registered by 3D-2D image registration. Real-time guidance is achieved by virtual overlay of the registered drill axis on fluoroscopy or in CT. Performance was evaluated in terms of target registration error (TRE), conformance within clinically relevant pelvic bone corridors, and runtime. Results: Registration of the drill axis to fluoroscopy demonstrated median TRE of 0.9 mm and 2.0 deg when solved with two views (e.g., anteroposterior and lateral) and five markers visible in both video and fluoroscopy, more than sufficient to provide Kirschner wire (K-wire) conformance within common pelvic bone corridors. Registration accuracy was reduced when solved with a single fluoroscopic view (TRE = 3.4 mm and 2.7 deg) but was also sufficient for K-wire conformance within pelvic bone corridors. Registration was robust with as few as four markers visible within the field of view. Runtime of the initial implementation allowed fluoroscopy overlay and/or 3D CT navigation with freehand manipulation of the drill up to 10 frames/s. Conclusions: A drill-mounted video guidance system was developed to assist with K-wire placement. Overall workflow is compatible with fluoroscopically guided orthopaedic trauma surgery and does not require markers to be placed in preoperative CT. The initial prototype demonstrates accuracy and runtime that could improve the accuracy of K-wire placement, motivating future work for translation to clinical studies.
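The camera-to-marker step described above can be posed as a classic Perspective-n-Point problem. The sketch below uses OpenCV's solvePnP with placeholder marker coordinates, detections, and intrinsics; it is not the authors' implementation, only an illustration of the registration principle.

```python
import cv2
import numpy as np

# 3D marker positions in the surgical-field reference frame (mm) -- placeholders.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [60.0, 0.0, 0.0],
    [60.0, 60.0, 0.0],
    [0.0, 60.0, 0.0],
    [30.0, 30.0, 15.0],
], dtype=np.float64)

# Corresponding marker detections in the drill-camera image (pixels) -- placeholders.
image_points = np.array([
    [320.5, 241.0],
    [402.3, 238.7],
    [405.1, 322.9],
    [318.9, 325.4],
    [362.0, 281.5],
], dtype=np.float64)

# Assumed pinhole intrinsics of the drill-mounted camera; distortion ignored for the sketch.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
print("camera-from-marker rotation:\n", R)
print("camera-from-marker translation (mm):", tvec.ravel())
```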
Affiliation(s)
- Prasad Vagdargi: Johns Hopkins University, Department of Computer Science, Baltimore, Maryland, United States
- Niral Sheth: Johns Hopkins University, Department of Biomedical Engineering, Baltimore, Maryland, United States
- Alejandro Sisniega: Johns Hopkins University, Department of Biomedical Engineering, Baltimore, Maryland, United States
- Ali Uneri: Johns Hopkins University, Department of Biomedical Engineering, Baltimore, Maryland, United States
- Tharindu De Silva: Johns Hopkins University, Department of Biomedical Engineering, Baltimore, Maryland, United States
- Greg M Osgood: Johns Hopkins Medicine, Department of Orthopaedic Surgery, Baltimore, Maryland, United States
- Jeffrey H Siewerdsen: Johns Hopkins University, Department of Computer Science, Baltimore, Maryland, United States; Johns Hopkins University, Department of Biomedical Engineering, Baltimore, Maryland, United States
8
The effect of artificial X-rays on C-arm positioning performance in a simulated orthopaedic surgical setting. Int J Comput Assist Radiol Surg 2020; 16:11-22. PMID: 33146849. DOI: 10.1007/s11548-020-02280-2.
Abstract
PURPOSE We designed an Artificial X-ray Imaging System (AXIS) that generates simulated fluoroscopic X-ray images on the fly and assessed its utility in improving C-arm positioning performance by C-arm users with little or no C-arm experience. METHODS The AXIS system comprised an optical tracking system to monitor C-arm movement, a manikin, a reference CT volume registered to the manikin, and a digitally reconstructed radiograph (DRR) algorithm to generate live simulated fluoroscopic images. A user study was conducted with 30 participants who had little or no C-arm experience. Each participant carried out four tasks using a real C-arm: an introduction session, an AXIS-guided set of pelvic imaging tasks, a non-AXIS-guided set of pelvic imaging tasks, and a questionnaire. For each imaging task, the participant replicated a set of three target X-ray images by taking real radiographs of a manikin with a C-arm. The number of X-rays required, task time, and C-arm positioning accuracy were recorded. RESULTS We found a significant 53% decrease in the number of X-rays used and a moderate 10-26% improvement in lateral C-arm axis positioning accuracy without requiring more time to complete the tasks when the participants were guided by artificial X-rays. The questionnaires showed that the participants felt significantly more confident in their C-arm positioning ability when they were guided by AXIS. They rated the usefulness of AXIS as very good to excellent, and the realism and accuracy of AXIS as good to very good. CONCLUSION Novice users working with a C-arm machine supplemented with the ability to generate simulated X-ray images could successfully accomplish positioning tasks in a simulated surgical setting using markedly fewer X-ray images than when unassisted. In future work, we plan to determine whether such a system can produce similar results in the live operating room without lengthening surgical procedures.
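A digitally reconstructed radiograph is essentially a simulated X-ray obtained by integrating attenuation through a CT volume. The sketch below shows a heavily simplified parallel-beam variant; real DRR generation for a system like AXIS would use the tracked C-arm's cone-beam geometry, which is omitted here, and the HU-to-attenuation conversion is an assumption.

```python
import numpy as np

def parallel_beam_drr(ct_hu, axis=0, mu_water=0.02):
    """Very simplified digitally reconstructed radiograph.

    ct_hu    : 3D CT volume in Hounsfield units
    axis     : volume axis along which virtual rays are cast
    mu_water : assumed linear attenuation coefficient of water (1/mm)

    Returns a 2D image proportional to detected intensity (Beer-Lambert law).
    """
    mu = mu_water * (1.0 + ct_hu / 1000.0)   # HU -> approximate attenuation
    mu = np.clip(mu, 0.0, None)
    line_integrals = mu.sum(axis=axis)       # ray sums, assuming 1 mm voxels
    return np.exp(-line_integrals)           # transmitted fraction per detector pixel

# Example: a water cylinder with a dense bone-like rod inside a synthetic volume.
vol = np.full((128, 128, 128), -1000.0)      # air
yy, xx = np.mgrid[:128, :128]
cylinder = (yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2
vol[:, cylinder] = 0.0                       # water
vol[:, 54:74, 54:74] = 1200.0                # bone-like insert
drr = parallel_beam_drr(vol, axis=0)
print(drr.shape, float(drr.min()), float(drr.max()))
```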
9
Du H, Hu L, Hao M, Zhang L. Application of binocular visual navigation technique in diaphyseal fracture reduction. Int J Med Robot 2020; 16:e2082. PMID: 31967377. DOI: 10.1002/rcs.2082.
Abstract
BACKGROUND Computer-assisted surgical navigation techniques have shown promise; however, currently popular systems have limitations. This paper presents the characterization and application of a binocular visual navigation technique in diaphyseal fracture reduction. METHODS A binocular visual tracker (MicronTracker) was introduced to assist diaphyseal fracture reduction. A transformation matrix was used to acquire the reduction parameters. A transverse diaphyseal fracture was used as a control group. RESULTS Precision tests were performed 12 times with the binocular system using a simulated femoral model with a transverse fracture. All residual deformations were compared (P < 0.01). CONCLUSIONS The binocular visual navigation technique produces good results, offering flexibility and high positional accuracy, and shows promise. The MicronTracker might lead to further applications in the remote navigation field.
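The "transformation matrix" mentioned in the methods can be understood as the relative pose between tracked bone fragments. A minimal sketch of extracting reduction parameters (residual translation and rotation) from two tracker-reported 4x4 poses follows; the pose conventions and example values are illustrative assumptions, not data from the study.

```python
import numpy as np

def reduction_parameters(T_proximal, T_distal):
    """Residual displacement of the distal fragment relative to the proximal one.

    Both inputs are 4x4 homogeneous poses of marker frames rigidly attached to
    each fragment, with translations in mm.
    """
    T_rel = np.linalg.inv(T_proximal) @ T_distal
    translation = T_rel[:3, 3]
    # Rotation angle recovered from the trace of the 3x3 rotation block.
    cos_theta = np.clip((np.trace(T_rel[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_theta))
    return translation, angle_deg

def pose(tx, ty, tz, rz_deg):
    """Helper: build a 4x4 pose with a rotation about z and a translation."""
    c, s = np.cos(np.radians(rz_deg)), np.sin(np.radians(rz_deg))
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = [tx, ty, tz]
    return T

t, a = reduction_parameters(pose(0, 0, 0, 0), pose(3.2, -1.5, 0.8, 4.0))
print(f"residual translation {t} mm, residual rotation {a:.1f} deg")
```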
Affiliation(s)
- Hailong Du: Department of Orthopaedics, Chinese PLA General Hospital, Beijing, China
- Lei Hu: Robotics Institute, Beihang University, Beijing, China
- Ming Hao: Department of Orthopaedics, Chinese PLA General Hospital, Beijing, China
- Lihai Zhang: Department of Orthopaedics, Chinese PLA General Hospital, Beijing, China
10
Aydemir CA, Arısan V. Accuracy of dental implant placement via dynamic navigation or the freehand method: A split-mouth randomized controlled clinical trial. Clin Oral Implants Res 2019; 31:255-263. PMID: 31829457. DOI: 10.1111/clr.13563.
Abstract
OBJECTIVES The aim of this split-mouth randomized controlled clinical trial was to compare the deviations between planned and placed implants inserted with the assistance of a micron-tracker-based dynamic navigation device or by the freehand method. MATERIAL AND METHODS A thermoplastic fiducial marker was adapted on the anterior teeth, and cone-beam computerized tomography was used for imaging. A minimum of one implant was planned for each side of the posterior maxilla, and the dynamic navigation device or freehand method was randomly used for surgical insertion. Deviations were measured by matching the planning data with a final CBCT image. Linear deviations (mm) between the planned and placed implants were the primary outcome. The results were analysed by generalized linear mixed models (p < .05) (NCT03471208). RESULTS A total of 92 implants were placed in 32 volunteers, and 86 implants were included in the final analysis. For the linear deviations, the mean of differences (Δ) was 0.72 mm (standard deviation (SD): 0.26; 95% confidence interval (CI): 0.39-1.02) at the shoulder of the implants (p < .001) and 0.69 mm (SD: 0.36; 95% CI: 0.19-1.19) at the tip of the implants (p < .001). For the angular deviations, Δ was 5.33° (SD: 1.63; 95% CI: 3.48-7.17) (p < .001). CONCLUSIONS The navigation technique can be used to transfer virtual implant planning to the patient's jaw with increased accuracy.
Affiliation(s)
- Ceyda Aktolun Aydemir: Department of Oral Implantology, Faculty of Dentistry, Istanbul University, Istanbul, Turkey
- Volkan Arısan: Department of Oral Implantology, Faculty of Dentistry, Istanbul University, Istanbul, Turkey
11
Wang J, Ji X, Zhang X, Sun Z, Wang T. Real-time robust individual X point localization for stereoscopic tracking. Pattern Recognit Lett 2018. DOI: 10.1016/j.patrec.2018.07.002.
12
Lee SC, Fuerst B, Tateno K, Johnson A, Fotouhi J, Osgood G, Tombari F, Navab N. Multi-modal imaging, model-based tracking, and mixed reality visualisation for orthopaedic surgery. Healthc Technol Lett 2017; 4:168-173. PMID: 29184659. PMCID: PMC5683202. DOI: 10.1049/htl.2017.0066.
Abstract
Orthopaedic surgeons are still following the decades old workflow of using dozens of two-dimensional fluoroscopic images to drill through complex 3D structures, e.g. pelvis. This Letter presents a mixed reality support system, which incorporates multi-modal data fusion and model-based surgical tool tracking for creating a mixed reality environment supporting screw placement in orthopaedic surgery. A red–green–blue–depth camera is rigidly attached to a mobile C-arm and is calibrated to the cone-beam computed tomography (CBCT) imaging space via iterative closest point algorithm. This allows real-time automatic fusion of reconstructed surface and/or 3D point clouds and synthetic fluoroscopic images obtained through CBCT imaging. An adapted 3D model-based tracking algorithm with automatic tool segmentation allows for tracking of the surgical tools occluded by hand. This proposed interactive 3D mixed reality environment provides an intuitive understanding of the surgical site and supports surgeons in quickly localising the entry point and orienting the surgical tool during screw placement. The authors validate the augmentation by measuring target registration error and also evaluate the tracking accuracy in the presence of partial occlusion.
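The RGBD-to-CBCT calibration is described as an iterative closest point (ICP) alignment. Below is a minimal point-to-point ICP sketch (nearest neighbours plus a Kabsch fit), assuming both modalities are already available as 3D point clouds in millimetres; it is a didactic stand-in, not the authors' pipeline.

```python
import numpy as np
from scipy.spatial import cKDTree

def kabsch(src, dst):
    """Best-fit rotation R and translation t such that dst ~= src @ R.T + t."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, dst_c - R @ src_c

def icp(source, target, iterations=30):
    """Point-to-point ICP aligning the RGBD cloud (source) to the CBCT
    surface cloud (target); returns a 4x4 transform in mm."""
    tree = cKDTree(target)
    T = np.eye(4)
    current = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(current)              # closest CBCT point per RGBD point
        R, t = kabsch(current, target[idx])
        current = current @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T
    return T

# Synthetic example: the RGBD cloud is the CBCT surface moved by a small rigid offset.
rng = np.random.default_rng(1)
cbct_surface = rng.uniform(-50.0, 50.0, size=(500, 3))
angle = np.radians(5.0)
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
rgbd_cloud = cbct_surface @ Rz.T + np.array([3.0, -2.0, 4.0])
T_est = icp(rgbd_cloud, cbct_surface)
print("estimated RGBD-to-CBCT transform:\n", np.round(T_est, 3))
```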
Affiliation(s)
- Sing Chun Lee: Computer Aided Medical Procedures, Laboratory for Computational Sensing & Robotics, Johns Hopkins University, Baltimore, MD, USA
- Keisuke Tateno: Fakultät für Informatik, Lehrstuhl für Informatikanwendungen in der Medizin & Augmented Reality, Technische Universität München, Garching, Bayern, Germany; Canon Inc., Shimomaruko, Tokyo, Japan
- Alex Johnson: Orthopaedic Trauma, Department of Orthopaedic Surgery, Johns Hopkins Hospital, Baltimore, MD, USA
- Javad Fotouhi: Computer Aided Medical Procedures, Laboratory for Computational Sensing & Robotics, Johns Hopkins University, Baltimore, MD, USA
- Greg Osgood: Orthopaedic Trauma, Department of Orthopaedic Surgery, Johns Hopkins Hospital, Baltimore, MD, USA
- Federico Tombari: Fakultät für Informatik, Lehrstuhl für Informatikanwendungen in der Medizin & Augmented Reality, Technische Universität München, Garching, Bayern, Germany
- Nassir Navab: Computer Aided Medical Procedures, Laboratory for Computational Sensing & Robotics, Johns Hopkins University, Baltimore, MD, USA; Fakultät für Informatik, Lehrstuhl für Informatikanwendungen in der Medizin & Augmented Reality, Technische Universität München, Garching, Bayern, Germany
13
The status of augmented reality in laparoscopic surgery as of 2016. Med Image Anal 2017; 37:66-90. DOI: 10.1016/j.media.2017.01.007.
14
Pei B, Zhu G, Wang Y, Qiao H, Chen X, Wang B, Li X, Zhang W, Liu W, Fan Y. The development and error analysis of a kinematic parameters based spatial positioning method for an orthopedic navigation robot system. Int J Med Robot 2016; 13. DOI: 10.1002/rcs.1782.
Affiliation(s)
- Baoqing Pei: School of Biological Science and Medical Engineering, Beihang University, China
- Gang Zhu: School of Biological Science and Medical Engineering, Beihang University, China
- Yu Wang: School of Biological Science and Medical Engineering, Beihang University, China
- Huiting Qiao: School of Biological Science and Medical Engineering, Beihang University, China
- Xiangqian Chen: School of Biological Science and Medical Engineering, Beihang University, China
- Binbin Wang: Beijing TINAVI Medical Technology Co., Ltd, China
- Xiaoyun Li: Beijing TINAVI Medical Technology Co., Ltd, China
- Weijun Zhang: Beijing TINAVI Medical Technology Co., Ltd, China
- Wenyong Liu: School of Biological Science and Medical Engineering, Beihang University, China
- Yubo Fan: School of Biological Science and Medical Engineering, Beihang University, China
15
Wang X, Habert S, Zu Berge CS, Fallavollita P, Navab N. Inverse visualization concept for RGB-D augmented C-arms. Comput Biol Med 2016; 77:135-47. PMID: 27544070. DOI: 10.1016/j.compbiomed.2016.08.008.
Abstract
X-ray remains the essential imaging modality for many minimally invasive interventions. Overlaying X-ray images with an optical view of the surgical scene has been demonstrated to be an efficient way to reduce radiation exposure and surgery time. However, clinicians are recommended to place the X-ray source under the patient table, while the optical view of the real scene must be captured from the top in order to see the patient, surgical tools, and the surgical site. With the help of an RGB-D (red-green-blue-depth) camera, which can measure depth in addition to color, the 3D model of the real scene is registered to the X-ray image. However, fusing two opposing viewpoints and visualizing them in the context of medical applications has not previously been attempted. In this paper, we report first experiences with a novel inverse visualization technique for RGB-D augmented C-arms. A user study with 16 participants demonstrated that our method provides a meaningful visualization with potential for giving clinicians multi-modal fused data in real time during surgery.
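Registering the optical view to the X-ray first requires turning the depth image into a 3D model of the scene. A short sketch of pinhole back-projection of a depth map to a point cloud is shown below; the intrinsic parameters are placeholders typical of consumer RGB-D cameras, not values from the paper.

```python
import numpy as np

def depth_to_pointcloud(depth_mm, fx, fy, cx, cy):
    """Back-project a depth image (mm) to 3D points in the camera frame.

    Standard pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    Pixels with zero depth (no measurement) are discarded.
    """
    v, u = np.indices(depth_mm.shape)
    z = depth_mm.astype(np.float64)
    valid = z > 0
    x = (u[valid] - cx) * z[valid] / fx
    y = (v[valid] - cy) * z[valid] / fy
    return np.column_stack([x, y, z[valid]])

# Example with a synthetic 480x640 depth map of a tilted plane about 1 m away.
rows, cols = 480, 640
depth = 1000.0 + 0.5 * np.indices((rows, cols))[1]   # mm, increases left to right
cloud = depth_to_pointcloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)   # (307200, 3) points, ready for registration to the X-ray/CT frame
```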
Affiliation(s)
- Xiang Wang: School of Automation Science and Electrical Engineering, Beihang University, Beijing, China; Computer Aided Medical Procedures, Technische Universität München, Germany
- Severine Habert: Computer Aided Medical Procedures, Technische Universität München, Germany
- Nassir Navab: Computer Aided Medical Procedures, Technische Universität München, Germany; Johns Hopkins University, Baltimore, MD, USA
16
Lee SC, Fuerst B, Fotouhi J, Fischer M, Osgood G, Navab N. Calibration of RGBD camera and cone-beam CT for 3D intra-operative mixed reality visualization. Int J Comput Assist Radiol Surg 2016; 11:967-75. DOI: 10.1007/s11548-016-1396-1.
17
Preclinical usability study of multiple augmented reality concepts for K-wire placement. Int J Comput Assist Radiol Surg 2016; 11:1007-14. PMID: 26995603. DOI: 10.1007/s11548-016-1363-x.
Abstract
PURPOSE In many orthopedic surgeries, there is a demand for correctly placing medical instruments (e.g., K-wire or drill) to perform bone fracture repairs. The main challenge is the mental alignment of X-ray images acquired using a C-arm, the medical instruments, and the patient, which dramatically increases in complexity during pelvic surgeries. Current solutions include the continuous acquisition of many intra-operative X-ray images from various views, which results in high radiation exposure, long surgical durations, and significant effort and frustration for the surgical staff. This work conducts a preclinical usability study to test and evaluate mixed reality visualization techniques using intra-operative X-ray, optical, and RGBD imaging to augment the surgeon's view and assist accurate placement of tools. METHOD We design and perform a usability study to compare the performance of surgeons and their task load using three different mixed reality systems during K-wire placements. The three systems are interventional X-ray imaging, X-ray augmentation on 2D video, and 3D surface reconstruction augmented by digitally reconstructed radiographs and live tool visualization. RESULTS The evaluation criteria include duration, number of X-ray images acquired, placement accuracy, and the surgical task load, which are observed during 21 clinically relevant interventions performed by surgeons on phantoms. Finally, we test for statistically significant improvements and show that the mixed reality visualization leads to a significantly improved efficiency. CONCLUSION The 3D visualization of patient, tool, and DRR shows clear advantages over the conventional X-ray imaging and provides intuitive feedback to place the medical tools correctly and efficiently.
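The statistical comparison mentioned in the results could, for example, be run as a paired non-parametric test on per-intervention metrics. The sketch below uses scipy.stats.wilcoxon on invented numbers purely to illustrate the analysis pattern; the study's actual metrics, pairing, and choice of test may differ.

```python
import numpy as np
from scipy import stats

# Hypothetical per-intervention metrics for two visualization conditions
# (e.g., conventional X-ray vs. 3D mixed reality); values are illustrative only.
xray_only = {
    "duration_s": np.array([412, 388, 455, 401, 376, 430, 397]),
    "num_xrays":  np.array([24, 19, 28, 22, 25, 21, 26]),
    "error_mm":   np.array([4.1, 3.6, 5.0, 4.4, 3.9, 4.7, 4.2]),
}
mixed_reality = {
    "duration_s": np.array([305, 296, 340, 312, 288, 330, 301]),
    "num_xrays":  np.array([9, 7, 11, 8, 10, 9, 8]),
    "error_mm":   np.array([3.2, 3.0, 3.8, 3.5, 3.1, 3.6, 3.3]),
}

for metric in xray_only:
    # Paired, non-parametric test on per-intervention differences.
    stat, p = stats.wilcoxon(xray_only[metric], mixed_reality[metric])
    print(f"{metric}: W={stat:.1f}, p={p:.4f}")
```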
18
Wang M, Ding H, Wang X, Wang G. Target visibility enhancement for C-arm cone beam CT-fluoroscopy-guided hepatic needle placement: implementation and accuracy evaluation. Int J Comput Assist Radiol Surg 2014; 10:263-73. PMID: 24830534. DOI: 10.1007/s11548-014-1070-4.
Abstract
PURPOSE Fluoroscopy-guided hepatic intervention is limited by target visibility and respiratory movement. A feasible procedure for visibility enhancement of the key regions and targets in 2D fluoroscopic images is needed. A system was developed to improve targeting by integrating the forward projection of objects extracted from 3D cone beam CT (CBCT) volumes. The target matching accuracy during regular respiration was measured to evaluate the system. METHOD 3D CBCT abdominal volumes were acquired and segmented to extract different regions, including the diaphragm, hepatic vessels, bony structures, and hepatic tumor. The segmented result was rendered and projected to generate augmented fluoroscopy fusion images. The target matching accuracy achieved by applying these procedures was evaluated for hepatic intervention guidance. RESULT Quantitative assessment of the target matching accuracy in the upper section of the liver was performed for eight targets from four subjects. The 2D and 3D target matching accuracies were 0.98±0.37 and 1.47±0.26 mm, respectively. The 2D target matching accuracy was 1.46±0.67 mm for the target in the lower liver. This accuracy should be acceptable for the 5 mm safety margin required in clinical use. CONCLUSION Visibility of targets in 2D fluoroscopy was enhanced to improve interactive navigation guidance for hepatic needle placement. The target matching accuracy for C-arm cone beam CT-fluoroscopy-guided hepatic needle targeting was sufficient for clinical use.
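The overlay described here relies on forward-projecting structures segmented from the CBCT volume into the fluoroscopic geometry. A minimal projection sketch follows, using a placeholder 3x4 projection matrix; note that it reports the 2D matching error in pixels, whereas converting to millimetres would additionally require the detector pixel spacing and magnification.

```python
import numpy as np

def project_points(P, points_3d):
    """Project 3D points (mm, CBCT frame) into the fluoroscopic image (pixels)
    with a 3x4 projection matrix P = K [R | t]."""
    homog = np.c_[points_3d, np.ones(len(points_3d))]
    proj = homog @ P.T
    return proj[:, :2] / proj[:, 2:3]

# Placeholder C-arm geometry: intrinsics K and a CBCT-to-source pose [R | t].
K = np.array([[1200.0, 0.0, 512.0],
              [0.0, 1200.0, 512.0],
              [0.0, 0.0, 1.0]])
Rt = np.c_[np.eye(3), np.array([5.0, -3.0, 600.0])]   # mm
P = K @ Rt

# Segmented target (e.g., tumour centroid) and the needle tip localized in 2D.
target_3d = np.array([[12.0, -8.0, 40.0]])
target_2d = project_points(P, target_3d)[0]
needle_tip_2d = np.array([535.0, 497.0])
print("projected target (px):", np.round(target_2d, 1))
print("2D target matching error (px):", np.linalg.norm(target_2d - needle_tip_2d))
```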
Affiliation(s)
- Mengjiao Wang: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Room C249, Beijing, 100084, China
19
A low-cost tracked C-arm (TC-arm) upgrade system for versatile quantitative intraoperative imaging. Int J Comput Assist Radiol Surg 2013; 9:695-711. PMID: 24323400. DOI: 10.1007/s11548-013-0957-9.
Abstract
PURPOSE C-arm fluoroscopy is frequently used in clinical applications as a low-cost and mobile real-time qualitative assessment tool. C-arms, however, are not widely accepted for applications involving quantitative assessments, mainly due to the lack of reliable and low-cost position tracking methods, as well as adequate calibration and registration techniques. The solution suggested in this work is a tracked C-arm (TC-arm) which employs a low-cost sensor tracking module that can be retrofitted to any conventional C-arm for tracking the individual joints of the device. METHODS Registration and offline calibration methods were developed that allow accurate tracking of the gantry and determination of the exact intrinsic and extrinsic parameters of the imaging system for any acquired fluoroscopic image. The performance of the system was evaluated in comparison to an Optotrak[Formula: see text] motion tracking system and by a series of experiments on accurately built ball-bearing phantoms. Accuracies of the system were determined for 2D-3D registration, three-dimensional landmark localization, and for generating panoramic stitched views in simulated intraoperative applications. RESULTS The system was able to track the center point of the gantry with an accuracy of [Formula: see text] mm or better. Accuracies of 2D-3D registrations were [Formula: see text] mm and [Formula: see text]. Three-dimensional landmark localization had an accuracy of [Formula: see text] of the length (or [Formula: see text] mm) on average, depending on whether the landmarks were located along, above, or across the table. The overall accuracies of the two-dimensional measurements conducted on stitched panoramic images of the femur and lumbar spine were 2.5 [Formula: see text] 2.0 % [Formula: see text] and [Formula: see text], respectively. CONCLUSION The TC-arm system has the potential to achieve sophisticated quantitative fluoroscopy assessment capabilities using an existing C-arm imaging system. This technology may be useful to improve the quality of orthopedic surgery and interventional radiology.
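Once intrinsic and extrinsic parameters are known for each tracked image, a landmark identified in two views can be localized in 3D by triangulation. The sketch below demonstrates this with OpenCV's triangulatePoints and placeholder C-arm geometry; because the image points are synthesized without noise, the recovered point matches the ground truth almost exactly.

```python
import cv2
import numpy as np

# Placeholder projection matrices P = K [R | t] for two tracked C-arm poses.
K = np.array([[1100.0, 0.0, 500.0],
              [0.0, 1100.0, 500.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.c_[np.eye(3), np.array([0.0, 0.0, 900.0])]
theta = np.radians(30.0)
R2 = np.array([[np.cos(theta), 0.0, np.sin(theta)],
               [0.0, 1.0, 0.0],
               [-np.sin(theta), 0.0, np.cos(theta)]])
P2 = K @ np.c_[R2, np.array([-40.0, 0.0, 920.0])]

def project(P, X):
    """Project a 3D point (mm) to pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# The same ball-bearing landmark identified in both fluoroscopic images (noise-free here).
landmark_3d_true = np.array([15.0, -10.0, 30.0])
pt1 = project(P1, landmark_3d_true).reshape(2, 1)
pt2 = project(P2, landmark_3d_true).reshape(2, 1)

X_h = cv2.triangulatePoints(P1, P2, pt1, pt2)       # homogeneous 4x1 result
landmark_3d = (X_h[:3] / X_h[3]).ravel()
print("triangulated landmark (mm):", np.round(landmark_3d, 2))
print("localization error (mm):", np.linalg.norm(landmark_3d - landmark_3d_true))
```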