1. Bhimreddy M, Menta AK, Fuleihan AA, Davidar AD, Kramer P, Jillala R, Najeed M, Wang X, Theodore N. Beyond Pedicle Screw Placement: Future Minimally Invasive Applications of Robotics in Spine Surgery. Neurosurgery 2025; 96:S94-S102. [PMID: 39950789] [DOI: 10.1227/neu.0000000000003335]
Abstract
Advancements in spine surgery have dramatically enhanced minimally invasive techniques, most prominently through the integration of robotic systems. Although pedicle screw placement remains the most widespread application of this technology, new developments are opening innovative avenues for these tools. This review explores promising applications of robotic technology in minimally invasive spinal procedures, ranging from assistance with laminectomies and vertebroplasty to pain management and the treatment of spinal tumors. We also discuss the potential for integrating artificial intelligence and augmented reality with robotic systems. If the current trajectory of research and innovation continues, there is promise of fully autonomous robotic systems that revolutionize spine surgery by processing, planning, and performing procedures without heavy reliance on the surgeon.
Affiliation(s)
- Meghana Bhimreddy
- Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
2. Shida Y, Kumagai S, Iwata H. Robotic navigation with deep reinforcement learning in transthoracic echocardiography. Int J Comput Assist Radiol Surg 2025; 20:191-202. [PMID: 39304590] [PMCID: PMC11757869] [DOI: 10.1007/s11548-024-03275-z]
Abstract
PURPOSE The search for heart components in robotic transthoracic echocardiography is time-consuming. This paper proposes an optimized robotic navigation system for heart components that uses deep reinforcement learning to achieve an efficient and effective search. METHOD The proposed method introduces (i) an optimized search behavior generation algorithm that avoids multiple local solutions and searches for the optimal solution and (ii) an optimized path generation algorithm that minimizes the search path, thereby realizing short search times. RESULTS With the proposed method, the mitral valve search reaches the optimal solution with a probability of 74.4%, the average loss in mitral valve confidence when the search stops at a local solution is 16.3%, and the average inspection time along the generated path is 48.6 s, which is 56.6% of the time required by the conventional method. CONCLUSION The results indicate that the proposed method improves search efficiency: the optimal location is found in the majority of cases, and the loss of mitral valve confidence remains low even when the search stops at a local rather than the optimal solution. This suggests that the proposed method enables accurate and fast robotic navigation to heart components.
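As a rough illustration of the search problem tackled above (and not the authors' deep-RL agent), the sketch below runs a greedy hill climb with random restarts over a discretized probe-pose space to maximize a view-confidence score while avoiding weak local optima; the confidence function and pose parametrization are invented for the example.

```python
# Toy sketch: confidence-driven probe-pose search with random restarts.
# `view_confidence` is a hypothetical stand-in for a learned view-quality score.
import numpy as np

rng = np.random.default_rng(0)

def view_confidence(pose):
    """Hypothetical view-quality score in [0, 1] for a probe pose (x, y, tilt)."""
    x, y, t = pose
    # Two bumps: a weaker local optimum near (0.2, 0.2) and the global one near (0.7, 0.6).
    local = 0.6 * np.exp(-((x - 0.2) ** 2 + (y - 0.2) ** 2 + t ** 2) / 0.02)
    best = 1.0 * np.exp(-((x - 0.7) ** 2 + (y - 0.6) ** 2 + (t - 0.3) ** 2) / 0.02)
    return max(local, best)

def hill_climb(start, step=0.05, iters=200):
    pose, score = np.array(start, float), view_confidence(start)
    for _ in range(iters):
        # Try small moves along each pose axis and keep the best improvement.
        candidates = [pose + step * d for d in np.vstack([np.eye(3), -np.eye(3)])]
        scores = [view_confidence(c) for c in candidates]
        if max(scores) <= score:
            break  # local optimum reached
        pose, score = candidates[int(np.argmax(scores))], max(scores)
    return pose, score

# Random restarts reduce the chance of stopping at the weaker local solution.
best_pose, best_score = max(
    (hill_climb(rng.uniform(0, 1, size=3)) for _ in range(10)),
    key=lambda ps: ps[1],
)
print(f"best pose {np.round(best_pose, 2)}, confidence {best_score:.2f}")
```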
Affiliation(s)
- Yuuki Shida
- Graduate School of Creative Science and Engineering, Waseda University, Tokyo, 169-8050, Japan.
- Souto Kumagai
- Graduate School of Creative Science and Engineering, Waseda University, Tokyo, 169-8050, Japan
- Hiroyasu Iwata
- Faculty of Science and Engineering, Waseda University, Tokyo, 169-8050, Japan
3. Jiang B, Wang L, Xu K, Hossbach M, Demir A, Rajan P, Taylor RH, Moghekar A, Foroughi P, Kazanzides P, Boctor EM. Wearable Mechatronic Ultrasound-integrated AR Navigation System for Lumbar Puncture Guidance. IEEE Trans Med Robot Bionics 2023; 5:966-977. [PMID: 38779126] [PMCID: PMC11107797] [DOI: 10.1109/tmrb.2023.3319963]
Abstract
As one of the most commonly performed spinal interventions in routine clinical practice, lumbar punctures are usually done with only hand palpation and trial-and-error. Failures can prolong procedure time and introduce complications such as cerebrospinal fluid leaks and headaches. Therefore, an effective needle insertion guidance method is desired. In this work, we present a complete lumbar puncture guidance system with the integration of (1) a wearable mechatronic ultrasound imaging device, (2) volume-reconstruction and bone surface estimation algorithms and (3) two alternative augmented reality user interfaces for needle guidance, including a HoloLens-based and a tablet-based solution. We conducted a quantitative evaluation of the end-to-end navigation accuracy, which shows that our system can achieve an overall needle navigation accuracy of 2.83 mm and 2.76 mm for the Tablet-based and the HoloLens-based solutions, respectively. In addition, we conducted a preliminary user study to qualitatively evaluate the effectiveness and ergonomics of our system on lumbar phantoms. The results show that users were able to successfully reach the target in an average of 1.12 and 1.14 needle insertion attempts for Tablet-based and HoloLens-based systems, respectively, exhibiting the potential to reduce the failure rates of lumbar puncture procedures with the proposed lumbar-puncture guidance.
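For context on how end-to-end accuracy figures of this kind are typically obtained, the sketch below computes a needle tip translational error and an angular error between a planned trajectory and a tracked needle pose expressed in a common frame; the poses and numbers are invented, not taken from the study.

```python
# Illustrative needle navigation error computation (hypothetical values).
import numpy as np

def navigation_errors(target_tip, planned_dir, actual_tip, actual_dir):
    """Translational error (mm) and angular error (deg) of a needle placement."""
    tip_err = float(np.linalg.norm(np.asarray(actual_tip, float) - np.asarray(target_tip, float)))
    u = np.asarray(planned_dir, float) / np.linalg.norm(planned_dir)
    v = np.asarray(actual_dir, float) / np.linalg.norm(actual_dir)
    ang_err = float(np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))))
    return tip_err, ang_err

# Example with made-up values (mm): planned interlaminar target vs. tracked needle tip.
tip_err, ang_err = navigation_errors(
    target_tip=[12.0, -4.0, 55.0], planned_dir=[0.0, 0.1, 1.0],
    actual_tip=[13.8, -2.9, 56.1], actual_dir=[0.02, 0.15, 1.0],
)
print(f"tip error {tip_err:.2f} mm, angle error {ang_err:.2f} deg")
```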
Affiliation(s)
- Baichuan Jiang
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Liam Wang
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Keshuai Xu
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Alican Demir
- Clear Guide Medical Inc., Baltimore, MD 21211, USA
- Russell H. Taylor
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Abhay Moghekar
- Department of Neurology, Johns Hopkins Medical Institute, Baltimore, MD 21205, USA
- Peter Kazanzides
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
- Emad M. Boctor
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
4. Jiang Z, Salcudean SE, Navab N. Robotic ultrasound imaging: State-of-the-art and future perspectives. Med Image Anal 2023; 89:102878. [PMID: 37541100] [DOI: 10.1016/j.media.2023.102878]
Abstract
Ultrasound (US) is one of the most widely used modalities for clinical intervention and diagnosis due to the merits of providing non-invasive, radiation-free, and real-time images. However, free-hand US examinations are highly operator-dependent. Robotic US systems (RUSS) aim to overcome this shortcoming by offering reproducibility, while also improving dexterity and enabling intelligent, anatomy- and disease-aware imaging. In addition to enhancing diagnostic outcomes, RUSS also holds the potential to provide medical interventions for populations suffering from a shortage of experienced sonographers. In this paper, we categorize RUSS as teleoperated or autonomous. For teleoperated RUSS, we summarize both the technical developments and the clinical evaluations. The survey then focuses on recent work on autonomous robotic US imaging. We show that machine learning and artificial intelligence are the key techniques enabling intelligent, patient- and process-specific, motion- and deformation-aware robotic image acquisition. We also show that research on artificial intelligence for autonomous RUSS has directed the community toward understanding and modeling expert sonographers' semantic reasoning and action; we call this process the recovery of the "language of sonography". This side result of research on autonomous robotic US acquisition could prove as valuable and essential as the progress made in the robotic US examination itself. This article provides both engineers and clinicians with a comprehensive understanding of RUSS by surveying the underlying techniques. Additionally, we present the challenges the scientific community must address in the coming years to achieve the ultimate goal of developing intelligent robotic sonographer colleagues, capable of collaborating with human sonographers in dynamic environments to enhance both diagnostic and intraoperative imaging.
Affiliation(s)
- Zhongliang Jiang
- Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany.
- Septimiu E Salcudean
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC V6T 1Z4, Canada
- Nassir Navab
- Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany; Computer Aided Medical Procedures, Johns Hopkins University, Baltimore, MD, USA
5. Shida Y, Sugawara M, Tsumura R, Chiba H, Uejima T, Iwata H. Diagnostic posture control system for seated-style echocardiography robot. Int J Comput Assist Radiol Surg 2023; 18:887-897. [PMID: 36881353] [PMCID: PMC10113306] [DOI: 10.1007/s11548-022-02829-3]
Abstract
PURPOSE Conventional robotic ultrasound systems are used with patients in the supine position. A limitation of these systems is that it is difficult to evacuate the patient in an emergency (e.g., patient discomfort or system failure) because the patient is confined between the robot and the bed. We therefore conducted a feasibility study of seated-style echocardiography using a robot. METHOD Preliminary experiments were conducted to verify two points: (1) the diagnostic image quality as a function of the sitting posture angle and (2) the physical load as a function of the sitting posture angle. To reduce the physical burden, two unique mechanisms were incorporated into the system: (1) a leg pendulum base mechanism that reduces the load on the legs as the lateral bending angle increases, and (2) division of the roll angle between lumbar lateral bending and thoracic rotation mechanisms. RESULTS The preliminary results demonstrated that adjusting the diagnostic posture angle allowed views containing cardiac disease features to be obtained, as in a conventional examination. The results also demonstrated that the body-load reduction mechanisms incorporated into the system reduced the physical load during seated echocardiography. Furthermore, the system provided greater safety and shorter evacuation times than conventional systems. CONCLUSION These results indicate that diagnostic echocardiographic images can be obtained with seated-style echocardiography, and they suggest that the proposed system reduces the physical load while guaranteeing a sense of safety and rapid emergency evacuation. Together, these findings demonstrate the feasibility of a seated-style echocardiography robot.
Affiliation(s)
- Yuuki Shida
- Graduate School of Creative Science and Engineering, Waseda University, Tokyo, 169-8050, Japan.
- Masami Sugawara
- Graduate School of Creative Science and Engineering, Waseda University, Tokyo, 169-8050, Japan
- Ryosuke Tsumura
- Global Robot Academia Laboratory, Waseda University, Tokyo, 169-8050, Japan
- Haruaki Chiba
- NSK Ltd, Technology Development Department 1, New Field Products Development Center, Technology Development Division Headquarters, Kanagawa, 251-8501, Japan
- Hiroyasu Iwata
- Faculty of Science and Engineering, Waseda University, Tokyo, 169-8050, Japan
6. Fan X, Zhu Q, Tu P, Joskowicz L, Chen X. A review of advances in image-guided orthopedic surgery. Phys Med Biol 2023; 68. [PMID: 36595258] [DOI: 10.1088/1361-6560/acaae9]
Abstract
Orthopedic surgery remains technically demanding due to the complex anatomical structures and cumbersome surgical procedures. The introduction of image-guided orthopedic surgery (IGOS) has significantly decreased the surgical risk and improved the operation results. This review focuses on the application of recent advances in artificial intelligence (AI), deep learning (DL), augmented reality (AR) and robotics in image-guided spine surgery, joint arthroplasty, fracture reduction and bone tumor resection. For the pre-operative stage, key technologies of AI and DL based medical image segmentation, 3D visualization and surgical planning procedures are systematically reviewed. For the intra-operative stage, the development of novel image registration, surgical tool calibration and real-time navigation are reviewed. Furthermore, the combination of the surgical navigation system with AR and robotic technology is also discussed. Finally, the current issues and prospects of the IGOS system are discussed, with the goal of establishing a reference and providing guidance for surgeons, engineers, and researchers involved in the research and development of this area.
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Qiyang Zhu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Leo Joskowicz
- School of Computer Science and Engineering, The Hebrew University of Jerusalem, Jerusalem, Israel
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China
7. Gao C, Phalen H, Margalit A, Ma JH, Ku PC, Unberath M, Taylor RH, Jain A, Armand M. Fluoroscopy-Guided Robotic System for Transforaminal Lumbar Epidural Injections. IEEE Trans Med Robot Bionics 2022; 4:901-909. [PMID: 37790985] [PMCID: PMC10544812] [DOI: 10.1109/tmrb.2022.3196321]
Abstract
We present an autonomous robotic spine needle injection system using fluoroscopic image-based navigation. Our system includes patient-specific planning, intra-operative image-based 2D/3D registration and navigation, and automatic robot-guided needle injection. We performed intensive simulation studies to validate the registration accuracy, achieving a mean spine vertebra registration error of 0.8 ± 0.3 mm in translation and 0.9 ± 0.7 degrees in rotation, and a mean injection device registration error of 0.2 ± 0.6 mm and 1.2 ± 1.3 degrees. We then conducted cadaveric studies comparing our system with an experienced clinician's free-hand injections. Robotic injections achieved a mean needle tip translational error of 5.1 ± 2.4 mm and a needle orientation error of 3.6 ± 1.9 degrees, compared with 7.6 ± 2.8 mm and 9.9 ± 4.7 degrees for the clinician's free-hand injections. During the injections, all needle tips were placed within the safety zones defined for this application. The results suggest the feasibility of using our image-guided robotic injection system for spinal orthopedic applications.
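As a schematic of the optimization at the heart of 2D/3D registration (here a simple landmark-based, single-view least-squares fit rather than the paper's intensity-based pipeline), the sketch below recovers a rigid pose from simulated fluoroscopic projections; the intrinsics and landmark coordinates are made up.

```python
# Landmark-based 2D/3D pose recovery: minimize reprojection error over a rigid pose.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

K = np.array([[1200.0, 0.0, 256.0],   # assumed pinhole intrinsics of the C-arm
              [0.0, 1200.0, 256.0],
              [0.0, 0.0, 1.0]])

def project(points_3d, rotvec, t):
    """Project 3D landmarks to the detector plane for a candidate pose."""
    cam = Rotation.from_rotvec(rotvec).apply(points_3d) + t
    uvw = cam @ K.T
    return uvw[:, :2] / uvw[:, 2:3]

def residuals(params, pts3d, pts2d):
    return (project(pts3d, params[:3], params[3:]) - pts2d).ravel()

# Made-up vertebral landmark positions (mm) and a "ground-truth" pose.
pts3d = np.array([[0, 0, 0], [30, 5, 2], [-25, 10, -4], [5, -20, 8], [15, 25, -6]], float)
rv_true, t_true = np.array([0.05, -0.1, 0.02]), np.array([5.0, -3.0, 600.0])
pts2d = project(pts3d, rv_true, t_true) + np.random.default_rng(1).normal(0, 0.3, (5, 2))

fit = least_squares(residuals, x0=np.r_[0, 0, 0, 0, 0, 550.0], args=(pts3d, pts2d))
rot_err = np.degrees(np.linalg.norm(fit.x[:3] - rv_true))  # approximate for small angles
print(f"translation error {np.linalg.norm(fit.x[3:] - t_true):.2f} mm, "
      f"rotation error ~{rot_err:.2f} deg")
```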
Affiliation(s)
- Cong Gao
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA 21211
- Henry Phalen
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA 21211
- Adam Margalit
- Department of Orthopaedic Surgery, Baltimore, MD, USA 21224
- Justin H Ma
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA 21211
- Ping-Cheng Ku
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA 21211
- Mathias Unberath
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA 21211
- Russell H Taylor
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA 21211
- Amit Jain
- Department of Orthopaedic Surgery, Baltimore, MD, USA 21224
- Mehran Armand
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA 21211
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA 21211
- Department of Orthopaedic Surgery, Baltimore, MD, USA 21224
- Johns Hopkins Applied Physics Laboratory, Baltimore, MD, USA 21224
8. Wendler T, van Leeuwen FWB, Navab N, van Oosterom MN. How molecular imaging will enable robotic precision surgery: The role of artificial intelligence, augmented reality, and navigation. Eur J Nucl Med Mol Imaging 2021; 48:4201-4224. [PMID: 34185136] [PMCID: PMC8566413] [DOI: 10.1007/s00259-021-05445-6]
Abstract
Molecular imaging is one of the pillars of precision surgery. Its applications range from early diagnostics to therapy planning, execution, and the accurate assessment of outcomes. In particular, molecular imaging solutions are in high demand in minimally invasive surgical strategies, such as the substantially growing field of robotic surgery. This review aims to connect the molecular imaging and nuclear medicine community to the rapidly expanding armory of surgical medical devices. Such devices entail technologies ranging from artificial intelligence and computer-aided visualization (software) to innovative molecular imaging modalities and surgical navigation (hardware). We discuss these technologies according to their role at different steps of the surgical workflow, from surgical decision-making and planning, through target localization and excision guidance, to (back-table) surgical verification. This provides a glimpse of how innovations from these technology fields can realize an exciting future for the molecular imaging and surgery communities.
Affiliation(s)
- Thomas Wendler
- Chair for Computer Aided Medical Procedures and Augmented Reality, Technische Universität München, Boltzmannstr. 3, 85748 Garching bei München, Germany
- Fijs W. B. van Leeuwen
- Department of Radiology, Interventional Molecular Imaging Laboratory, Leiden University Medical Center, Leiden, The Netherlands
- Department of Urology, The Netherlands Cancer Institute - Antonie van Leeuwenhoek Hospital, Amsterdam, The Netherlands
- Orsi Academy, Melle, Belgium
- Nassir Navab
- Chair for Computer Aided Medical Procedures and Augmented Reality, Technische Universität München, Boltzmannstr. 3, 85748 Garching bei München, Germany
- Chair for Computer Aided Medical Procedures, Laboratory for Computational Sensing + Robotics, Johns Hopkins University, Baltimore, MD, USA
- Matthias N. van Oosterom
- Department of Radiology, Interventional Molecular Imaging Laboratory, Leiden University Medical Center, Leiden, The Netherlands
- Department of Urology, The Netherlands Cancer Institute - Antonie van Leeuwenhoek Hospital, Amsterdam, The Netherlands
9. Tattoo tomography: Freehand 3D photoacoustic image reconstruction with an optical pattern. Int J Comput Assist Radiol Surg 2021; 16:1101-1110. [PMID: 33993409] [PMCID: PMC8260532] [DOI: 10.1007/s11548-021-02399-w]
Abstract
Purpose Photoacoustic tomography (PAT) is a novel imaging technique that can spatially resolve both morphological and functional tissue properties, such as vessel topology and tissue oxygenation. While this capacity makes PAT a promising modality for the diagnosis, treatment, and follow-up of various diseases, a current drawback is the limited field of view provided by the conventionally applied 2D probes.
Methods In this paper, we present a novel approach to 3D reconstruction of PAT data (Tattoo tomography) that does not require an external tracking system and can smoothly be integrated into clinical workflows. It is based on an optical pattern placed on the region of interest prior to image acquisition. This pattern is designed in a way that a single tomographic image of it enables the recovery of the probe pose relative to the coordinate system of the pattern, which serves as a global coordinate system for image compounding.
Results To investigate the feasibility of Tattoo tomography, we assessed the quality of 3D image reconstruction with experimental phantom data and in vivo forearm data. The results obtained with our prototype indicate that the Tattoo method enables the accurate and precise 3D reconstruction of PAT data and may be better suited for this task than the baseline method using optical tracking.
Conclusions In contrast to previous approaches to 3D ultrasound (US) or PAT reconstruction, the Tattoo approach requires neither complex external hardware nor training data acquired for a specific application. It could thus become a valuable tool for clinical freehand PAT.
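The compounding step shared by freehand 3D US and PAT reconstruction can be sketched as follows: once each 2D frame has a pose (here it would come from the pattern, in general from any tracking source), its pixels are mapped into a global frame through a chain of homogeneous transforms. The pixel spacing, calibration matrix, and poses below are assumptions for illustration.

```python
# Minimal freehand compounding sketch: tracked 2D frames -> global 3D point cloud.
import numpy as np

PIXEL_SPACING = (0.1, 0.1)          # mm per pixel (assumed)
T_PROBE_IMAGE = np.eye(4)           # image-to-probe calibration (assumed identity)

def pixels_to_world(pixel_uv, T_world_probe):
    """Map (u, v) pixel coordinates of one 2D frame into world coordinates (mm)."""
    uv = np.asarray(pixel_uv, float)
    pts_img = np.column_stack([uv[:, 0] * PIXEL_SPACING[0],
                               uv[:, 1] * PIXEL_SPACING[1],
                               np.zeros(len(uv)), np.ones(len(uv))])
    return (T_world_probe @ T_PROBE_IMAGE @ pts_img.T).T[:, :3]

# Two frames with made-up poses: the second frame is shifted 5 mm in elevation.
T0 = np.eye(4)
T1 = np.eye(4); T1[2, 3] = 5.0
cloud = np.vstack([
    pixels_to_world([[10, 20], [50, 80]], T0),
    pixels_to_world([[10, 20], [50, 80]], T1),
])
print(cloud)  # compounded points; a regular volume is then resampled from these
```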
10. Chen S, Li Z, Lin Y, Wang F, Cao Q. Automatic ultrasound scanning robotic system with optical waveguide-based force measurement. Int J Comput Assist Radiol Surg 2021; 16:1015-1025. [PMID: 33939078] [DOI: 10.1007/s11548-021-02385-2]
Abstract
PURPOSE The three-dimensional (3D) ultrasound (US) imaging realized by continuous scanning of a region is of great value for medical diagnosis and robot-assisted needle insertion. During scanning, the contact force and posture between the probe and skin of the patient are crucial factors that determine the quality of US imaging. We propose a robotic system for automatic scanning of curved surfaces with a stable contact force and vertical contact posture (the probe is parallel to the normal of the surface at the contact point). METHODS A 6-DOF robotic arm is used to hold and drive a two-dimensional (2D) US probe to complete automatic scanning. Further, a path-planning strategy is proposed to generate the scan path covering the selected area automatically. We also developed a novel force-measuring device based on optical waveguides to measure the distributed contact force and contact posture. Based on the measured force and posture, the robotic arm automatically adjusts the position and orientation of the probe and maintains a stable contact force and vertical contact posture at each scan point. RESULTS The novel force-measuring device is easy to fabricate, integrates with the probe and has the capacity of measuring the force distributed on the contact surface and estimating the contact posture. The experimental results of automatic scanning of a US phantom and parts of the human body demonstrate that the proposed system performs well in automatically scanning curved surfaces, maintaining a stable contact force and vertical contact posture and producing a good quality 3D US volume. CONCLUSION An automatic US scanning robotic system with an optical waveguide-based force-measuring device was developed and tested successfully. Experimental results demonstrated the feasibility of the proposed system to scan the human body.
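One way distributed contact-force readings can drive a posture correction, sketched below with an invented ring of sensing elements rather than the paper's optical-waveguide design: an off-axis center of pressure indicates a tilt relative to the local surface normal and suggests a small corrective rotation. The layout, gains, and sign conventions are assumptions.

```python
# Hedged sketch: center-of-pressure from distributed force readings -> tilt correction.
import numpy as np

# Assumed: 8 sensing elements evenly spaced on a 10 mm radius ring around the probe tip.
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
sensor_xy = 10.0 * np.column_stack([np.cos(angles), np.sin(angles)])  # mm

def tilt_correction(forces, gain_deg_per_mm=0.5):
    """Return (total contact force, corrective tilt about x and y in degrees)."""
    f = np.asarray(forces, float)
    total = f.sum()
    cop = sensor_xy.T @ f / max(total, 1e-6)      # center of pressure (mm)
    # Rotate the probe so the pressure centroid moves back toward the probe axis.
    tilt_about_x = -gain_deg_per_mm * cop[1]
    tilt_about_y = gain_deg_per_mm * cop[0]
    return total, tilt_about_x, tilt_about_y

# Example: heavier loading on the +x side of the ring suggests a tilt about y.
total, rx, ry = tilt_correction([1.8, 1.4, 1.0, 0.7, 0.6, 0.7, 1.0, 1.4])
print(f"contact force {total:.1f} N, tilt about x {rx:.2f} deg, about y {ry:.2f} deg")
```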
Affiliation(s)
- Shihang Chen
- School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Zhaojun Li
- Shanghai General Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Yanping Lin
- School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Fang Wang
- Shanghai General Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Qixin Cao
- School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China
11. Ipsen S, Wulff D, Kuhlemann I, Schweikard A, Ernst F. Towards automated ultrasound imaging-robotic image acquisition in liver and prostate for long-term motion monitoring. Phys Med Biol 2021; 66. [PMID: 33770768] [DOI: 10.1088/1361-6560/abf277]
Abstract
Real-time volumetric (4D) ultrasound has shown high potential for diagnostic and therapy guidance tasks. One of the main drawbacks of ultrasound imaging to date is the reliance on manual probe positioning and the resulting user dependence. Robotic assistance could help overcome this issue and facilitate the acquisition of long-term image data to observe dynamic processes in vivo over time. The aim of this study is to assess the feasibility of robotic probe manipulation and organ motion quantification during extended imaging sessions. The system consists of a collaborative robot and a 4D ultrasound system providing real-time data access. Five healthy volunteers received liver and prostate scans during free breathing over 30 min. Initial probe placement was performed with real-time remote control with a predefined contact force of 10 N. During scan acquisition, the probe position was continuously adjusted to the body surface motion using impedance control. Ultrasound volumes, the pose of the end-effector and the estimated contact forces were recorded. For motion analysis, one anatomical landmark was manually annotated in a subset of ultrasound frames for each experiment. Probe contact was uninterrupted over the entire scan duration in all ten sessions. Organ drift and imaging artefacts were successfully compensated using remote control. The median contact force along the probe's longitudinal axis was 10.0 N with maximum values of 13.2 and 21.3 N for liver and prostate, respectively. Forces exceeding 11 N only occurred in 0.3% of the time. Probe and landmark motion were more pronounced in the liver, with median interquartile ranges of 1.5 and 9.6 mm, compared to 0.6 and 2.7 mm in the prostate. The results show that robotic ultrasound imaging with dynamic force control can be used for stable, long-term imaging of anatomical regions affected by motion. The system facilitates the acquisition of 4D image data in vivo over extended scanning periods for the first time and holds the potential to be used for motion monitoring for therapy guidance as well as diagnostic tasks.
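A stripped-down admittance-style loop of the kind used to hold a roughly constant contact force against breathing motion is sketched below; the 10 N setpoint matches the study, but the tissue model, gain, and update rate are invented, and a real controller would run on the robot's impedance interface.

```python
# Toy admittance loop: advance/retract the probe along its axis to hold a force setpoint.
import numpy as np

TARGET_N, STIFFNESS, GAIN, DT = 10.0, 400.0, 0.002, 0.01  # N, N/m, m/(N*s), s

def surface_height(t):
    """Assumed breathing motion of the skin surface along the probe axis (m)."""
    return 0.005 * np.sin(2 * np.pi * 0.25 * t)

probe_z, forces = 0.0, []
for k in range(3000):                       # 30 s simulated at 100 Hz
    t = k * DT
    depth = probe_z - surface_height(t)     # indentation of the soft tissue
    force = max(0.0, STIFFNESS * depth)     # simple spring contact model
    probe_z += GAIN * (TARGET_N - force) * DT   # admittance update along the probe axis
    forces.append(force)

forces = np.array(forces[500:])             # discard the initial transient
print(f"force mean {forces.mean():.2f} N, max {forces.max():.2f} N")
```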
Affiliation(s)
- Svenja Ipsen
- Institute for Robotics and Cognitive Systems, University of Luebeck, Luebeck, Germany; Fraunhofer Research Institution for Individualized and Cell-Based Medical Engineering IMTE, Luebeck, Germany
- Daniel Wulff
- Institute for Robotics and Cognitive Systems, University of Luebeck, Luebeck, Germany
- Ivo Kuhlemann
- Institute for Robotics and Cognitive Systems, University of Luebeck, Luebeck, Germany
- Achim Schweikard
- Institute for Robotics and Cognitive Systems, University of Luebeck, Luebeck, Germany
- Floris Ernst
- Institute for Robotics and Cognitive Systems, University of Luebeck, Luebeck, Germany
12. Housden J, Wang S, Bao X, Zheng J, Skelton E, Matthew J, Noh Y, Eltiraifi O, Singh A, Singh D, Rhode K. Towards Standardized Acquisition with a Dual-probe Ultrasound Robot for Fetal Imaging. IEEE Robot Autom Lett 2021; 6:1059-1065. [PMID: 33912664] [PMCID: PMC7610692] [DOI: 10.1109/lra.2021.3056033]
Abstract
Standardized acquisitions and diagnoses using robots and AI would potentially increase the general usability and reliability of medical ultrasound. Working towards this prospect, this paper presents the recent developments of a standardized acquisition workflow using a novel dual-probe ultrasound robot, for a project known as intelligent Fetal Imaging and Diagnosis (iFIND). The workflow includes an abdominal surface mapping step to obtain a non-parametric spline surface, a rule-based end-point calculation method to position each individual joint, and a motor synchronization method to achieve a smooth motion towards a target point. The design and implementation of the robot are first presented, and the proposed workflow is then explained in detail with simulation and volunteer experiments performed and analyzed. The closed-form analytical solution to this motion planning problem demonstrated reliable performance in controlling the robot to move towards the expected scanning areas, and the calculated proximity of the robot to the surface shows that the robot maintains a safe distance while moving around the abdomen. The volunteer study successfully demonstrated the reliable operation and controllability of the robot in acquiring the desired ultrasound views. Our future work will focus on improving the motion planning and on integrating the proposed standardized acquisition workflow with newly developed ultrasound image processing methods to obtain diagnostic results in an accurate and consistent way.
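The abdominal surface-mapping idea can be illustrated with a generic smoothing-spline fit: sample points from a mapping sweep define a surface that is then queried for the height and local slope above a scan target. The sample data and smoothing factor below are invented, and the paper's own non-parametric spline construction differs.

```python
# Generic surface-mapping sketch: fit a smooth spline to sparse surface samples
# and query height plus local normal at a desired scan point.
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

rng = np.random.default_rng(0)
# Made-up surface samples (cm): a gently domed abdomen with measurement noise.
x = rng.uniform(-10, 10, 120)
y = rng.uniform(-15, 15, 120)
z = 8.0 - 0.03 * x**2 - 0.015 * y**2 + rng.normal(0, 0.05, 120)

surface = SmoothBivariateSpline(x, y, z, kx=3, ky=3, s=1.0)

# Height and local slope at a desired scan point -> probe position and approach direction.
xq, yq = 3.0, -5.0
zq = float(surface.ev(xq, yq))
dzdx = float(surface.ev(xq, yq, dx=1))
dzdy = float(surface.ev(xq, yq, dy=1))
normal = np.array([-dzdx, -dzdy, 1.0])
normal /= np.linalg.norm(normal)
print(f"surface height {zq:.2f} cm, approach direction {np.round(-normal, 3)}")
```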
Affiliation(s)
- James Housden
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, UK
- Shuangyi Wang
- State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
- Xianqiang Bao
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, UK
- Jia Zheng
- School of General Engineering, Beihang University, Beijing 100191, China
- Emily Skelton
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, UK
- Jacqueline Matthew
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, UK
- Yohan Noh
- Department of Mechanical and Aerospace Engineering, Brunel University, London UB8 3PH, UK
- Olla Eltiraifi
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, UK
- Kawal Rhode
- School of Biomedical Engineering and Imaging Sciences, King’s College London, London SE1 7EH, UK
13. von Haxthausen F, Böttger S, Wulff D, Hagenah J, García-Vázquez V, Ipsen S. Medical Robotics for Ultrasound Imaging: Current Systems and Future Trends. Curr Robot Rep 2021; 2:55-71. [PMID: 34977593] [PMCID: PMC7898497] [DOI: 10.1007/s43154-020-00037-y]
Abstract
Purpose of Review
This review provides an overview of the most recent robotic ultrasound systems that have contemporary emerged over the past five years, highlighting their status and future directions. The systems are categorized based on their level of robot autonomy (LORA).
Recent Findings
Teleoperated systems show the highest level of technical maturity. Collaborative assisting and autonomous systems are still in the research phase, with a focus on ultrasound image processing and force adaptation strategies. However, key elements such as clinical studies and appropriate safety strategies are still missing. Future research will likely focus on artificial intelligence and virtual/augmented reality to improve image understanding and ergonomics.
Summary
A review of robotic ultrasound systems is presented in which technical specifications are first outlined. Thereafter, the literature of the past five years is subdivided into teleoperation, collaborative assistance, or autonomous systems based on LORA. Finally, future trends for robotic ultrasound systems are reviewed, with a focus on artificial intelligence and virtual/augmented reality.
Affiliation(s)
- Felix von Haxthausen
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Sven Böttger
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Daniel Wulff
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Jannis Hagenah
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Verónica García-Vázquez
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Svenja Ipsen
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
14. Chen S, Wang F, Lin Y, Shi Q, Wang Y. Ultrasound-guided needle insertion robotic system for percutaneous puncture. Int J Comput Assist Radiol Surg 2021; 16:475-484. [PMID: 33484429] [DOI: 10.1007/s11548-020-02300-1]
Abstract
PURPOSE Ultrasound (US)-guided percutaneous puncture technology enables real-time, minimally invasive interventional therapy without radiation. The localization accuracy of the puncture needle directly determines the precision and safety of the operation, and it is challenging for novices and young surgeons to achieve the desired accuracy with a free-hand puncture guided by ultrasound images. This work aims to develop a robotic system that assists surgeons in performing percutaneous punctures with high precision. METHODS A US-guided puncture robot was designed to allow the mounting and control of the needle for localization and insertion. The US probe fitted within the puncture robot was held by a passive arm. Moreover, the puncture robot was calibrated with a novel calibration method to achieve coordinate transformation between the robot and the US image. The system allows the operator to plan the puncture target and puncture path on US images, and the robot performs the needle insertion automatically. Five groups of puncture experiments were performed to verify the validity and accuracy of the proposed robotic system. RESULTS Assisted by the robotic system, the positioning and orientation accuracies of the needle insertion were 0.9 ± 0.29 mm and 0.76 ± 0.34°, respectively, an improvement over free-hand puncture (1.82 ± 0.51 mm and 2.79 ± 1.32°). Moreover, the proposed robotic system reduced the operation time and the number of needle insertions (14.28 ± 3.21 s and a single insertion) compared with free-hand puncture (25.14 ± 6.09 s and 1.96 ± 0.68 insertions). CONCLUSION A robotic system for percutaneous puncture guided by US images was developed and demonstrated. The experimental results indicate that the proposed system is accurate and feasible and can assist novices and young surgeons in performing puncture operations with increased accuracy.
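Whatever the calibration method, its end product is a transform that carries a target planned in the US image frame into the robot frame. The sketch below estimates such a rigid transform from paired fiducial points with a standard SVD-based (Kabsch) fit and then maps a planned puncture target; the point pairs are invented and the fit is not the paper's calibration procedure.

```python
# Rigid image-to-robot calibration sketch from paired fiducials (Kabsch fit).
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform (R, t) with dst ~ R @ src + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

# Made-up fiducials: positions seen in the US image frame (mm) and in the robot base frame.
img = np.array([[10, 20, 0], [40, 22, 0], [25, 60, 0], [12, 55, 0]], float)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
robot = img @ R_true.T + np.array([300.0, 150.0, 80.0])

R, t = rigid_fit(img, robot)
target_img = np.array([30.0, 40.0, 0.0])          # planned puncture target on the image
target_robot = R @ target_img + t                 # where the robot must drive the needle
print(np.round(target_robot, 2))
```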
Affiliation(s)
- Shihang Chen
- School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Fang Wang
- Shanghai General Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Yanping Lin
- School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Qiusheng Shi
- Shanghai General Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Yanli Wang
- Fudan University Shanghai Cancer Center, Shanghai, People's Republic of China
15. Tirindelli M, Victorova M, Esteban J, Kim ST, Navarro-Alarcon D, Zheng YP, Navab N. Force-Ultrasound Fusion: Bringing Spine Robotic-US to the Next “Level”. IEEE Robot Autom Lett 2020. [DOI: 10.1109/lra.2020.3009069]
16. Kaminski JT, Rafatzand K, Zhang HK. Feasibility of Robot-Assisted Ultrasound Imaging with Force Feedback for Assessment of Thyroid Diseases. Proc SPIE Int Soc Opt Eng 2020; 11315:113151D. [PMID: 32742057] [PMCID: PMC7392820] [DOI: 10.1117/12.2551118]
Abstract
Medical ultrasound is extensively used to define tissue textures and characterize lesions, and it is the modality of choice for the detection and follow-up assessment of thyroid diseases. Conventional ultrasound procedures are performed manually by a trained operator with a hand-held probe. These procedures impose a high physical and cognitive burden and yield results that are highly operator-dependent, which frequently diminishes trust in the accuracy of ultrasound imaging data in repeated assessments. A robotic ultrasound procedure, on the other hand, is an emerging paradigm that integrates a robotic arm with an ultrasound probe. It achieves automated or semi-automated scanning by controlling the scanning trajectory, the region of interest, and the contact force, making the scans more informative and comparable across subsequent examinations over a long time span. In this work, we present a technique that allows operators to reproduce reliably comparable ultrasound images by combining predefined trajectory execution with real-time force feedback control. The platform features a 7-axis robotic arm capable of 6-DoF force-torque sensing and a linear-array ultrasound probe. The measured forces and torques affecting the probe are used to adaptively modify the predefined trajectory during autonomously performed examinations, and the accuracy of the probe-phantom interaction force is evaluated. In parallel, by processing and combining ultrasound B-mode images with probe spatial information, structural features can be extracted from the scanning volume through a 3D scan. The validation was performed on a tissue-mimicking phantom containing thyroid features, and we successfully demonstrated high image registration accuracy between multiple trials.
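One simple way to quantify the reported "registration accuracy between multiple trials" once repeated sweeps are resampled onto a common grid is a normalized cross-correlation score, as sketched below on synthetic volumes standing in for the thyroid phantom.

```python
# Normalized cross-correlation between repeated robot-acquired scan volumes.
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally shaped image volumes."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

rng = np.random.default_rng(0)
phantom = rng.random((64, 64, 32))                      # stand-in thyroid phantom volume
scan1 = phantom + rng.normal(0, 0.05, phantom.shape)    # trial 1 with speckle-like noise
scan2 = phantom + rng.normal(0, 0.05, phantom.shape)    # trial 2, same trajectory
scan3 = np.roll(scan2, 4, axis=0)                       # a misaligned acquisition

print(f"aligned trials NCC = {ncc(scan1, scan2):.3f}, "
      f"misaligned NCC = {ncc(scan1, scan3):.3f}")
```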
Affiliation(s)
- Jakub T. Kaminski
- Robotics Engineering Program, Worcester Polytechnic Institute, MA, USA
- Khashayar Rafatzand
- Department of Radiology, University of Massachusetts Medical School, MA, USA
- Haichong K. Zhang
- Robotics Engineering Program, Worcester Polytechnic Institute, MA, USA
- Department of Biomedical Engineering, Worcester Polytechnic Institute, MA, USA