1. Tao Q, Liu J, Zheng Y, Yang Y, Lin C, Guang C. Evaluation of an Active Disturbance Rejection Controller for Ophthalmic Robots with Piezo-Driven Injector. Micromachines 2024; 15:833. [PMID: 39064342] [PMCID: PMC11278564] [DOI: 10.3390/mi15070833]
Abstract
Retinal vein cannulation involves puncturing an occluded vessel on the micron scale. Even a single millinewton of force can cause permanent damage. An ophthalmic robot with a piezo-driven injector is precise enough to perform this delicate procedure, but the uncertain viscoelastic characteristics of the vessel make it difficult to achieve the desired contact force without harming the retina. To address this issue, the paper uses a viscoelastic contact model to describe the mechanical characteristics of retinal blood vessels. The uncertainty in the viscoelastic properties is treated as an internal disturbance of the contact model, and an active disturbance rejection controller is then proposed to precisely control the contact force. The experimental results show that this method can precisely adjust the contact force at the millinewton level even when the viscoelastic parameters vary significantly (by up to 403.8%). The root mean square (RMS) and maximum values of the steady-state error are 0.32 mN and 0.41 mN, respectively. The response time is below 2.51 s with no obvious overshoot.
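For readers unfamiliar with active disturbance rejection control, the sketch below illustrates the general idea behind the controller described in this abstract: a linear extended state observer estimates the lumped (viscoelastic) disturbance acting on the contact force, and the control law cancels it. The plant model, gains, and force target are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# First-order linear ADRC regulating a scalar contact force.
# Plant (illustrative Kelvin-Voigt-like model, NOT the paper's):
#   F_dot = -(k/c) * F + k * v + d(t)
# where v is the commanded tip velocity (control input) and d(t)
# lumps unmodeled viscoelastic effects.

def simulate_adrc(F_ref=5e-3, T=3.0, dt=1e-3,
                  k=80.0, c=4.0,            # assumed stiffness / damping
                  b0=60.0,                  # nominal input-gain estimate
                  wo=40.0, kp=8.0):         # observer / controller bandwidth
    n = int(T / dt)
    F = 0.0                                 # true contact force [N]
    z1, z2 = 0.0, 0.0                       # ESO states: force estimate, disturbance estimate
    beta1, beta2 = 2 * wo, wo**2            # standard linear-ESO gains
    log = np.zeros(n)
    for i in range(n):
        # control law: cancel estimated disturbance, proportional tracking
        v = (kp * (F_ref - z1) - z2) / b0
        # extended state observer update (Euler integration)
        e = F - z1
        z1 += dt * (z2 + b0 * v + beta1 * e)
        z2 += dt * (beta2 * e)
        # plant update with a slowly varying "internal disturbance"
        d = 1e-3 * np.sin(2 * np.pi * 0.5 * i * dt)
        F += dt * (-(k / c) * F + k * v + d)
        log[i] = F
    return log

forces = simulate_adrc()
print(f"steady-state force: {forces[-1] * 1e3:.2f} mN")
```

Even with the deliberate mismatch between the assumed input gain (b0) and the simulated plant, the observer absorbs the error as part of the estimated disturbance, which is the property that makes ADRC attractive when viscoelastic parameters vary widely.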
Affiliation(s)
- Qiannan Tao: School of Energy and Power Engineering, Beihang University, Beijing 100191, China
- Jianjun Liu: School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China
- Yu Zheng: College of Automation and College of Artificial Intelligence, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
- Yang Yang: School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China
- Chuang Lin: School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China
- Chenhan Guang: School of Mechanical and Materials Engineering, North China University of Technology, Beijing 100144, China
2. Wang T, Li H, Pu T, Yang L. Microsurgery Robots: Applications, Design, and Development. Sensors (Basel) 2023; 23:8503. [PMID: 37896597] [PMCID: PMC10611418] [DOI: 10.3390/s23208503]
Abstract
Microsurgical techniques have been widely utilized in surgical specialties such as ophthalmology, neurosurgery, and otolaryngology, which require intricate and precise manipulation of surgical tools on a small scale. In microsurgery, operations on delicate vessels or tissues demand a high level of surgical skill. This exceptionally high skill requirement leads to a steep learning curve and lengthy training before surgeons can perform microsurgical procedures with quality outcomes. The microsurgery robot (MSR), which can improve surgeons' operative skills through various functions, has received extensive research attention in the past three decades. Many review papers have summarized MSR research for specific surgical specialties, but an in-depth review of the relevant technologies used in MSR systems is limited in the literature. This review details the technical challenges in microsurgery and systematically summarizes the key technologies in MSR from a developmental perspective, from basic structural mechanism design, to perception and human-machine interaction methods, and further to the ability to achieve a certain level of autonomy. By presenting and comparing the methods and technologies in this cutting-edge research, this paper aims to provide readers with a comprehensive understanding of the current state of MSR research and identify potential directions for future development in MSR.
Affiliation(s)
- Tiexin Wang: ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China; School of Mechanical Engineering, Zhejiang University, Hangzhou 310058, China
- Haoyu Li: ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China
- Tanhong Pu: ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China
- Liangjing Yang: ZJU-UIUC Institute, International Campus, Zhejiang University, Haining 314400, China; School of Mechanical Engineering, Zhejiang University, Hangzhou 310058, China; Department of Mechanical Engineering, University of Illinois Urbana-Champaign, Urbana, IL 61801, USA
3. Zhou M, Hennerkes F, Liu J, Jiang Z, Wendler T, Nasseri MA, Iordachita I, Navab N. Theoretical error analysis of spotlight-based instrument localization for retinal surgery. Robotica 2023; 41:1536-1549. [PMID: 37982126] [PMCID: PMC10655674] [DOI: 10.1017/s0263574722001862]
Abstract
Retinal surgery is widely considered a complicated and challenging task, even for specialists. Image-guided robot-assisted intervention is among the novel and promising solutions that may enhance human capabilities in this setting. In this paper, we demonstrate the possibility of using spotlights for 5D guidance of a microsurgical instrument. The theoretical basis of instrument localization from the projection of a single spotlight is analyzed to deduce the position and orientation of the spotlight source. The use of multiple spotlights is also proposed to explore possible further improvements in the performance boundaries. The proposed method is verified within a high-fidelity simulation environment built in the 3D creation suite Blender. Experimental results show that the average positioning error is 0.029 mm using a single spotlight and 0.025 mm with three spotlights, while the rotational errors are 0.124 and 0.101, respectively, indicating that the approach is promising for instrument localization in retinal surgery.
Affiliation(s)
- Mingchuan Zhou: College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou, China
- Felix Hennerkes: Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
- Jingsong Liu: Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
- Zhongliang Jiang: Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
- Thomas Wendler: Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
- M Ali Nasseri: Augenklinik und Poliklinik, Klinikum rechts der Isar der Technische Universität München, München, Germany
- Iulian Iordachita: Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Nassir Navab: Chair for Computer Aided Medical Procedures and Augmented Reality, Computer Science Department, Technische Universität München, München, Germany
4. Iordachita II, de Smet MD, Naus G, Mitsuishi M, Riviere CN. Robotic Assistance for Intraocular Microsurgery: Challenges and Perspectives. Proceedings of the IEEE 2022; 110:893-908. [PMID: 36588782] [PMCID: PMC9799958] [DOI: 10.1109/jproc.2022.3169466]
Abstract
Intraocular surgery, one of the most challenging disciplines of microsurgery, requires sensory and motor skills at the limits of human physiological capabilities, combined with extremely demanding requirements for accuracy and steadiness. Robotics combined with advanced imaging has opened significant new directions in advancing the field of intraocular microsurgery. With safer and more efficient patient treatment as the ultimate goal, robotics, as in other medical applications, has real potential to fundamentally change microsurgery by combining human strengths with computer- and sensor-based technology in an information-driven environment. Still in its early stages, robotic assistance for intraocular microsurgery has been accepted with caution in the operating room and successfully tested in a limited number of clinical trials. However, owing to its demonstrated capabilities, including hand tremor reduction, haptic feedback, steadiness, enhanced dexterity, and micrometer-scale accuracy, robotic microsurgery has emerged as a very promising direction for advancing retinal surgery. This paper analyzes the advances in robotic retinal microsurgery, its current drawbacks and limitations, and possible new directions for expanding retinal microsurgery to techniques currently beyond human capabilities or infeasible without robotics.
Affiliation(s)
- Iulian I Iordachita: Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Marc D de Smet: Microinvasive Ocular Surgery Center (MIOS), Lausanne, Switzerland
- Mamoru Mitsuishi: Department of Mechanical Engineering, The University of Tokyo, Japan
5. Zhou M, Wu J, Ebrahimi A, Patel N, He C, Gehlbach P, Taylor RH, Knoll A, Nasseri MA, Iordachita I. Spotlight-based 3D Instrument Guidance for Retinal Surgery. International Symposium on Medical Robotics (ISMR) 2020. [PMID: 34595483] [DOI: 10.1109/ismr48331.2020.9312952]
Abstract
Retinal surgery is a complex activity that can be challenging for a surgeon to perform effectively and safely. Image-guided robot-assisted surgery is a promising solution that can significantly enhance treatment outcomes and reduce the physical limitations of human surgeons. In this paper, we demonstrate a novel method for 3D guidance of the instrument based on the projection of a spotlight in single microscope images. The spotlight projection mechanism is first analyzed and modeled for projection onto both a plane and a spherical surface. To test the feasibility of the proposed method, a light fiber is integrated into the instrument, which is driven by the Steady-Hand Eye Robot (SHER). The spot of light is segmented and tracked on a phantom retina using the proposed algorithm. Both the static calibration and dynamic test results show that the proposed method can readily achieve a tip-to-surface distance accuracy of 0.5 mm, which is within the clinically acceptable range for intraocular visual guidance.
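As a rough illustration of why a projected spotlight encodes depth, the sketch below uses the simplest possible geometry: a light cone of known half-angle hitting a surface roughly perpendicular to the tool axis, so the segmented spot radius maps directly to tip-to-surface distance. The function, parameter values, and the flat-surface assumption are illustrative only and are not taken from the paper, which also models projection onto a sphere.

```python
import numpy as np

# Illustrative geometry only: if a fiber at the instrument tip emits a
# cone of half-angle theta along the tool axis and hits a plane roughly
# perpendicular to that axis, the spot radius grows linearly with the
# tip-to-surface distance, so the segmented spot size gives the distance.

def tip_to_surface_distance(spot_radius_px, px_to_mm, half_angle_deg):
    """Estimate tip-to-surface distance from the segmented spot radius.

    spot_radius_px : spot radius measured in the microscope image [pixels]
    px_to_mm       : image scale at the retinal plane [mm / pixel] (assumed known)
    half_angle_deg : divergence half-angle of the light cone [degrees]
    """
    r_mm = spot_radius_px * px_to_mm
    return r_mm / np.tan(np.radians(half_angle_deg))

# Example: a 60 px spot at 0.01 mm/px with a 15 degree cone
print(f"{tip_to_surface_distance(60, 0.01, 15.0):.2f} mm")  # about 2.24 mm
```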
Affiliation(s)
- Mingchuan Zhou: Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA; Chair of Robotics, Artificial Intelligence and Real-time Systems, Technische Universität München, München 85748, Germany
- Jiahao Wu: Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA; T Stone Robotics Institute, Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, HKSAR, China
- Ali Ebrahimi: Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Niravkumar Patel: Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Changyan He: Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Peter Gehlbach: Wilmer Eye Institute, Johns Hopkins Hospital, Baltimore, MD 21287, USA
- Russell H Taylor: Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Alois Knoll: Chair of Robotics, Artificial Intelligence and Real-time Systems, Technische Universität München, München 85748, Germany
- M Ali Nasseri: Augenklinik und Poliklinik, Klinikum rechts der Isar der Technische Universität München, München 81675, Germany
- Iulian Iordachita: Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
6.
7. O'Sullivan S, Leonard S, Holzinger A, Allen C, Battaglia F, Nevejans N, van Leeuwen FWB, Sajid MI, Friebe M, Ashrafian H, Heinsen H, Wichmann D, Hartnett M, Gallagher AG. Operational framework and training standard requirements for AI-empowered robotic surgery. Int J Med Robot 2020; 16:1-13. [DOI: 10.1002/rcs.2020]
Affiliation(s)
- Shane O'Sullivan: Department of Pathology, Faculdade de Medicina, Universidade de São Paulo, São Paulo, Brazil
- Simon Leonard: Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Andreas Holzinger: Holzinger Group, HCI-KDD, Institute for Medical Informatics/Statistics, Medical University of Graz, Graz, Austria
- Colin Allen: Department of History & Philosophy of Science, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
- Fiorella Battaglia: Faculty of Philosophy, Philosophy of Science and the Study of Religion, Ludwig-Maximilians-Universität München, München, Germany
- Nathalie Nevejans: Research Center in Law, Ethics and Procedures, Faculty of Law of Douai, University of Artois, Arras, France
- Fijs W. B. van Leeuwen: Interventional Molecular Imaging Laboratory, Radiology Department, Leiden University Medical Center, Leiden, The Netherlands
- Mohammed Imran Sajid: Department of Upper GI Surgery, Wirral University Teaching Hospital, Birkenhead, UK
- Michael Friebe: Institute of Medical Engineering, Otto-von-Guericke-University, Magdeburg, Germany
- Hutan Ashrafian: Department of Surgery & Cancer, Institute of Global Health Innovation, Imperial College London, London, UK
- Helmut Heinsen: Department of Pathology, Faculdade de Medicina, Universidade de São Paulo, São Paulo, Brazil; Morphological Brain Research Unit, University of Würzburg, Würzburg, Germany
- Dominic Wichmann: Department of Intensive Care, University Hospital Hamburg-Eppendorf, Hamburg, Germany
- Anthony G. Gallagher: Faculty of Life and Health Sciences, Ulster University, Londonderry, UK; ORSI Academy, Melle, Belgium
8. Andras I, Mazzone E, van Leeuwen FWB, De Naeyer G, van Oosterom MN, Beato S, Buckle T, O'Sullivan S, van Leeuwen PJ, Beulens A, Crisan N, D'Hondt F, Schatteman P, van der Poel H, Dell'Oglio P, Mottrie A. Artificial intelligence and robotics: a combination that is changing the operating room. World J Urol 2019; 38:2359-2366. [PMID: 31776737] [DOI: 10.1007/s00345-019-03037-6]
Abstract
PURPOSE The aim of this narrative review was to summarize the available evidence in the literature on artificial intelligence (AI) methods that have been applied during robotic surgery. METHODS A narrative review of the literature was performed in the MEDLINE/PubMed and Scopus databases on the topics of artificial intelligence, autonomous surgery, machine learning, robotic surgery, and surgical navigation, focusing on articles published between January 2015 and June 2019. All available evidence was analyzed and summarized herein after an interactive peer-review process of the panel. LITERATURE REVIEW The preliminary results of the implementation of AI in the clinical setting are encouraging. By providing a readout of the full telemetry and a sophisticated viewing console, robot-assisted surgery can be used to study and refine the application of AI in surgical practice. Machine learning approaches strengthen feedback regarding surgical skill acquisition, the efficiency of the surgical process, surgical guidance, and the prediction of postoperative outcomes. Tension sensors on the robotic arms and the integration of augmented reality methods can help enhance the surgical experience and monitor organ movements. CONCLUSIONS The use of AI in robotic surgery is expected to have a significant impact on future surgical training and to enhance the surgical experience during a procedure. Both aim to realize precision surgery and thus to increase the quality of surgical care. Implementation of AI in master-slave robotic surgery may allow for careful, step-by-step consideration of autonomous robotic surgery.
Affiliation(s)
- Iulia Andras: ORSI Academy, Melle, Belgium; Department of Urology, Iuliu Hatieganu University of Medicine and Pharmacy, Cluj-Napoca, Romania
- Elio Mazzone: ORSI Academy, Melle, Belgium; Department of Urology, Onze Lieve Vrouw Hospital, Aalst, Belgium; Department of Urology and Division of Experimental Oncology, URI, Urological Research Institute, IRCCS San Raffaele Scientific Institute, Milan, Italy
- Fijs W B van Leeuwen: ORSI Academy, Melle, Belgium; Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Centre, Leiden, The Netherlands; Department of Urology, Antoni Van Leeuwenhoek Hospital, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Geert De Naeyer: ORSI Academy, Melle, Belgium; Department of Urology, Onze Lieve Vrouw Hospital, Aalst, Belgium
- Matthias N van Oosterom: Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Centre, Leiden, The Netherlands; Department of Urology, Antoni Van Leeuwenhoek Hospital, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Tessa Buckle: Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Centre, Leiden, The Netherlands
- Shane O'Sullivan: Department of Pathology, Faculdade de Medicina, Universidade de São Paulo, São Paulo, Brazil
- Pim J van Leeuwen: Department of Urology, Antoni Van Leeuwenhoek Hospital, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Alexander Beulens: Department of Urology, Catharina Hospital, Eindhoven, The Netherlands; Netherlands Institute for Health Services (NIVEL), Utrecht, The Netherlands
- Nicolae Crisan: Department of Urology, Iuliu Hatieganu University of Medicine and Pharmacy, Cluj-Napoca, Romania
- Frederiek D'Hondt: ORSI Academy, Melle, Belgium; Department of Urology, Onze Lieve Vrouw Hospital, Aalst, Belgium
- Peter Schatteman: ORSI Academy, Melle, Belgium; Department of Urology, Onze Lieve Vrouw Hospital, Aalst, Belgium
- Henk van der Poel: Department of Urology, Antoni Van Leeuwenhoek Hospital, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Paolo Dell'Oglio: ORSI Academy, Melle, Belgium; Department of Urology, Onze Lieve Vrouw Hospital, Aalst, Belgium; Interventional Molecular Imaging Laboratory, Department of Radiology, Leiden University Medical Centre, Leiden, The Netherlands; Department of Urology, Antoni Van Leeuwenhoek Hospital, The Netherlands Cancer Institute, Amsterdam, The Netherlands
- Alexandre Mottrie: ORSI Academy, Melle, Belgium; Department of Urology, Onze Lieve Vrouw Hospital, Aalst, Belgium
9. Zhang T, Gong L, Wang S, Zuo S. Hand-Held Instrument with Integrated Parallel Mechanism for Active Tremor Compensation During Microsurgery. Ann Biomed Eng 2019; 48:413-425. [PMID: 31531791] [DOI: 10.1007/s10439-019-02358-2]
Abstract
Physiological hand tremor seriously degrades the tip positioning accuracy of surgical instruments during microsurgery. To address this problem, hand-held instruments with active tremor compensation have been developed to improve tip positioning accuracy. This paper presents the design and performance of a new hand-held instrument that aims to stabilize hand tremor and increase accuracy in microsurgery. The key components are a three-degrees-of-freedom (DOF) integrated parallel manipulator and a high-performance inertial measurement unit (IMU). The IMU senses the 3-DOF motion of the instrument tip, a customized filter extracts the hand tremor component, and the instrument simultaneously generates the reverse motion to reduce the tremor. Experimental results show that the tremor compensation mechanism is effective: the average RMS reduction ratio is 56.5% in bench tests, a significant tremor reduction, and 41.0% in hand-held tests; hence, the instrument can reduce hand tremor magnitudes by 31.7% RMS in 2-DOF.
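To make the sensing-filtering-cancellation pipeline concrete, here is a minimal sketch under assumed parameters: the sensed tip motion is band-pass filtered around the physiological tremor band and the negated tremor estimate is used as the compensation command. The sample rate, filter band and order, and the synthetic test signal are assumptions for illustration; the paper's customized filter and 3-DOF implementation are not reproduced here.

```python
import numpy as np
from scipy.signal import butter, sosfilt

# Illustrative pipeline, not the authors' filter: band-pass the sensed tip
# motion around the physiological tremor band (~8-12 Hz) and command the
# parallel mechanism with the negated tremor estimate.

fs = 1000.0                                   # assumed IMU sample rate [Hz]
sos = butter(2, [7.0, 13.0], btype="bandpass", fs=fs, output="sos")

def compensation_command(tip_motion):
    """Return the actuator displacement that cancels the estimated tremor.

    tip_motion : 1-D array of tip displacement along one axis [mm]
    """
    tremor = sosfilt(sos, tip_motion)         # isolate tremor-band motion
    return -tremor                            # reverse motion cancels it

# Synthetic test: slow 1 Hz voluntary motion plus a 10 Hz, 0.1 mm tremor
t = np.arange(0, 2.0, 1.0 / fs)
motion = 2.0 * np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.sin(2 * np.pi * 10.0 * t)
cmd = compensation_command(motion)
print(f"peak compensation command: {np.abs(cmd).max():.3f} mm")  # ~0.1 mm
```

The design choice mirrored here is that the band-pass stage passes the tremor component almost unchanged while strongly attenuating the slow voluntary motion, so the compensation command tracks only what should be cancelled.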
Affiliation(s)
- Tianci Zhang: Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, Tianjin 300072, China
- Lun Gong: Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, Tianjin 300072, China
- Shuxin Wang: Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, Tianjin 300072, China
- Siyang Zuo: Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, Tianjin 300072, China
10. Charreyron SL, Gabbi E, Boehler Q, Becker M, Nelson BJ. A Magnetically Steered Endolaser Probe for Automated Panretinal Photocoagulation. IEEE Robot Autom Lett 2019. [DOI: 10.1109/lra.2018.2888894]
11. Yang S, Martel JN, Lobes LA, Riviere CN. Techniques for robot-aided intraocular surgery using monocular vision. Int J Rob Res 2018; 37:931-952. [PMID: 30739976] [DOI: 10.1177/0278364918778352]
Abstract
This paper presents techniques for robot-aided intraocular surgery using monocular vision in order to overcome erroneous stereo reconstruction in an intact eye. We propose a new retinal surface estimation method based on a structured-light approach. A handheld robot known as the Micron enables automatic scanning of a laser probe, creating projected beam patterns on the retinal surface. Geometric analysis of the patterns then allows planar reconstruction of the surface. To realize automated surgery in an intact eye, monocular hybrid visual servoing is accomplished through a scheme that incorporates surface reconstruction and partitioned visual servoing. We investigate the sensitivity of the estimation method according to relevant parameters and also evaluate its performance in both dry and wet conditions. The approach is validated through experiments for automated laser photocoagulation in a realistic eye phantom in vitro. Finally, we present the first demonstration of automated intraocular laser surgery in porcine eyes ex vivo.
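The planar reconstruction step mentioned here can be pictured as fitting a plane to 3-D points recovered from the projected beam pattern. The sketch below shows only that generic least-squares step; the point coordinates, noise level, and plane parameters are synthetic illustrations and not data or methods from the paper.

```python
import numpy as np

# Illustrative step only: once the projected laser spots have been
# triangulated to 3-D points, the local retinal surface can be
# approximated by a least-squares plane z = a*x + b*y + c.

def fit_plane(points):
    """Fit z = a*x + b*y + c to an (N, 3) array of reconstructed spot points."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs                              # (a, b, c)

# Example with synthetic, slightly noisy points on a tilted plane
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(20, 2))
z = 0.05 * xy[:, 0] - 0.02 * xy[:, 1] + 3.0 + rng.normal(0, 1e-3, 20)
print(fit_plane(np.column_stack([xy, z])))     # approx. [0.05, -0.02, 3.0]
```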
Affiliation(s)
- Sungwook Yang: Center for BioMicrosystems, Korea Institute of Science and Technology, Korea
- Joseph N Martel: Department of Ophthalmology, University of Pittsburgh, Pittsburgh, USA
- Louis A Lobes: Department of Ophthalmology, University of Pittsburgh, Pittsburgh, USA
12.
13. Nuzzi R, Brusasco L. State of the art of robotic surgery related to vision: brain and eye applications of newly available devices. Eye Brain 2018; 10:13-24. [PMID: 29440943] [PMCID: PMC5798758] [DOI: 10.2147/eb.s148644]
Abstract
Background Robot-assisted surgery has revolutionized many surgical subspecialties, mainly those in which procedures must be performed in confined, difficult-to-visualize spaces. Despite advances in general surgery and neurosurgery, in vivo application of robotics to ocular surgery is still in its infancy, owing to the particular complexities of microsurgery. The use of robotic assistance and feedback guidance on surgical maneuvers could improve the technical performance of expert surgeons during the initial phase of the learning curve. Evidence acquisition We analyzed the advantages and disadvantages of surgical robots, as well as the present applications and future outlook of robotics in ophthalmology and in neurosurgery of brain areas related to vision. Discussion Limitations to robotic assistance remain that need to be overcome before it can be more widely applied in ocular surgery. Conclusion There is heightened interest in studies documenting computerized systems that filter out hand tremor and optimize speed of movement, control of force, and direction and range of movement. Further research is still needed to validate robot-assisted procedures.
Affiliation(s)
- Raffaele Nuzzi: Department of Surgical Sciences, Eye Clinic, University of Torino, Turin, Italy
- Luca Brusasco: Department of Surgical Sciences, Eye Clinic, University of Torino, Turin, Italy
14.
Abstract
PURPOSE To review the current literature on robotic assistance for ophthalmic surgery, especially vitreoretinal procedures. METHODS The MEDLINE, Embase, and Web of Science databases were searched from inception to August 2016 for articles relevant to the review topic. Queries included combinations of the terms robotic eye surgery, ophthalmology, and vitreoretinal. RESULTS In ophthalmology, proof-of-concept papers have shown the feasibility of performing many delicate anterior segment and vitreoretinal surgical procedures accurately with robotic assistance. Multiple surgical platforms have been designed and tested in animal eyes and phantom models. These platforms can measure the forces generated and the velocities of different surgical movements. "Smart" instruments have been designed to improve certain tasks such as membrane peeling and retinal vessel cannulation. CONCLUSION Ophthalmic surgery, particularly vitreoretinal surgery, might have reached the limits of human physiologic performance. Robotic assistance can help overcome biologic limitations and improve surgical performance. Clinical studies of robot-assisted surgeries are needed to determine the safety and feasibility of using this technology in patients.