1. El-Sayed C, Yiu A, Burke J, Vaughan-Shaw P, Todd J, Lin P, Kasmani Z, Munsch C, Rooshenas L, Campbell M, Bach SP. Measures of performance and proficiency in robotic assisted surgery: a systematic review. J Robot Surg 2024; 18:16. PMID: 38217749. DOI: 10.1007/s11701-023-01756-y.
Abstract
Robotic assisted surgery (RAS) has seen a global rise in adoption. Despite this, there is neither a standardised training curriculum nor a standardised measure of performance. We performed a systematic review across the surgical specialties in RAS and evaluated the tools used to assess surgeons' technical performance. Following the PRISMA 2020 guidelines, PubMed, Embase and the Cochrane Library were searched systematically for full texts published between January 2020 and January 2022. Observational studies and RCTs were included; review articles and systematic reviews were excluded. Study quality and risk of bias were assessed using the Newcastle-Ottawa Scale for the observational studies and the Cochrane risk-of-bias tool for the RCTs. The initial search yielded 1189 papers, of which 72 met the eligibility criteria. Twenty-seven unique performance metrics were identified. Global assessments were the most common tool of assessment (n = 13); the most used was GEARS (Global Evaluative Assessment of Robotic Skills). Eleven metrics (42%) were objective measures of performance, of which automated performance metrics (APMs) were the most widely used; the remaining 15 (58%) were subjective. The results demonstrate variation in the tools used to assess technical performance in RAS. A large proportion of the metrics are subjective measures, which increases the risk of bias amongst users. A standardised objective metric that measures all domains of technical performance, from global to cognitive, is required. The metric should be applicable to all RAS procedures and easily implementable. APMs have demonstrated promise as widely applicable, accurate measures.
Affiliation(s)
- Charlotte El-Sayed, A Yiu, J Burke, P Vaughan-Shaw, J Todd, P Lin, Z Kasmani, C Munsch, L Rooshenas, M Campbell, S P Bach: RCS England/HEE Robotics Research Fellow, University of Birmingham, Birmingham, United Kingdom
2. Rahimi AM, Hardon SF, Willuth E, Lang F, Haney CM, Felinska EA, Kowalewski KF, Müller-Stich BP, Horeman T, Nickel F, Daams F. Force-based assessment of tissue handling skills in simulation training for robot-assisted surgery. Surg Endosc 2023. PMID: 36759353. DOI: 10.1007/s00464-023-09905-y.
Abstract
INTRODUCTION Although robotic-assisted surgery is increasingly performed, objective assessment of technical skills is lacking. The aim of this study is to provide validity evidence for objective assessment of technical skills for robotic-assisted surgery. METHODS An international multicenter study was conducted with participants from two academic hospitals: Heidelberg University Hospital (Heidelberg, Germany) and the Amsterdam University Medical Centers (Amsterdam, The Netherlands). Trainees with distinctly different levels of robotic surgery experience were divided into three groups (novice, intermediate, expert) and enrolled in a training curriculum. Each trainee performed six trials of a standardized suturing task using the da Vinci Surgical System. Using the ForceSense system, five force-based parameters were analyzed for objective assessment of tissue handling skills. The Mann-Whitney U test and linear regression were used to analyze performance differences, and the Wilcoxon signed-rank test to analyze skills progression. RESULTS A total of 360 trials, performed by 60 participants, were analyzed. Significant differences between the novices, intermediates and experts were observed for total completion time (41 s vs 29 s vs 22 s, p = 0.003), mean non-zero force (29 N vs 33 N vs 19 N, p = 0.032), maximum impulse (40 Ns vs 31 Ns vs 20 Ns, p = 0.001) and force volume (38 N³ vs 32 N³ vs 22 N³, p = 0.018). Furthermore, the experts showed better results than the intermediates for mean non-zero force (22 N vs 13 N, p = 0.015), maximum impulse (24 Ns vs 17 Ns, p = 0.043) and force volume (25 N³ vs 16 N³, p = 0.025). Lastly, learning-curve improvement was observed for total task completion time, mean non-zero force, maximum impulse and force volume (p ≤ 0.05). CONCLUSION Construct validity for force-based assessment of tissue handling skills in robot-assisted surgery is established. It is advised to incorporate objective assessment and feedback in robot-assisted surgery training programs to determine technical proficiency and, potentially, to prevent tissue trauma.
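The force-based parameters named above (mean non-zero force, impulse, force volume) can be sketched from a raw force recording. In this minimal Python sketch, the contact threshold, the use of the force magnitude, and the ellipsoid construction for force volume are plausible readings of the metric names, not the ForceSense system's published definitions:

```python
import numpy as np

def tissue_handling_metrics(f_xyz, dt):
    """Compute illustrative tissue-handling metrics from an (n, 3) array of
    force components in newtons, sampled every dt seconds.

    Returns (mean non-zero force [N], impulse [N*s], force volume [N^3]).
    The exact definitions used by the study are not given in the abstract,
    so these formulas are assumptions inferred from the metric names."""
    f = np.asarray(f_xyz, dtype=float)
    mag = np.linalg.norm(f, axis=1)          # force magnitude per sample
    contact = mag > 0.1                      # assumed tissue-contact threshold (N)
    mean_nonzero = mag[contact].mean() if contact.any() else 0.0
    impulse = np.sum(mag) * dt               # integral of |F| dt over the trial
    # One published construction of "force volume" is the volume of the
    # ellipsoid spanned by the spread of the three force components.
    volume = 4.0 / 3.0 * np.pi * np.prod(f.std(axis=0))
    return mean_nonzero, impulse, volume
```

A constant 3 N grip held for one second at 100 Hz, for example, yields a mean non-zero force of 3 N, an impulse of 3 N·s and (with no component spread) a force volume of zero.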
Affiliation(s)
- A Masie Rahimi: Department of Surgery, Amsterdam UMC-VU University Medical Center, Amsterdam, The Netherlands; Amsterdam Skills Centre for Health Sciences, Tafelbergweg 47, 1105 BD, Amsterdam, The Netherlands; Cancer Center Amsterdam, Amsterdam, The Netherlands
- Sem F Hardon: Department of Surgery, Amsterdam UMC-VU University Medical Center, Amsterdam, The Netherlands; Department of Biomechanical Engineering, Delft University of Technology, Delft, The Netherlands
- E Willuth, F Lang, Caelan M Haney, Eleni A Felinska, Karl-Friedrich Kowalewski, Beat P Müller-Stich, F Nickel: Department of General, Visceral and Transplantation Surgery, Heidelberg University, Heidelberg, Germany
- Tim Horeman: Department of Biomechanical Engineering, Delft University of Technology, Delft, The Netherlands
- Freek Daams: Department of Surgery, Amsterdam UMC-VU University Medical Center, Amsterdam, The Netherlands
3. Spatiotemporal Modeling of Grip Forces Captures Proficiency in Manual Robot Control. Bioengineering (Basel) 2023; 10:59. PMID: 36671631. PMCID: PMC9854605. DOI: 10.3390/bioengineering10010059.
Abstract
New technologies for monitoring grip forces during hand and finger movements in non-standard task contexts have provided unprecedented functional insights into somatosensory cognition. Somatosensory cognition is the basis of our ability to manipulate and transform objects of the physical world and to grasp them with the right amount of force. In previous work, the wireless tracking of grip-force signals recorded from biosensors in the palm of the human hand has permitted us to unravel some of the functional synergies that underlie perceptual and motor learning under conditions of non-standard and essentially unreliable sensory input. This paper builds on this previous work and discusses further, functionally motivated, analyses of individual grip-force data in manual robot control. Grip forces were recorded from various loci in the dominant and non-dominant hands of individuals with wearable wireless sensor technology. Statistical analyses bring to the fore skill-specific temporal variations in thousands of grip forces of a complete novice and a highly proficient expert in manual robot control. A brain-inspired neural network model that uses the output metric of a self-organizing map with unsupervised winner-take-all learning was run on the sensor output from both hands of each user. The neural network metric expresses the difference between an input representation and its model representation at any given moment in time and reliably captures the differences between novice and expert performance in terms of grip-force variability. Functionally motivated spatiotemporal analysis of individual average grip forces, computed for time windows of constant size in the output of a restricted number of task-relevant sensors in the dominant (preferred) hand, reveals finger-specific synergies reflecting robotic task skill. The analyses lead the way towards grip-force monitoring in real time. This will permit tracking task-skill evolution in trainees and identifying individual proficiency levels in human-robot interaction, which presents unprecedented challenges for perceptual and motor adaptation in environmental contexts of high sensory uncertainty. Cross-disciplinary insights from systems neuroscience and cognitive behavioral science, and the predictive modeling of operator skills using parsimonious Artificial Intelligence (AI), will contribute towards improving the outcome of new types of surgery, in particular single-port approaches such as NOTES (Natural Orifice Transluminal Endoscopic Surgery) and SILS (Single-Incision Laparoscopic Surgery).
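The self-organizing map metric described above can be illustrated with a toy model: after winner-take-all training on one performer's grip-force samples, the quantization error (the distance between each input and its best-matching unit) summarizes how well the map represents a force trace. Everything here (map size, learning rate, 1-D inputs) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_units=16, epochs=20, lr=0.5):
    """Minimal 1-D self-organizing map with winner-take-all updates:
    for each sample, only the best-matching unit moves toward the input."""
    data = np.asarray(data, dtype=float).reshape(-1, 1)
    w = rng.uniform(data.min(), data.max(), (n_units, 1))
    for _ in range(epochs):
        for x in data:
            bmu = np.argmin(np.abs(w - x))   # best-matching unit
            w[bmu] += lr * (x - w[bmu])      # move only the winner
    return w

def quantization_error(data, w):
    """Mean distance between each input and its best-matching unit; the
    map's mismatch metric for a given grip-force trace."""
    data = np.asarray(data, dtype=float).reshape(-1, 1)
    dists = np.min(np.abs(data[:, None, 0] - w[None, :, 0]), axis=1)
    return float(dists.mean())
```

Trained on a tightly clustered "expert" trace, the map reports a small quantization error for similar traces and a large one for a widely scattered "novice" trace, which is the kind of novice/expert separation the abstract describes.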
4. Dresp-Langley B. Grip force as a functional window to somatosensory cognition. Front Psychol 2022; 13:1026439. DOI: 10.3389/fpsyg.2022.1026439.
Abstract
Analysis of grip force signals tailored to hand and finger movement evolution, and of changes in grip force control during task execution, provides unprecedented functional insight into somatosensory cognition. Somatosensory cognition is the basis of our ability to act upon and to transform the physical world around us, to recognize objects on the basis of touch alone, and to grasp them with the right amount of force for lifting and manipulating them. Recent technology has permitted the wireless monitoring of grip force signals recorded from biosensors in the palm of the human hand to track and trace human grip forces deployed in cognitive tasks executed under conditions of variable sensory (visual, auditory) input. Non-invasive multi-finger grip force sensor technology can be exploited to explore functional interactions between somatosensory brain mechanisms and motor control, in particular during learning of a cognitive task where the planning and strategic execution of hand movements is essential. Sensorial and cognitive processes underlying manual skills and/or hand-specific (dominant versus non-dominant hand) behaviors can be studied in a variety of contexts by probing selected measurement loci in the fingers and palm of the human hand. Thousands of sensor data recorded from multiple spatial locations can be approached statistically to breathe functional sense into the forces measured under specific task constraints. Grip force patterns in individual performance profiling may reveal the evolution of grip force control as a direct result of cognitive changes during task learning. Grip forces can be functionally mapped to global-to-local coding principles in brain networks governing somatosensory processes for motor control in cognitive tasks leading to a specific task expertise or skill. In light of a comprehensive overview of recent discoveries into the functional significance of human grip force variations, perspectives for future studies in cognition, in particular the cognitive control of strategic and task-relevant hand movements in complex real-world precision tasks, are pointed out.
5. Soangra R, Jiang P, Haik D, Xu P, Brevik A, Peta A, Tapiero S, Landman J, John EB, Clayman R. Beyond Efficiency: Surface Electromyography Enables Further Insights into the Surgical Movements of Urologists. J Endourol 2022; 36:1355-1361. PMID: 35726396. DOI: 10.1089/end.2022.0120.
Abstract
INTRODUCTION Surgical skill evaluation while performing minimally invasive surgeries is a highly complex task. It is important to objectively assess an individual's technical skills throughout surgical training to monitor progress and to intervene when skills are not commensurate with the year of training. The miniaturization of wireless wearable platforms integrated with sensor technology has made it possible to non-invasively assess muscle activations and movement variability during performance of minimally invasive surgical tasks. Our objective was to use electromyography to deconstruct the motions of a surgeon during robotic suturing and distinguish quantifiable movements that characterize the skill of an experienced, expert urologic surgeon from those of trainees. METHODS Three skill groups of participants were enrolled in the study: novice (n = 11), intermediate (n = 12) and expert (n = 3). A total of 12 wireless wearable sensors consisting of surface electromyograms (EMGs) and accelerometers were placed along upper extremity muscles to assess muscle activations and movement variability, respectively. Participants then performed a robotic suturing task. RESULTS EMG-based parameters (total time, dominant frequency, and cumulative muscular workload (CMW)) were significantly different across the three skill groups. We also found that nonlinear movement variability parameters, such as the correlation dimension and Lyapunov exponent, trended differently across the three skill groups. CONCLUSIONS These findings suggest that economy-of-motion variables and nonlinear movement variability are affected by surgical experience level. Wearable sensor signal analysis could make it possible to objectively evaluate surgical skill level periodically throughout the residency training experience.
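Of the EMG-based parameters listed above, dominant frequency has a straightforward reading: the frequency bin carrying the most power in the muscle signal's spectrum. The study's windowing and preprocessing choices are not specified, so this periodogram-based sketch is only one plausible implementation:

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) with the largest power in a surface-EMG
    trace sampled at fs Hz, using a simple one-sided periodogram."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()               # remove DC offset
    power = np.abs(np.fft.rfft(signal)) ** 2      # one-sided power spectrum
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return float(freqs[np.argmax(power)])
```

With one second of data at 1 kHz the spectral resolution is 1 Hz, so a pure 60 Hz component is recovered exactly; real EMG would typically be band-pass filtered and windowed first.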
Affiliation(s)
- Rahul Soangra: Chapman University, Orange, California, United States
- Pengbo Jiang: Department of Urology, University of California Irvine, Irvine, California, United States
- Daniel Haik: University of California Irvine, Irvine, California, United States
- Perry Xu: University of California Irvine, 3800 Chapman Avenue, Suite 7200, Irvine, California 92697, United States
- Andrew Brevik: Department of Urology, University of California Irvine, 333 City Blvd West, Orange, California 92868, United States; Kansas City University of Medicine and Biosciences, Kansas City, Missouri 64106-1453, United States
- Akhil Peta: Department of Urology, University of California Irvine, 333 City Blvd, Suite 2170, Orange, California 92868, United States
- Shlomi Tapiero: Department of Urology, University of California Irvine, 333 City Blvd W, Suite 2100, Irvine, California 92697, United States
- Jaime Landman: Department of Urology, University of California Irvine, Orange, California, United States
- Ralph Clayman: Department of Urology, University of California Irvine, Orange, California, United States
6. Application of Design Structure Matrix to Simulate Surgical Procedures and Predict Surgery Duration. Minim Invasive Surg 2021; 2021:6340754. PMID: 34912579. PMCID: PMC8668307. DOI: 10.1155/2021/6340754.
Abstract
Background The complexities of surgery require an efficient and explicit method to evaluate and standardize surgical procedures. A reliable surgical evaluation tool would be able to serve various purposes, such as the development of surgery training programs and the improvement of surgical skills. Objectives (a) To develop a modeling framework based on the integration of dexterity analysis and the design structure matrix (DSM), generally applicable to predicting the total duration of a surgical procedure, and (b) to validate the model by comparing its results with laparoscopic cholecystectomy surgery protocols. Method A modeling framework is developed through DSM, a tool used in engineering design, systems engineering and management, to hierarchically decompose and describe relationships among individual surgical activities. Individual decomposed activities are assumed to have uncertain parameters, so a rework probability is introduced. The simulation produces a distribution of the duration of the modeled procedure. A statistical approach is then taken to evaluate surgery duration through integrated numerical parameters. The modeling framework is applied for the first time to analyze a surgical procedure: laparoscopic cholecystectomy, a common operation, is selected for the analysis. Results The present simulation model is validated by comparing its predicted surgery durations with the standard laparoscopic cholecystectomy protocols from the Atlas of Minimally Invasive Surgery (2.5% error) and the Atlas of Pediatric Laparoscopy and Thoracoscopy (4% error). Conclusion The present model, developed based on dexterity analysis and DSM, demonstrates a validated capability of predicting laparoscopic cholecystectomy surgery duration. Future studies will explore its potential applications to other surgical procedures and to improving surgeons' performance and training novices.
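The core of the framework described above, decomposed activities with uncertain parameters and a rework probability feeding a simulated duration distribution, can be sketched in a few lines. The step names, durations and rework probabilities below are invented for illustration and are not taken from the paper:

```python
import random

def simulate_duration(activities, rework, n_runs=2000, seed=1):
    """Monte Carlo sketch of a DSM-style duration model: each activity has a
    nominal duration (minutes) and may trigger one repeat of a dependent
    upstream activity with a given probability. Returns the mean simulated
    total duration over n_runs procedures."""
    prng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        total = 0.0
        for name, dur in activities:
            total += dur
            p, rework_dur = rework.get(name, (0.0, 0.0))
            if prng.random() < p:        # rework loop fires for this activity
                total += rework_dur
        totals.append(total)
    return sum(totals) / n_runs

# Hypothetical cholecystectomy-like step list (illustrative numbers only).
steps = [("port placement", 10), ("dissection", 25),
         ("clip and cut", 8), ("extraction", 7)]
rework_links = {"dissection": (0.2, 5.0)}   # 20% chance of 5 min of rework
```

With these numbers the expected duration is 50 + 0.2 x 5 = 51 minutes; the simulated mean converges to that value, and the full list of per-run totals gives the duration distribution the paper evaluates statistically.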
7. Visual Intelligence: Prediction of Unintentional Surgical-Tool-Induced Bleeding during Robotic and Laparoscopic Surgery. Robotics 2021; 10:37. DOI: 10.3390/robotics10010037.
Abstract
Unintentional vascular damage can result from a surgical instrument’s abrupt movements during minimally invasive surgery (laparoscopic or robotic). A novel real-time image processing algorithm based on local entropy is proposed that can detect abrupt movements of surgical instruments and predict bleeding occurrence. The uniform nature of the texture of surgical tools is utilized to segment the tools from the background. By comparing changes in entropy over time, the algorithm determines when the surgical instruments are moved abruptly. We tested the algorithm using 17 videos of minimally invasive surgery, 11 of which had tool-induced bleeding. Our preliminary testing shows that the algorithm is 88% accurate and 90% precise in predicting bleeding. The average advance warning time for the 11 videos is 0.662 s, with the standard deviation being 0.427 s. The proposed approach has the potential to eventually lead to a surgical early warning system or even proactively attenuate tool movement (for robotic surgery) to avoid dangerous surgical outcomes.
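The entropy cue described above can be illustrated with a toy version: uniformly textured tool pixels produce a low-entropy intensity histogram, while tissue produces a high-entropy one. The patch size, histogram binning and threshold here are assumptions for illustration, not the published algorithm:

```python
import numpy as np

def patch_entropy(patch, bins=32):
    """Shannon entropy (bits) of the intensity histogram of a grayscale
    image patch; uniform texture yields values near zero."""
    hist, _ = np.histogram(np.asarray(patch).ravel(), bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def tool_mask(gray, patch=8, threshold=2.0):
    """Label each patch of a grayscale frame as tool (True) when its local
    entropy falls below an assumed threshold. Comparing such masks across
    frames is one way to flag abrupt instrument motion over time."""
    h, w = gray.shape
    mask = np.zeros((h // patch, w // patch), dtype=bool)
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            block = gray[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch]
            mask[i, j] = patch_entropy(block) < threshold
    return mask
```

On a synthetic frame with one flat region and one noisy region, only the flat region is labelled as tool; the paper's bleeding prediction then builds on how this segmented region moves between frames.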
8. Hung AJ, Chen J, Ghodoussipour S, Oh PJ, Liu Z, Nguyen J, Purushotham S, Gill IS, Liu Y. A deep-learning model using automated performance metrics and clinical features to predict urinary continence recovery after robot-assisted radical prostatectomy. BJU Int 2019; 124:487-495. PMID: 30811828. PMCID: PMC6706286. DOI: 10.1111/bju.14735.
Abstract
OBJECTIVES To predict urinary continence recovery after robot-assisted radical prostatectomy (RARP) using a deep learning (DL) model, which was then used to evaluate surgeons' historical patient outcomes. SUBJECTS AND METHODS Robotic surgical automated performance metrics (APMs) during RARP, and patient clinicopathological and continence data, were captured prospectively from 100 contemporary RARPs. We used a DL model (DeepSurv) to predict postoperative urinary continence. Model features were ranked based on their importance in prediction. We stratified eight surgeons based on the five top-ranked features. The top four surgeons were categorized in 'Group 1/APMs', while the remaining four were categorized in 'Group 2/APMs'. A separate historical cohort of RARPs (January 2015 to August 2016) performed by these two surgeon groups was then used for comparison. The concordance index (C-index) and mean absolute error (MAE) were used to measure the model's prediction performance. Outcomes of historical cases were compared using the Kruskal-Wallis, chi-squared and Fisher's exact tests. RESULTS Continence was attained in 79 patients (79%) after a median of 126 days. The DL model achieved a C-index of 0.6 and an MAE of 85.9 in predicting continence. APMs were ranked higher by the model than clinicopathological features. In the historical cohort, patients in Group 1/APMs had superior rates of urinary continence at 3 and 6 months postoperatively (47.5% vs 36.7%, P = 0.034, and 68.3% vs 59.2%, P = 0.047, respectively). CONCLUSION Using APMs and clinicopathological data, the DeepSurv DL model was able to predict continence after RARP. In this feasibility study, surgeons with more efficient APMs achieved higher continence rates at 3 and 6 months after RARP.
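The concordance index used to evaluate the model above is a standard measure for censored time-to-event predictions; a reference implementation of Harrell's C-index makes the reported 0.6 concrete (0.5 is chance, 1.0 is perfect ranking). The pair-comparison rule below is the textbook definition, not code from the study:

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-index for right-censored time-to-event data: the fraction
    of comparable patient pairs the model orders correctly.

    times: observed time to event or censoring
    events: 1 if the event (e.g. continence recovery) was observed, else 0
    risk_scores: model output, higher = event expected sooner
    """
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is comparable when i's event occurred before j's time.
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5      # ties count as half-correct
    return concordant / comparable
```

A C-index of 0.6, as reported for DeepSurv here, therefore means the model correctly ordered about 60% of comparable patient pairs.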
Affiliation(s)
- Andrew J. Hung, Jian Chen, Saum Ghodoussipour, Paul J. Oh, Jessica Nguyen, Inderbir S. Gill: Center for Robotic Simulation & Education, USC Institute of Urology, Keck School of Medicine, University of Southern California, Los Angeles, United States
- Zequn Liu: School of Electronics Engineering and Computer Science, Peking University, Beijing, China
- Sanjay Purushotham: Department of Information Systems, University of Maryland, Baltimore, United States
- Yan Liu: Computer Science Department, Viterbi School of Engineering, University of Southern California, Los Angeles, United States
9. Chen J, Chu T, Ghodoussipour S, Bowman S, Patel H, King K, Hung AJ. Effect of surgeon experience and bony pelvic dimensions on surgical performance and patient outcomes in robot-assisted radical prostatectomy. BJU Int 2019; 124:828-835. PMID: 31265207. DOI: 10.1111/bju.14857.
Abstract
OBJECTIVES To evaluate the effects of surgeon experience, body habitus, and bony pelvic dimensions on surgeon performance and patient outcomes after robot-assisted radical prostatectomy (RARP). PATIENTS, SUBJECTS AND METHODS The pelvic dimensions of 78 RARP patients were measured on preoperative magnetic resonance imaging and computed tomography by three radiologists. Surgeon automated performance metrics (APMs; instrument motion tracking and system events data, i.e. camera movement, third-arm swap, energy use) were obtained by a systems data recorder (Intuitive Surgical, Sunnyvale, CA, USA) during RARP. Two analyses were performed: Analysis 1 examined the effects of patient characteristics, pelvic dimensions and prior surgeon RARP caseload on APMs using linear regression; Analysis 2 analysed the effects of patient body habitus, bony pelvic measurements, and surgeon experience on short- and long-term outcomes by multivariable regression. RESULTS Analysis 1 showed that while surgeon experience affected the greatest number of APMs (P < 0.044), the patient's body mass index, bony pelvic dimensions, and prostate size also affected APMs during each surgical step (P < 0.043, P < 0.046 and P < 0.034, respectively). Analysis 2 showed that RARP duration was significantly affected by pelvic depth (β = 13.7, P = 0.039) and prostate volume (β = 0.5, P = 0.024). A wider and shallower pelvis was less likely to result in a positive margin (odds ratio 0.25, 95% confidence interval [CI] 0.09-0.72). On multivariate analysis, urinary continence recovery was associated with the surgeon's prior RARP experience (hazard ratio [HR] 2.38, 95% CI 1.18-4.81; P = 0.015), but not with pelvic dimensions (HR 1.44, 95% CI 0.95-2.17). CONCLUSION A limited surgical workspace, due to a narrower and deeper pelvis, does affect surgeon performance and patient outcomes, most notably in longer surgery time and an increased positive margin rate.
Affiliation(s)
- Jian Chen, Tiffany Chu, Saum Ghodoussipour, Andrew J Hung: Center for Robotic Simulation and Education, University of Southern California (USC) Institute of Urology, Keck School of Medicine, USC, Los Angeles, CA, USA
- Sean Bowman, Heetabh Patel, Kevin King: Department of Radiology, Keck School of Medicine, USC, Los Angeles, CA, USA
10. Performance Assessment. In: Comprehensive Healthcare Simulation: Surgery and Surgical Subspecialties. 2019. DOI: 10.1007/978-3-319-98276-2_9.
11.
12. Sparse Hidden Markov Models for Surgical Gesture Classification and Skill Evaluation. In: Information Processing in Computer-Assisted Interventions. 2012. DOI: 10.1007/978-3-642-30618-1_17.
13. Schreuder HWR, Wolswijk R, Zweemer RP, Schijven MP, Verheijen RHM. Training and learning robotic surgery, time for a more structured approach: a systematic review. BJOG 2011; 119:137-49. PMID: 21981104. DOI: 10.1111/j.1471-0528.2011.03139.x.
Abstract
BACKGROUND Robotic assisted laparoscopic surgery is growing rapidly and there is an increasing need for a structured approach to train future robotic surgeons. OBJECTIVES To review the literature on training and learning strategies for robotic assisted laparoscopic surgery. SEARCH STRATEGY A systematic search of MEDLINE, EMBASE, the Cochrane Library and the Journal of Robotic Surgery was performed. SELECTION CRITERIA We included articles concerning training, learning, education and teaching of robotic assisted laparoscopic surgery in any specialism. DATA COLLECTION AND ANALYSIS Two authors independently selected articles to be included. We categorised the included articles into: training modalities, learning curve, training future surgeons, curriculum design and implementation. MAIN RESULTS We included 114 full-text articles. Training modalities such as didactic training, skills training (dry lab, virtual reality, animal or cadaver models), case observation, bedside assisting, proctoring and the mentoring console can be used for training in robotic assisted laparoscopic surgery. Several training programmes in general, and specific programmes designed for residents, fellows and surgeons, are described in the literature. We provide guidelines for the development of a structured training programme. AUTHORS' CONCLUSIONS Robotic surgical training consists of system training and procedural training. System training should be formally organised and should be competence based, instead of time based. Virtual reality training will play an important role in the near future. Procedural training should be organised in a stepwise approach with objective assessment of each step. This review aims to facilitate and improve the implementation of structured robotic surgical training programmes.
Affiliation(s)
- H W R Schreuder: Division of Women and Baby, Department of Gynaecological Oncology, University Medical Centre Utrecht, The Netherlands
14. Reiley CE, Lin HC, Yuh DD, Hager GD. Review of methods for objective surgical skill evaluation. Surg Endosc 2011; 25:356-66. PMID: 20607563. DOI: 10.1007/s00464-010-1190-z.
Abstract
BACKGROUND Rising health and financial costs associated with iatrogenic errors have drawn increasing attention to the dexterity of surgeons. With the advent of new technologies, such as robotic surgical systems and medical simulators, researchers now have the tools to analyze surgical motion with the goal of differentiating the level of technical skill in surgeons. METHODS The literature for this review was obtained from a Google Scholar and PubMed search of the key words "objective surgical skill evaluation." Only studies that included motion analysis were used. RESULTS In this paper, we provide a clinical motivation for the importance of surgical skill evaluation. We review the current methods of tracking surgical motion and the available data-collection systems. We also survey current methods of surgical skill evaluation and show that most approaches fall into one of three categories: (1) structured human grading; (2) descriptive statistics; or (3) statistical language models of surgical motion. We discuss the need for an encompassing approach that models human skill through statistical models to allow for objective skill evaluation.
Affiliation(s)
- Carol E Reiley: Laboratory of Computational Sensing and Robotics, Department of Computer Science, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA