1. Alruwaili FH, Clancy MP, Saeedi-Hosseiny MS, Logar JA, Papachristou C, Haydel C, Parvizi J, Iordachita II, Abedin-Nasab MH. Design and Experimental Evaluation of a Leader-Follower Robot-Assisted System for Femur Fracture Surgery. Int J Control Autom Syst 2024; 22:2833-2846. PMID: 39886261; PMCID: PMC11781588; DOI: 10.1007/s12555-024-0019-9.
Abstract
In the face of challenges encountered during femur fracture surgery, such as the high rates of malalignment and X-ray exposure to operating personnel, robot-assisted surgery has emerged as an alternative to conventional state-of-the-art surgical methods. This paper introduces the development of a leader-follower robot-assisted system for femur fracture surgery, called Robossis. Robossis comprises a 7-DOF haptic controller and a 6-DOF surgical robot. A control architecture is developed to address the kinematic mismatch and motion transfer between the haptic controller and the Robossis surgical robot: a motion control pipeline handles the motion transfer and is evaluated through experimental testing. The analysis shows that the Robossis surgical robot can adhere to the desired trajectory from the haptic controller with an average translational error of 0.32 mm and a rotational error of 0.07°. Additionally, a haptic rendering pipeline is developed to resolve the kinematic mismatch by constraining the haptic controller's (user's hand) movement within the permissible joint limits of the Robossis surgical robot. Lastly, in a cadaveric lab test, the Robossis system was tested during a mock femur fracture surgery. The results show that the Robossis system can provide an intuitive solution for surgeons performing femur fracture surgery.
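
A minimal sketch of the leader-follower idea described above (not the Robossis implementation; the scaling factor, joint limits, and stiffness are assumed):

```python
import numpy as np

# Leader-follower motion-transfer sketch: the follower tracks scaled leader
# increments, and a virtual-wall force pushes the leader handle back when a
# follower joint approaches its limit (all values below are illustrative).

SCALE = 0.5                                            # leader-to-follower scaling (assumed)
Q_MIN = np.radians([-90, -60, -60, -120, -90, -90])    # hypothetical joint limits
Q_MAX = -Q_MIN
K_WALL = 200.0                                         # wall stiffness, N*m/rad (assumed)

def follower_target(x_follower, x_leader, x_leader_prev):
    """Incremental (clutched) position mapping from leader to follower."""
    return x_follower + SCALE * (x_leader - x_leader_prev)

def joint_limit_force(q, jacobian):
    """Haptic force constraining the user inside the follower's joint limits."""
    margin = np.radians(5.0)                  # start pushing 5 deg before a limit
    tau = np.zeros_like(q)
    tau += K_WALL * np.maximum(Q_MIN + margin - q, 0.0)    # push up from lower limit
    tau -= K_WALL * np.maximum(q - (Q_MAX - margin), 0.0)  # push down from upper limit
    # Map the joint-space penalty to a wrench felt at the leader handle (f = J^-T tau).
    return np.linalg.pinv(jacobian.T) @ tau
```
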
Affiliation(s)
- Fayez H. Alruwaili
- Biomedical Engineering Department, Rowan University, Glassboro, NJ 08028, USA
- Michael P. Clancy
- Biomedical Engineering Department, Rowan University, Glassboro, NJ 08028, USA
- Jacob A. Logar
- Biomedical Engineering Department, Rowan University, Glassboro, NJ 08028, USA
- Javad Parvizi
- Rothman Orthopedic Institute, Thomas Jefferson University Hospital, Philadelphia, Pennsylvania, USA
- Iulian I. Iordachita
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA

2. Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation. Appl Sci (Basel) 2022. DOI: 10.3390/app12126071.
Abstract
This work proposes a new interface for the teleoperation of mobile robots based on virtual reality that allows natural and intuitive interaction and cooperation between the human and the robot, which is useful for many situations such as inspection tasks, the mapping of complex environments, etc. Contrary to previous works, the proposed interface does not seek realism in the virtual environment but provides the minimum necessary elements for the user to carry out the teleoperation task naturally and intuitively. The teleoperation is carried out in such a way that the human user and the mobile robot cooperate synergistically to accomplish the task: the user guides the robot through the environment in order to benefit from the intelligence and adaptability of the human, whereas the robot automatically avoids collisions with objects in the environment in order to benefit from its fast response. The latter is carried out using the well-known potential field-based navigation method. The efficacy of the proposed method is demonstrated through experimentation with the Turtlebot3 Burger mobile robot in both simulation and real-world scenarios. In addition, usability and presence questionnaires were conducted with users of different ages and backgrounds to demonstrate the benefits of the proposed approach. In particular, the results of these questionnaires show that the proposed virtual reality-based interface is intuitive, ergonomic, and easy to use.
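
A minimal sketch of the described human-robot blending, assuming a planar robot pose and a list of obstacle positions; the gains and influence distance are illustrative, with the speed cap taken from the Turtlebot3 Burger spec:

```python
import numpy as np

# Potential-field blending: the user's velocity command attracts the robot,
# nearby obstacles repel it, and the sum is saturated to the platform's limit.

K_REP, D0 = 0.8, 0.6   # repulsion gain and influence distance (assumed)

def repulsive_velocity(robot_xy, obstacles_xy):
    v = np.zeros(2)
    for obs in obstacles_xy:
        diff = robot_xy - obs
        d = np.linalg.norm(diff)
        if 1e-6 < d < D0:
            # Classic Khatib-style repulsion, growing as the robot closes in.
            v += K_REP * (1.0 / d - 1.0 / D0) / d**2 * (diff / d)
    return v

def blended_command(v_user, robot_xy, obstacles_xy, v_max=0.22):
    v = v_user + repulsive_velocity(robot_xy, obstacles_xy)
    n = np.linalg.norm(v)
    return v if n <= v_max else v * (v_max / n)   # Turtlebot3 Burger speed cap
```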

3. Augmented Reality-Based Interface for Bimanual Robot Teleoperation. Appl Sci (Basel) 2022. DOI: 10.3390/app12094379.
Abstract
Teleoperation of bimanual robots is being used to carry out complex tasks such as surgeries in medicine. Despite technological advances, current interfaces are not natural to users, who spend long periods learning how to use them. To mitigate this issue, this work proposes a novel augmented reality-based interface for teleoperating bimanual robots. The proposed interface is more natural to the user and shortens the learning process. A full description of the proposed interface is given in the paper, and its effectiveness is shown experimentally using two industrial robot manipulators. Moreover, the drawbacks and limitations of the classic joystick-based teleoperation interface are analyzed in order to highlight the benefits of the proposed augmented reality-based approach.

4. So JH, Sobucki S, Szewczyk J, Marturi N, Tamadazte B. Shared Control Schemes for Middle Ear Surgery. Front Robot AI 2022; 9:824716. PMID: 35391943; PMCID: PMC8980232; DOI: 10.3389/frobt.2022.824716.
Abstract
This paper deals with the control of a redundant cobot arm to accomplish peg-in-hole insertion tasks in the context of middle ear surgery. It mainly focuses on the development of two shared control laws that combine local measurements provided by position or force sensors with globally observed visual information. We first investigate two classical and well-established control modes, i.e., a position-based end-frame teleoperation controller and a comanipulation controller. Based on these two control architectures, we then propose a combination of visual feedback and position/force-based inputs in the same control scheme. In contrast to conventional control designs where all degrees of freedom (DoF) are equally controlled, the proposed shared controllers allow teleoperation of the linear/translational DoFs while the rotational ones are simultaneously handled by a vision-based controller. Such controllers reduce the task complexity: a complex peg-in-hole task is simplified for the operator to basic translations in space, while tool orientations are automatically controlled. Various experiments, conducted using a 7-DoF robot arm equipped with a force/torque sensor and a camera, validate the proposed controllers in a simulated minimally invasive surgical procedure. The results are discussed in terms of accuracy, ergonomics, and speed.
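
A minimal sketch of the DoF partition described above, assuming a standard image-based visual-servoing (IBVS) law for the rotational DoFs; the gain and variable names are illustrative:

```python
import numpy as np

# DoF sharing: the operator commands translations while a vision-based law
# commands rotations, and the two are composed into one 6D twist.

LAMBDA = 0.5   # visual-servoing gain (assumed)

def shared_twist(v_operator, s, s_star, L_omega):
    """Compose a 6D twist: translation from the human, rotation from vision.

    v_operator : (3,) linear velocity from the teleoperation device
    s, s_star  : current / desired visual features
    L_omega    : (n, 3) rotational part of the interaction matrix
    """
    # Standard IBVS law restricted to the angular DoFs: omega = -lambda * L^+ e
    omega = -LAMBDA * np.linalg.pinv(L_omega) @ (s - s_star)
    return np.concatenate([v_operator, omega])
```
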
Affiliation(s)
- Jae-Hun So
- CNRS UMR 7222, INSERM U1150, ISIR, F-75005, Sorbonne Université, Paris, France
- Stéphane Sobucki
- CNRS UMR 7222, INSERM U1150, ISIR, F-75005, Sorbonne Université, Paris, France
- Jérôme Szewczyk
- CNRS UMR 7222, INSERM U1150, ISIR, F-75005, Sorbonne Université, Paris, France
- Naresh Marturi
- Extreme Robotics Lab, University of Birmingham, Birmingham, United Kingdom
- Brahim Tamadazte
- CNRS UMR 7222, INSERM U1150, ISIR, F-75005, Sorbonne Université, Paris, France

5. Selvaggio M, Cacace J, Pacchierotti C, Ruggiero F, Giordano PR. A Shared-Control Teleoperation Architecture for Nonprehensile Object Transportation. IEEE Trans Robot 2022. DOI: 10.1109/tro.2021.3086773.

6. Selvaggio M, Cognetti M, Nikolaidis S, Ivaldi S, Siciliano B. Autonomy in Physical Human-Robot Interaction: A Brief Survey. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3100603.

7. Chang P, Luo R, Dorostian M, Padır T. A Shared Control Method for Collaborative Human-Robot Plug Task. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3098323.

8. Zolotas M, Wonsick M, Long P, Padır T. Motion Polytopes in Virtual Reality for Shared Control in Remote Manipulation Applications. Front Robot AI 2021; 8:730433. PMID: 34568439; PMCID: PMC8458706; DOI: 10.3389/frobt.2021.730433.
Abstract
In remote applications that mandate human supervision, shared control can prove vital by establishing a harmonious balance between the high-level cognition of a user and the low-level autonomy of a robot. In practice, though, achieving this balance is a challenging endeavor that largely depends on whether the operator effectively interprets the underlying shared control. Inspired by recent works on using immersive technologies to expose the internal shared control, we develop a virtual reality system to visually guide human-in-the-loop manipulation. Our implementation of shared control teleoperation employs end effector manipulability polytopes, which are geometrical constructs that embed joint limit and environmental constraints. These constructs capture a holistic view of the constrained manipulator's motion and can thus be visually represented as feedback for users on their operable space of movement. To assess the efficacy of our proposed approach, we consider a teleoperation task where users manipulate a screwdriver attached to a robotic arm's end effector. A pilot study with prospective operators is first conducted to discern which graphical cues and virtual reality setup are most preferable. Feedback from this study informs the final design of our virtual reality system, which is subsequently evaluated in the actual screwdriver teleoperation experiment. Our experimental findings support the utility of using polytopes for shared control teleoperation, but hint at the need for longer-term studies to garner their full benefits as virtual guides.
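
A minimal sketch of how such a polytope can be computed, assuming the standard construction that maps the corners of the joint-velocity box through the Jacobian; the bounds and example posture are illustrative:

```python
import itertools
import numpy as np
from scipy.spatial import ConvexHull

# Velocity polytope: push every corner of the joint-velocity box through the
# Jacobian and take the convex hull of the resulting end-effector velocities.

def velocity_polytope(J_lin, qd_min, qd_max):
    """Vertices of the reachable end-effector linear-velocity set.

    J_lin          : (3, n) linear-velocity Jacobian
    qd_min, qd_max : (n,) joint velocity bounds (these can be tightened near
                     joint limits to embed position constraints, as above)
    """
    corners = np.array(list(itertools.product(*zip(qd_min, qd_max))))  # 2^n corners
    points = corners @ J_lin.T               # map each corner to task space
    hull = ConvexHull(points)
    return points[hull.vertices]             # polytope vertices, e.g. for VR rendering

# Example with a random 6-joint arm posture:
J = np.random.default_rng(0).normal(size=(3, 6))
verts = velocity_polytope(J, -np.ones(6), np.ones(6))
```
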
Affiliation(s)
- Mark Zolotas
- Northeastern University, Boston, MA, United States
- Philip Long
- Northeastern University, Boston, MA, United States
- Irish Manufacturing Research, Dublin, Ireland
- Taşkın Padır
- Northeastern University, Boston, MA, United States

9. Aggravi M, Estima DAL, Krupa A, Misra S, Pacchierotti C. Haptic Teleoperation of Flexible Needles Combining 3D Ultrasound Guidance and Needle Tip Force Feedback. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3068635.

10. Amanhoud W, Hernandez Sanchez J, Bouri M, Billard A. Contact-initiated shared control strategies for four-arm supernumerary manipulation with foot interfaces. Int J Rob Res 2021. DOI: 10.1177/02783649211017642.
Abstract
In industrial or surgical settings, many tasks require at least two people to be accomplished successfully. Robotic assistance could enable a single person to perform such tasks alone, with the help of robots, through direct, shared, or autonomous control. We are interested in four-arm manipulation scenarios, where both feet are used to control two robotic arms via bi-pedal haptic interfaces. The robotic arms complement the tasks of the biological arms, for instance, supporting and moving an object while working on it with both hands. To reduce fatigue and cognitive workload, and to ease the execution of the foot manipulation, we propose two types of assistance that can be enabled upon contact with the object (i.e., based on the interaction forces): autonomous contact-force generation and auto-coordination of the robotic arms. The latter relates to controlling both arms with a single foot once the object is grasped. We designed four (shared) control strategies derived from the combinations (absence/presence) of both assistance modalities and compared them through a user study (with 12 participants) on a four-arm manipulation task. The results show that force assistance improves human-robot fluency in the four-arm task, as well as perceived ease of use and usefulness; it also reduces fatigue. Finally, delegating the grasping force to the robotic arms when controlling both with a single foot proves crucial, making the dual-assistance approach the preferred and most successful of the proposed control strategies.
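
A minimal sketch of the two assistance modalities, with thresholds, gains, and function names assumed for illustration:

```python
import numpy as np

# Contact-initiated assistance: once measured interaction forces say the object
# is grasped, the controller adds an autonomous squeeze force and lets a single
# foot command both arms (all values below are illustrative).

F_CONTACT = 4.0    # N, contact detection threshold (assumed)
F_GRASP = 10.0     # N, autonomous grasping force per arm (assumed)

def grasp_assistance(f_meas_left, f_meas_right, grasp_axis):
    """Autonomous contact-force generation, enabled on bilateral contact."""
    in_contact = min(np.linalg.norm(f_meas_left),
                     np.linalg.norm(f_meas_right)) > F_CONTACT
    if not in_contact:
        return None
    # Equal and opposite squeeze forces delegated to the robot arms.
    return +F_GRASP * grasp_axis, -F_GRASP * grasp_axis

def auto_coordinated_targets(x_left, x_right, v_foot, dt):
    """Single-foot control after grasp: both arms follow one foot's velocity,
    keeping the relative pose of the grasped object fixed."""
    return x_left + v_foot * dt, x_right + v_foot * dt
```
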
Affiliation(s)
- Walid Amanhoud
- Learning Algorithms and Systems Laboratory (LASA), Swiss Federal School of Technology in Lausanne (EPFL), Lausanne, Switzerland
- Jacob Hernandez Sanchez
- Learning Algorithms and Systems Laboratory (LASA), Swiss Federal School of Technology in Lausanne (EPFL), Lausanne, Switzerland
- Biorobotics Laboratory (BIOROB), Swiss Federal School of Technology in Lausanne (EPFL), Lausanne, Switzerland
- Mohamed Bouri
- Biorobotics Laboratory (BIOROB), Swiss Federal School of Technology in Lausanne (EPFL), Lausanne, Switzerland
- Translational Neural Engineering Laboratory (TNE), Swiss Federal Institute of Technology (EPFL), Geneva, Switzerland
- Aude Billard
- Learning Algorithms and Systems Laboratory (LASA), Swiss Federal School of Technology in Lausanne (EPFL), Lausanne, Switzerland

11. Kam M, Saeidi H, Hsieh MH, Kang JU, Krieger A. A Confidence-Based Supervised-Autonomous Control Strategy for Robotic Vaginal Cuff Closure. IEEE Int Conf Robot Autom (ICRA) 2021. PMID: 34840856; PMCID: PMC8612028; DOI: 10.1109/icra48506.2021.9561685.
Abstract
Autonomous robotic suturing has the potential to improve surgical outcomes by offering greater accuracy, repeatability, and consistency than manual operation. However, full autonomy in complex surgical environments is not practical, and human supervision is required to guarantee safety. In this paper, we develop a confidence-based supervised autonomous suturing method in which the Smart Tissue Autonomous Robot (STAR) and the surgeon perform suturing tasks collaboratively with the highest possible degree of autonomy. Via the proposed method, STAR performs autonomous suturing when highly confident and otherwise asks the operator for possible assistance with suture positioning adjustments. We evaluate the accuracy of the proposed control method via robotic suturing tests on synthetic vaginal cuff tissues and compare the results to vaginal cuff closures performed by an experienced surgeon. Our test results indicate that, using the proposed confidence-based method, STAR can predict the success of purely autonomous suture placement with an accuracy of 94.74%. Moreover, with an additional 25% human intervention, STAR achieves a 98.1% suture placement accuracy, compared to an 85.4% accuracy for completely autonomous robotic suturing. Finally, our experiments indicate that STAR using the proposed method achieves 1.6 times better consistency in suture spacing and 1.8 times better consistency in suture bite sizes than the manual results.
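
A minimal sketch of the confidence-gated control flow described above; the threshold and the model/robot/operator interfaces are hypothetical stand-ins, not the STAR software:

```python
# Confidence-gated autonomy: execute a suture placement autonomously only when
# the predicted success confidence is high, otherwise request an operator
# adjustment first. All names below are illustrative.

CONF_THRESHOLD = 0.9   # assumed gate; the paper reports 94.74% prediction accuracy

def place_suture(plan, model, robot, operator):
    confidence = model.predict_success(plan)   # hypothetical learned predictor
    if confidence >= CONF_THRESHOLD:
        robot.execute(plan)                    # fully autonomous placement
    else:
        adjusted = operator.adjust(plan)       # human repositions the target
        robot.execute(adjusted)
```
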
Affiliation(s)
- Michael Kam
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Hamed Saeidi
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Michael H. Hsieh
- Department of Urology, Children's National Hospital, 111 Michigan Ave. N.W., Washington, DC 20010, USA
- J. U. Kang
- Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Axel Krieger
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA

12.

13. Rakita D, Mutlu B, Gleicher M, Hiatt LM. Shared control-based bimanual robot manipulation. Sci Robot 2019; 4(30):eaaw0955. PMID: 33137728; DOI: 10.1126/scirobotics.aaw0955.
Abstract
Human-centered environments provide affordances for and require the use of two-handed, or bimanual, manipulations. Robots designed to function in, and physically interact with, these environments have not been able to meet these requirements because standard bimanual control approaches have not accommodated the diverse, dynamic, and intricate coordination between two arms needed to complete bimanual tasks. In this work, we enabled robots to more effectively perform bimanual tasks by introducing a bimanual shared-control method. The control method moves the robot's arms to mimic the operator's arm movements but provides on-the-fly assistance to help the user complete tasks more easily. Our method used a bimanual action vocabulary, constructed by analyzing how people perform two-hand manipulations, as the core abstraction level for reasoning about how to assist in bimanual shared autonomy. The method inferred which individual action from the bimanual action vocabulary was occurring using a sequence-to-sequence recurrent neural network architecture and turned on a corresponding assistance mode, i.e., signals introduced into the shared-control loop designed to make a particular bimanual action easier or more efficient to perform. We demonstrate the effectiveness of our method through two user studies showing that novice users could control a robot to complete a range of complex manipulation tasks more successfully with our method than with alternative approaches. We discuss the implications of our findings for real-world robot control scenarios.
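
A minimal sketch of the action-inference stage, assuming a GRU classifier as a simplified stand-in for the paper's sequence-to-sequence architecture; the feature dimension and action vocabulary below are illustrative:

```python
import torch
import torch.nn as nn

# A recurrent network reads a window of two-hand motion features and outputs a
# distribution over bimanual actions; the inferred action then selects the
# assistance mode injected into the shared-control loop.

ACTIONS = ["fixed-offset", "one-hand-leading", "self-handover",
           "independent", "symmetric"]          # illustrative vocabulary

class BimanualActionNet(nn.Module):
    def __init__(self, feat_dim=14, hidden=64, n_actions=len(ACTIONS)):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_actions)

    def forward(self, x):                # x: (batch, time, feat_dim)
        _, h = self.gru(x)               # h: (1, batch, hidden)
        return self.head(h[-1])          # logits over the action vocabulary

net = BimanualActionNet()
window = torch.randn(1, 50, 14)          # 50 frames of both-wrist pose features
mode = ACTIONS[net(window).argmax(dim=-1).item()]
```
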
Affiliation(s)
- Daniel Rakita
- Department of Computer Sciences, University of Wisconsin-Madison, Madison, WI, USA
- Bilge Mutlu
- Department of Computer Sciences, University of Wisconsin-Madison, Madison, WI, USA
- Michael Gleicher
- Department of Computer Sciences, University of Wisconsin-Madison, Madison, WI, USA

14. Girbes-Juan V, Schettino V, Demiris Y, Tornero J. Haptic and Visual Feedback Assistance for Dual-Arm Robot Teleoperation in Surface Conditioning Tasks. IEEE Trans Haptics 2021; 14:44-56. PMID: 32746376; DOI: 10.1109/toh.2020.3004388.
Abstract
Contact-driven tasks, such as surface conditioning operations (wiping, polishing, sanding, etc.), are difficult to program in advance to be performed autonomously by a robotic system, especially when the objects involved are moving. In many applications, human-robot physical interaction can be used for teaching, especially in learning-from-demonstration frameworks, but this solution is not always available. Robot teleoperation is very useful when user and robot cannot share the same workspace due to hazardous environments, inaccessible locations, or ergonomic issues. This article introduces a novel dual-arm teleoperation architecture with haptic and visual feedback to enhance the operator's immersion in surface treatment tasks. Two task-based assistance systems are also proposed to control each robotic manipulator individually. To validate the remote assisted control, usability tests were carried out using Baxter, a dual-arm collaborative robot. After analysing several benchmark metrics, the results show that the proposed assistance method helps to reduce the task duration and improves the overall performance of the teleoperation.
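
A minimal sketch of one plausible task-based assistance for surface conditioning, assuming an admittance-style regulation of the contact normal force; the paper does not specify this exact scheme, and all gains are assumed:

```python
import numpy as np

# Regulate the contact force along the surface normal while the operator moves
# tangentially, and reflect the force error to the haptic device.

F_DES = 15.0           # N, desired polishing force (assumed)
KP, KI = 0.002, 0.001  # admittance gains, (m/s)/N (assumed)

class NormalForceAssist:
    def __init__(self):
        self.err_int = 0.0

    def step(self, v_operator, f_meas, normal, dt):
        err = F_DES - f_meas @ normal
        self.err_int += err * dt
        v_normal = (KP * err + KI * self.err_int) * normal    # keep contact
        v_tangent = v_operator - (v_operator @ normal) * normal
        haptic_force = -0.5 * err * normal    # let the user feel the force error
        return v_tangent + v_normal, haptic_force
```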

15. Singh J, Srinivasan AR, Neumann G, Kucukyilmaz A. Haptic-Guided Teleoperation of a 7-DoF Collaborative Robot Arm With an Identical Twin Master. IEEE Trans Haptics 2020; 13:246-252. PMID: 32012028; DOI: 10.1109/toh.2020.2971485.
Abstract
In this article, we describe two techniques to enable haptic-guided teleoperation using 7-DoF cobot arms as master and slave devices. A shortcoming of using cobots as master-slave systems is the lack of force feedback at the master side. However, recent developments in cobot technologies have brought affordable, flexible, and safe torque-controlled robot arms, which can be programmed to generate force feedback and thus mimic the operation of a haptic device. In this article, we use two Franka Emika Panda robot arms as a twin master-slave system to enable haptic-guided teleoperation. We propose a two-layer mechanism to implement force feedback due to 1) object interactions in the slave workspace and 2) virtual forces, e.g., forces that repel the tool from static obstacles in the remote environment or provide task-related guidance. We present two different approaches to force rendering and conduct an experimental study to evaluate their performance and usability in comparison to teleoperation without haptic guidance. Our results indicate that the proposed joint torque coupling method for rendering task forces reduces energy requirements during haptic-guided telemanipulation, providing realistic force feedback by accurately matching the slave torque readings at the master side.
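
A minimal sketch of the twin master-slave torque coupling described above; gains and signatures are assumed, and this is not the authors' Franka implementation:

```python
import numpy as np

# Master-side feedback torque: reflect the slave's external torques (layer 1),
# add virtual guidance/repulsion forces through the master Jacobian (layer 2),
# and keep the identical twins synchronized with a joint-space spring-damper.

K_COUPLE = 0.8    # scaling of reflected slave torques (assumed)

def master_feedback_torque(tau_slave_ext, J_master, f_virtual,
                           q_master, q_slave, qd_master,
                           k_sync=20.0, d_sync=2.0):
    tau_sync = k_sync * (q_slave - q_master) - d_sync * qd_master
    return K_COUPLE * tau_slave_ext + J_master.T @ f_virtual + tau_sync
```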

16. Rahal R, Matarese G, Gabiccini M, Artoni A, Prattichizzo D, Giordano PR, Pacchierotti C. Caring About the Human Operator: Haptic Shared Control for Enhanced User Comfort in Robotic Telemanipulation. IEEE Trans Haptics 2020; 13:197-203. PMID: 31995500; DOI: 10.1109/toh.2020.2969662.
Abstract
Haptic shared control enables a human operator and an autonomous controller to share the control of a robotic system using haptic active constraints. It has been used in robotic teleoperation for different purposes, such as navigating along paths that minimize the torques requested of the manipulator or avoiding potentially dangerous areas of the workspace. However, few works have focused on using these ideas to account for the user's comfort. In this article, we present a haptic-enabled shared control approach aimed at minimizing the user's workload during a teleoperated manipulation task. Using an inverse kinematic model of the human arm and the rapid upper limb assessment (RULA) metric, the proposed approach estimates the user's current comfort online. From this measure and a priori knowledge of the task, we then generate dynamic active constraints guiding users toward successful completion of the task, along directions that improve their posture and increase their comfort. Studies with human subjects show the effectiveness of the proposed approach, yielding a 30% reduction in perceived workload with respect to standard guided human-in-the-loop teleoperation.
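
A minimal sketch of comfort-aware guidance, using a smooth quadratic surrogate in place of the discrete RULA tables; the neutral posture, weights, and gain are assumed:

```python
import numpy as np

# Score the estimated arm posture with an ergonomic cost and apply a small
# force nudging the operator's hand toward lower-cost configurations.

Q_NEUTRAL = np.radians([10, 15, 0, 80, 0, 0, 0])  # comfortable arm pose (assumed)
W = np.ones(7)                                     # per-joint discomfort weights

def comfort_cost(q_arm):
    """Quadratic surrogate for RULA: zero at the neutral posture."""
    e = q_arm - Q_NEUTRAL
    return float(e @ (W * e))

def guidance_force(q_arm, J_arm, k_guide=5.0):
    """Descend the comfort cost in joint space and map the descent direction
    to a Cartesian force at the operator's hand via the arm Jacobian."""
    grad = 2.0 * W * (q_arm - Q_NEUTRAL)   # d(cost)/dq
    return -k_guide * J_arm @ grad
```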